How can middle leaders use CEM assessment data effectively?

By Matt McGinlay

At Evidence Based Education, we’ve trained thousands of teachers, across the UK and internationally, in how to use CEM assessment data effectively in school. Many of the schools we work with have found that different stakeholders use CEM data in slightly different ways: this could range from a classroom teacher using baseline data to inform their teaching, to senior leaders using value-added data to monitor progress. Recently, we spoke to Andrew Kyle, Head of Humanities at Chetham’s School of Music in Manchester, about his experience of using CEM assessment data and how, in his role as a middle leader, he facilitated its use with colleagues. His thoughts were so insightful that we’d like to share some of them here.

A fundamental prerequisite for using the data effectively is ensuring a basic level of trust in it. It takes time for staff to appreciate that the data can have a great deal of formative value. Typically, we find that when teachers are first introduced to CEM assessment data, they tend to focus on outcomes such as predicted grades. This can lead to a lack of trust in the data, as students often do better or worse than the predictions suggest.

Ensuring that teachers understand the value of the data, as well as its limitations, is important for increasing trust among staff. For example, teachers can learn more about students’ likely outcomes by looking at the trends shown in chances graphs rather than at a single predicted grade. Emphasising that CEM assessment data is a supplement to teachers’ professional judgement, not a replacement for it, is an important first step in ensuring the data are used sensibly and appropriately.
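To illustrate why a chances graph is more informative than a point estimate, here is a minimal sketch using invented grade probabilities (not real CEM feedback) for a single hypothetical student. It contrasts the single "predicted grade" with the spread of chances behind it.

```python
# Illustrative only: the grade probabilities below are invented, not real CEM feedback.
chances = {
    "A*": 0.05,
    "A":  0.20,
    "B":  0.40,
    "C":  0.25,
    "D":  0.10,
}

# The single "predicted grade" is typically just the most likely grade...
predicted_grade = max(chances, key=chances.get)

# ...but the full distribution tells a much richer story.
prob_b_or_better = sum(p for grade, p in chances.items() if grade in ("A*", "A", "B"))

print(f"Single predicted grade: {predicted_grade}")                    # B
print(f"Chance of that exact grade: {chances[predicted_grade]:.0%}")   # 40%
print(f"Chance of grade B or better: {prob_b_or_better:.0%}")          # 65%
# A 35% chance of a C or below is invisible if staff only ever see "predicted: B".
```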

At EBE, we created our ongoing support packages (more information can be found here) with this in mind. Persistence and patience are often required: a single INSET session is rarely enough to instil faith in the data, but revisiting the themes over time can help.

CEM assessment data can also be useful for middle leaders in strategic planning. Looking at a new cohort’s intake profile helps heads of department to identify areas that need focus, for example the vocabulary of EAL students or high-ability students who can be stretched, and, subsequently, where resources can best be deployed. Intake profiles of new cohorts can vary considerably from year to year, and having this data to hand early on gives teachers an insight into their new students.

Middle leaders also often play a crucial role in using value-added feedback formatively. These data are not only useful for looking at pupil progress retrospectively, but can also be used for strategic planning. An example Andrew highlighted was a head of Science observing that the value-added results for A-level physicists were polarised: students who were not studying A-level maths had lower value-added outcomes than their peers taking maths alongside physics. An obvious connection perhaps, but he then successfully used this evidence to argue for the reinstatement of a one-hour “Maths for A-level Physics” class.
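The sketch below shows the kind of group comparison that might sit behind such an argument. The value-added scores and the grouping are entirely invented for illustration (positive means a student exceeded their expected grade, negative means they fell short); real CEM feedback would come from the reports themselves rather than hand-typed numbers.

```python
# Illustrative only: invented value-added scores for A-level physics students.
# Positive = exceeded the expected grade, negative = fell short of it.
value_added = [
    # (takes A-level maths?, value-added in grades)
    (True,  +0.6), (True,  +0.3), (True,  +0.5), (True,  0.0), (True,  +0.4),
    (False, -0.7), (False, -0.4), (False, +0.1), (False, -0.6), (False, -0.3),
]

def mean(xs):
    return sum(xs) / len(xs)

with_maths    = [va for takes_maths, va in value_added if takes_maths]
without_maths = [va for takes_maths, va in value_added if not takes_maths]

print(f"Mean value-added with A-level maths:    {mean(with_maths):+.2f}")
print(f"Mean value-added without A-level maths: {mean(without_maths):+.2f}")
# A persistent gap like this is the sort of evidence that can support the case
# for targeted provision such as a "Maths for A-level Physics" class.
```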

When used appropriately, value-added feedback can also provide a powerful perspective on student achievement which might not be immediately evident from raw attainment data such as the percentage of A* and A grades or the number of passes. Middle leaders can use these data as a motivational tool for hardworking and dedicated colleagues who can often be deflated by exam results. We’ve heard comments such as ‘I wish there had been more A grades’, or ‘my students didn’t do as well as the students in the other set’. Being able to show colleagues that, actually, most of their students met or exceeded their potential can help to keep them motivated and give them a sense of fulfilment.

We’ve also found that the way CEM assessment data are distributed to staff is critical to their effective use. When downloaded en masse, the sheer quantity of data can cause confusion. Sharing key aspects in a targeted manner helps staff use the data more effectively; for example, Individual Pupil Reports may be particularly helpful for SEN departments, whereas chances graphs may be more appropriate for class teachers.

We have created an extensive library of training videos for the CEM assessments; a table of contents is available here. Not only do we explain how the data are generated and walk through the different feedback for each assessment, we also provide targeted videos for different stakeholders explaining how they may want to use the data. These short videos, typically between 5 and 15 minutes long, are provided as part of our ongoing support packages and are available for staff to view as ‘quick-hit’ refresher training.


Matt McGinlay is EBE’s CEM Training Manager. We’re partnering with Chetham’s to host a training conference on interpretation of CEM assessment data in Manchester on 3rd May; find out more and book tickets here.
