Does anyone know how to improve assessment?


Anyone who works with or for EBE has to accept one thing: we are boundary-dwellers. No, this doesn’t refer to some kind of itinerancy or professional indecision; rather, it describes our residence in the interconnected margins of education research, policy and practice. We are researchers but we don’t just do research; we are qualified teachers but we don’t have school-based jobs; we work with policy-makers but we don’t make policy. In one small organisation, you’ll find the capacity to bring credible guidance, training and support to everyone from newly qualified teachers, middle leaders and senior leaders to directors, chief executives, professors, ministers and parents. Joining these people together in shared conversations around evidence, and how we can use it to improve outcomes for children, is what we do.

Dwelling in these boundaries, we are brokers and mobilisers of both knowledge and people, and nowhere is this more evident than in our Assessment Academy project. Put yourself in our position in March 2016 and ask: “How would I reduce the uncertainty pervading the minds of school leaders and teachers navigating new curricula, new qualifications and shrinking budgets, while at the same time improving outcomes for students?”

This was the question we found ourselves asking, and what came to the fore, through conversations with our Advisory Board member Prof. Rob Coe, was that assessment remained an unclaimed prize of learning, and that increasing teachers’ capacity to assess well really mattered. So we set to work.

Anchored in the best available evidence

Assessment Academy was created with a set of anchor points:

  • The Department for Education’s Standard for teachers’ professional development (2016)
  • Recommendations from The Carter Review (2015)
  • Recommendations from the Commission on Assessment Without Levels (2015)
  • Timperley, Wilson et al. (2007)
  • Higgins, Cordingley et al. (2015)

If you look through the Assessment Academy course (in both its in-person and online forms) you’ll see our operationalised responses to all of these: evidence-based professional development. But it’s more than that; we’ve developed it in collaboration with teachers and senior leaders, taking our ideas to them, listening to their ideas, and developing something we hope helps teachers to assess better and improve outcomes for students. But hope, on its own, is not enough.

Accessible, online training, but not as we know it

While we’re proud of our Assessment Academy developments so far, we don’t know if they have the kind of impact we want. With support from our partner organisation Cambridge Assessment, Assessment Academy Online is being constructed like no other teacher professional development offering: by bringing in designers and behavioural scientists, and building ongoing evaluation into the platform, we will be able to test, learn and adapt the experience of Assessment Leads in-training. Just by taking part in Assessment Academy Online, teachers will be making it better for other teachers.

Sally Brown from the Cambridge Assessment Network said: “As providers of assessment training, Cambridge Assessment Network welcomes the renewed focus on assessment found in the Carter Review, Assessment Without Levels and Stephen Munday’s recent report on initial teacher training. The idea that ‘trainees should be fully conversant with the fundamental principles of assessment and testing’ is at the heart of what we do.”

A wing to heaven?

Assessment is a hot topic in education at the moment, and not least for those teachers and leaders currently coming towards the end of the Assessment Academy Pilot. These 23 volunteers have gone through three out of four challenging days’ training to understand theories of assessment, design robust assessments, and analyse assessment data to ascertain its reliability and to shed light on the validity of the judgements based on it. Why did they sign up?

“With the massive changes to assessment and the introduction of life without levels, I felt that this course would be beneficial” was one Assessment Lead’s response. Another said that they wanted “to improve [their] knowledge of how to design valid assessments in order to ensure [they] assess students accurately”. Most pilot participants say that the course has been useful to them, with many reporting increased confidence to design more accurate and reliable assessments. We’ve also been working with those who have found the course less useful than they had hoped, so that we can improve our work as we go.

A hard road

It’s not been easy for the Assessment Leads, but their commitment, energy and desire to improve their work for the benefit of their students has been humbling. They’ve found time to do the tasks we set, been critical of their own assessment practice, and worked hard to understand technical aspects of data analysis. These are people whose professional lives are plagued by more uncertainty now than ever before, and in a way that nobody would wish on individuals with such important roles to play.

What can they do now that they couldn’t before?

“I have become much better at understanding different factors which impact on assessment results. I have also gained a greater understanding of the different factors which can impact on validity.”

“I think my previous understanding was more superficial than I realised. My depth of understanding has increased hugely.”

We’re not done yet

Assessment Academy is our attempt to give Assessment Leads the tools to help their colleagues gain a more secure understanding of what their students know and can do. We feel proud to be associated with the first ever group of Assessment Leads, but this is only the beginning.

Find out more about Assessment Academy and discover our “What makes great assessment?” ebook here.
