Can we have better descriptions of performance in our examinations?

18/03/2014
Dr Chris Wheadon, Director, No More Marking Ltd.

20… 19… 18… As the British Military Fitness instructor counted down my press-ups from 20 to 1, I wondered why on earth I was doing press-ups in the mud early one Saturday morning. His answer was simple: to get better at doing press-ups. Then he gave me another 20 for being cheeky.

We’ve all had this experience as teachers. ‘Why do we have to do this, sir? Is it in the test?’ If the answer is yes, then you can get them to do it sitting upside down in a swimming pool, if that’s how the test will be run. Because by doing whatever it is, they will get better at it, and will do better in the test. We do press-ups to get better at doing press-ups. We do tests to get better at doing tests.

Some users of tests, however, seem to want to know more than our final score or grade or, in my case, the number of press-ups I can do. What does a maths test mean in terms of how good I am at mathematics? Or at doing the calculations I need to design a bridge that won’t collapse? In my case, what does my ability to do press-ups mean in terms of my ability to haul shopping bags from supermarket to home? Already, at this point in my thinking, I can hear my fitness instructor: ‘If you want to get better at hauling shopping bags, start hauling shopping bags…’

International and national assessments tend to begin with grand statements that set up certain expectations. It seems they all aspire to tell us what students know and can do. PISA, the NAEP, GCSEs: all of them aspire to this. Well, that is great news! As an employer, can you tell me at what PISA score students will be able to write a letter to a customer inviting him to participate in a research study? As I skip through the first of PISA’s latest 555-page reports, I quickly realise I’m in trouble. PISA appears to be able to tell me about mathematics and reading, but not about writing. Not to mention the fact that kids don’t get a score; they get a ‘plausible value’.

Undeterred, I turn to the GCSE. Within five minutes I have pulled up the grade descriptor for AQA’s GCSE English at grade C:

Candidates’ writing shows successful adaptation of form and style to different tasks and for various purposes. They use a range of sentence structures and varied vocabulary to create different effects and engage the reader’s interest. Paragraphing is used effectively to make the sequence of events or development of ideas coherent and clear to the reader. Sentence structures are varied and sometimes bold; punctuation and spelling are accurate.

Wow! And that is just at grade C! Adaptation! Engaging writing! Bold sentence structure! Imagine what you get at grade A! So if I employ a young person with a grade C in English, can I really expect accurate spelling and punctuation? Before you dismiss such descriptions as some kind of dumbing-down conspiracy, ask any English teachers you know. A good teacher will confidently reel off the assessment objectives of the GCSE and A-level syllabuses, and will tell you the relative strengths and weaknesses of a piece of writing in terms of those assessment objectives and the grades you can expect. At some point they will say, ‘the punctuation and spelling are not grade C level…’

So, given the wealth of descriptive data we have around examinations – the criteria, the levels, the grade descriptors, the domains, the constructs – do we really need any more detail? Personally, I would add very little. Firstly, returning to the grade descriptor, I would simply change the first sentence:

Candidates’ writing against the GCSE English task we set them under examination conditions shows successful adaptation of form and style to different tasks and for various purposes.

I would add this qualification because I know that the candidates will have been prepared for the task in hand, will have memorised strategies, mark schemes and model answers, and that there is little we can do to extend how far we can generalise from their answers. Unfortunately, however good we get at tests, what our performance tells us is limited to some extent by the conditions under which those tests are taken.

Secondly, I would make some examination scripts available to everyone after the examinations. I have a feeling that we may disagree about what constitutes bold and engaging writing. Debate about standards is healthy and necessary, and it is made healthier by a good sample of evidence.

Anyway, I must get back to my press-ups. I’ve got a test coming up. Of press-ups. Just don’t ask me to haul any shopping bags, that’s not what I’m working on right now.

Dr Chris Wheadon is Director of No More Marking Ltd. (www.nomoremarking.com)