GCSE Science exams: what do students think about features intended to ensure accessibility?
22 November 2019
In this blog post, we describe a research study conducted in Spring 2019 to explore students’ views on features of exam questions thought to make questions easier to understand.
We would like to thank the teachers and students who took part in the research for their time and enthusiasm.
Why does accessibility in exams matter?
Over several decades, there has been a shift towards making many tests and assessments appropriate for a wider range of students. For example, when GCSEs replaced O levels in the late 1980s, the aim was that they would be appropriate for a wider ability range.
For some time, awarding bodies have been required to consider the needs of different students when designing assessments and to ensure that all students can access the exam questions. This is important, of course, as we need students to be able to understand a task in order for them to have a fair chance to show us their knowledge, understanding and skills in the subject.
OCR’s work on accessibility
As part of continued efforts to ensure that all students can understand the questions they are being asked, OCR has developed a set of accessibility principles for GCSE Gateway Science and Twenty First Century Science exams (see pages 5-7) and applied these starting with the June 2018 exam papers. They have also been applied to the sample assessment materials and practice papers.
The principles draw on a literature review of the effects of visual features on test accessibility and past research on how features of questions affect students’ thinking and answers. With the principles now in use in question papers, OCR asked us to gather views from students.
What research method was used?
We selected eight science questions from the June 2018 exam papers. For each question there was a version with certain accessibility principles applied and a version without.
The different versions of the questions were used to create two versions of a short test. Each test contained some question versions expected to be more accessible and some question versions expected to be less accessible.
To illustrate, the two versions of one of the questions are shown in the image below:
- The version on the left shows the question before accessibility principles were applied
- The version on the right shows the question after accessibility principles were applied.
We visited four schools and asked some Year 11 students to take the test. We interviewed 57 students afterwards, usually in pairs.
The interviews covered whether the students:
- found the questions easy to understand
- felt that one version of a question was easier to understand than the other with regard to certain features.
What did students think?
For most of the accessibility principles explored, students’ views on the ease of understanding questions were in line with expectations about effects on accessibility. For example, students more frequently preferred versions of questions with simpler vocabulary or simpler grammatical structures as easier to understand.
That said, in some cases, there were also fairly high numbers of students who felt that language differences between versions of questions did not affect the ease of understanding. This may suggest that these changes were helpful to some students (perhaps those with slightly weaker language skills), but were less necessary for others.
In these cases, there is still a strong argument for implementing such principles in order to reduce risks that language skills negatively affect performance for some students (where it is not the intention to assess language skills).
Other findings where student views were in line with the accessibility principles included that:
- Bullet points were considered useful when setting out the steps in an experiment
- Putting multiple choice answer options in numerical order was helpful
- Showing units in brackets (e.g. in a table) was easier to understand.
For a small number of accessibility principles, findings were neutral or less conclusive. For example, removing a non-essential visual resource (or part of one) had varying effects on perceived accessibility. Whilst the effects for visuals were mixed, other evidence (Crisp & Sweiry, 2006; Kettler et al., 2012) supports the idea that it is best to avoid visuals that do not provide useful information.
This is probably still a sound principle, in general, but the current findings suggest that decisions around the inclusion of visual resources should be made on a case-by-case basis taking into account the nature of the specific visual and how it might potentially support the interpretation of the question. This is consistent with OCR’s current practice.
In conclusion
Students’ views suggest that most of the accessibility principles that we investigated are appropriate and should continue to be applied to improve students’ understanding of future exam questions. We hope to publish the findings from the research in more detail in a future issue of Research Matters.
And once again a big thank you to all the teachers and students who took part in the study.
Stay connected
If you have any questions, you can submit your comments below, email us at science@ocr.org.uk, sign up to receive email updates, or follow us on Twitter at @OCR_Science.
References
Crisp, V. & Sweiry, E. (2006). Can a picture ruin a thousand words? The effects of visual resources in exam questions. Educational Research, 48(2), 139-154.
Kettler, R.J., Dickenson, T.S., Bennett, H.L., Morgan, G.B., Gilmore, J.A. et al. (2012). Enhancing the accessibility of high school science tests: A multistate experiment. Exceptional Children, 79(1), 91-106.
About the author
Dr Victoria Crisp
is a Senior Research Officer at Cambridge Assessment, having joined in 2000. She has a background in psychology and completed her doctoral research on the judgement processes underpinning the assessment of GCSE coursework. Areas of research have included: issues in question difficulty, question design and question writing; the effects of answer spaces on student responses; the use and purposes of annotations in examination marking; validity and validation; comparability issues and methods; and judgement processes in assessment. Victoria has also been involved in providing training for examiners and assessment professionals on issues in writing exam questions.
Dr Sylwia Macinska
is a Research Officer at Cambridge Assessment. Her background is in the field of cognitive psychology, with a special interest in learning and decision making. Sylwia used to work as a specialist student mentor for students with learning disabilities. Her current research activities focus on exploring best practices in education to inform the development of accessible examinations.