Guidance and support for the use of AI in the Extended Project Qualification
16 February 2024
Lisa Winnington, Subject Support Coordinator
The rapid and ongoing advances in generative artificial intelligence (AI) tools bring both benefits and challenges to education and assessment. In this blog, we highlight the guidance available for managing AI use in the Extended Project Qualification (EPQ) and how to deal with misuse in assessments.
What is AI?
AI tools generate text or images in response to user prompts and questions. Their responses are based on the data sets on which they have been trained. ChatGPT is the best-known example of an AI chatbot, but many other chatbots and tools are available.
The most useful AI tools for the EPQ are those that generate text and images. This includes standalone tools such as Microsoft Copilot and AI that is integrated into software, such as Adobe Firefly.
Appropriate use of AI
Whether the use of AI by students is appropriate for a given task will depend on the nature of the task and the marking criteria it is assessed against.
Any use of AI to help with initial ideas or research must be referenced, and students need to be made aware of this from the outset.
Inappropriate use of AI
As with plagiarism, students can use AI to create work which they then try to pass off as their own. Where a student has used AI to complete all or some of their work, they are not demonstrating their own knowledge, understanding and application of skills, which may prevent them from presenting their own authentic evidence.
Examples of AI misuse include using or modifying AI responses without acknowledgement, disguising the use of AI, or using it for substantial sections of work. You can support your students by teaching them about appropriate use of AI in EPQ, demonstrating how to reference AI correctly where its use is appropriate, and having clear policies for AI use within your department.
If a student has used AI to produce evidence that is directly assessed against the marking criteria, that evidence cannot be awarded marks, and it will also prevent the student from providing their own authentic evidence towards those criteria.
What to do when candidates misuse AI in assessments
Teachers must not accept work which is not the student's own. Ultimately, the Head of Centre is responsible for ensuring that students do not submit inauthentic work.
If you suspect AI misuse and the student has not signed the declaration of authentication, your centre doesn’t need to report the malpractice to OCR. You can resolve the matter prior to the signing of the declarations.
If AI misuse is suspected after formal submission and signing of the authentication sheet, it should be reported using a JCQ M1 form, as outlined in the JCQ AI guidance available in the Malpractice section of the JCQ website. Please email your completed forms to OCR at malpractice@ocr.org.uk.
Further support
Please refer to the JCQ AI use in assessments: Protecting the integrity of assessment document for further information on managing the use of AI within your assessments.
We also have a range of support resources, including recorded webinars, on our AI support page.
Stay connected
Share your thoughts in the comments below. If you have any questions, you can email us at general.qualifications@ocr.org.uk, call us on 01223 553998 or tweet us @ocrexams. You can also sign up to receive subject updates with information about resources and support.
About the author
Lisa is a Subject Support Coordinator and has worked for Cambridge University Press & Assessment in various roles since 2000, most recently in Compliance. Lisa is responsible for a range of subjects including law, citizenship and the EPQ.
In her spare time, you’ll find her studying to be a Naturopathic Nutritionist, running or spending time with the family puppy.