Guidance and support for the use of AI in Cambridge Nationals in Creative iMedia and IT
09 February 2024
Debbie Williams, Computer Science, IT and Creative iMedia Subject Advisor
The rapid and ongoing advances in generative artificial intelligence (AI) tools bring both benefits and challenges to education and assessment. In this blog, we highlight the guidance available for managing AI use in Creative iMedia and IT and how to deal with misuse in assessments.
What is AI?
AI tools generate text or images in response to user prompts and questions. The responses of AI tools are based upon the data sets upon which they have been trained. ChatGPT is the best-known example of an AI chatbot, but many other chatbots and tools are available.
The most useful AI tools for both the Creative iMedia and IT qualifications are those that generate text and images. This includes standalone AI such as Microsoft Copilot and AI that is integrated into software, such as Adobe Firefly.
Appropriate use of AI
Whether the use of AI by students is appropriate for a given task will depend on the marking criteria and nature of the task.
For Creative iMedia and IT, students can generally use AI for any parts of their NEA where no marks are awarded. For example, creating assets, or getting help with a technique that they then need to apply to the scenario independently.
In Creative iMedia R094, students need to source assets and prepare them for use in their digital graphical product. There are no marks awarded for collecting these assets. Students could collect these assets using a search engine. Alternatively, they could use AI to create assets for them.
Students then need to decide how appropriate these images are and how easily they can be repurposed.
However, using AI to edit the images could prevent them from gaining marks for the preparation of assets. It is also worth noting that AI-generated images may be difficult to edit, so students may be better off using images found online with a search engine.
In IT (R060), students create a spreadsheet solution for a client. AI could be used to provide support. For example, they could ask AI “how do I use Excel to count the number of times something occurs in a range of cells?”, or “How do I use the ‘COUNT’ function in Excel?”.
The AI would give guidance on using that specific tool, just as students could get support by looking back at class notes, previous spreadsheet tasks, a textbook or online resources. This is seen as appropriate use of AI because the student still needs to interpret the guidance the AI tool has given and apply it to their own solution.
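As an illustration only (not something students would submit), the counting logic behind the kind of COUNTIF-style formula an AI tool might suggest can be sketched in Python; the grade values below are hypothetical:

```python
# Hypothetical sketch of what Excel's COUNTIF formula does:
# count how many cells in a range match a given value.
def countif(cells, value):
    """Return the number of cells equal to value, like =COUNTIF(range, value)."""
    return sum(1 for cell in cells if cell == value)

# Example: counting "Pass" grades in a hypothetical range of cells.
grades = ["Pass", "Merit", "Pass", "Distinction", "Pass"]
print(countif(grades, "Pass"))  # prints 3
```

In a spreadsheet, the equivalent formula would be something like `=COUNTIF(A1:A5, "Pass")`; the point is that the student must still choose the right function, range and criterion for their own client scenario.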
Inappropriate use of AI
AI can be used by students to create work which they then try to pass off as their own work. This is plagiarism.
Where a student has used AI to complete all or some of their work, they are not demonstrating their own knowledge, understanding and application of skills. This prevents the candidate from presenting their own authentic evidence and is likely to limit the range of marks they can access.
Examples of AI misuse include:
- using or modifying AI responses without acknowledgement
- disguising the use of AI
- using AI for substantial sections of work.
You can support your students by teaching them about appropriate use of AI in Creative iMedia and IT by:
- demonstrating how to reference AI correctly
- showing areas of the NEA where its use is appropriate
- having clear policies for AI use within your department.
If we use the same example as before, in Creative iMedia R094 students need to create a digital graphical product. Students get marks for using tools and techniques, and for how the finished product is laid out and meets conventions.
There is the opportunity for students to use AI to create a finished product. If students use AI to create the whole product, this is inappropriate use of AI and candidates should not be awarded marks for it. AI-created products are fairly easy to spot, and teachers should be familiar with how to supervise candidates and with recognising the more obvious AI creations.
For IT, students need to create an evaluation for R060, or a review for R070. AI tools are able to create evaluations. All the marks in the mark scheme are for what the candidate writes independently. Therefore, candidates should not gain credit in this strand if you suspect AI has been used.
AI use should be easy to spot:
- the output from AI tools is often very generic and does not relate directly to the student's solution
- there may be American spellings in the text
- the writing style will not reflect the student's usual style
- AI often comments on its own capabilities, which students may not remove.
What to do when candidates misuse AI in assessments
Teachers must only accept students' own authentic work. Ultimately, the Head of Centre is responsible for ensuring that students do not submit inauthentic work.
If you suspect AI misuse and the student has not signed the declaration of authentication, your centre doesn’t need to report the malpractice to OCR. You can resolve the matter before the signing of the declarations.
AI concerns within candidate work should be reported with a JCQ M1 form, as outlined in the JCQ AI guidance, available on the Malpractice section of the JCQ website. Please email your completed forms to OCR at malpractice@ocr.org.uk.
Frequently asked questions
Can AI be used to create assets for Creative iMedia NEA tasks?
Yes. Students can use AI to create assets for use in their Creative iMedia NEA tasks. They must reference them correctly in their assets table and follow the JCQ guidance:
“Where AI tools have been used as a source of information, a student’s acknowledgement must show the name of the AI source used and should show the date the content was generated. For example: ChatGPT 3.5 (https://openai.com/blog/chatgpt/), 25/01/2024. The student must retain a copy of the question(s) and computer-generated content for reference and authentication purposes, in a non-editable format (such as a screenshot) and provide a brief explanation of how it has been used.”
Further support
Please refer to the JCQ AI use in assessments: Protecting the integrity of assessment document for further information on managing the use of AI within your assessments.
We also have a range of support resources, including recorded webinars, on our AI support page.
Stay connected
Share your thoughts in the comments below. If you have any questions, you can email us at ocr.vqit@ocr.org.uk, call us on 01223 553998 or tweet us @OCR_ICT. You can also sign up to receive subject updates with information about resources and support.
About the author
Debbie joined the computing team in September 2022, bringing her knowledge as a teacher and subject leader for IT, Computing and Creative Media. She has over 20 years’ experience in education, working in various settings including state schools, private specialist provision and local authorities, and as a marker and moderator for exam boards. She has a degree in Technology Management, a PGCE and a Masters in Teaching and Learning.
Related blogs