Using AI assistance | Delivery Support | NCFE
Using AI to assist marking and feedback

As artificial intelligence (AI) tools become more accessible, many educators are exploring how they can support marking and feedback. But how can you use AI effectively, ethically, and in line with current expectations? 

Our new guidance offers practical advice on how to integrate AI into your marking processes. This guide is designed to help you reduce workload, improve efficiency, and maintain high standards in teaching, learning and assessment. 

What’s inside the guide? 

Using AI with Confidence is a practical guide designed to help educators and centre leaders make informed, responsible decisions about using AI in marking and feedback. It sets out clear requirements and examples of best practice for using AI in centre-marked assessments, helping you stay compliant and consistent across your organisation. 

It also aligns with current guidance from JCQ, Ofqual and the Department for Education, so you can be confident your approach meets national expectations. 

The guide offers practical support for addressing four key challenges: 

  • Accuracy and quality – how to ensure AI supports human judgement and maintains high standards. 
  • Privacy – how to protect learner data and comply with UK data protection law. 
  • Transparency – how to build trust with learners and carers by being open about how AI is used and how decisions are made. 
  • Knowing your learners – how to preserve the essential connection between teachers and learners and ensure feedback remains meaningful and personalised. 

Whether you're just beginning to explore AI or looking to refine your current approach, this guide will help you feel equipped, confident and ready to use AI tools effectively and ethically in your assessment practice. 

Download the guidance


Using AI with confidence: A guide to the marking and feedback of summative assessment in centres


Frequently asked questions (FAQs)

Assessors remain fully accountable for the marks and feedback they give, even when they use AI tools to assist them. Centres must ensure that assessors have clear accountability for final marks and feedback. The requirement for assessors to review and critically evaluate AI outputs ensures that they retain control and uphold quality. 

The guidance emphasises that AI must support – not replace – human judgement, in line with Ofqual's and JCQ's current stance. To ensure this supportive role, assessors must review all submissions in full and critically evaluate AI outputs for factual accuracy, bias, and alignment with assessment criteria.

Centres should also ensure transparency by informing learners when AI is used to assist marking or feedback, as this builds trust and aligns with NCFE’s guidance on AI misuse. 

Absolutely not! We want to support centres in achieving the best possible marking and feedback practices for the benefit of learners, however they choose to design and implement this activity in their centre. 

Centres do not need permission to use AI assistance for marking learner work, but they must inform NCFE through their communications with their EQA. Permission is only required if NCFE intellectual property will be ingested into an AI model. 

Input from Ofqual, JCQ, and DfE on this subject makes it clear that centres are allowed to use AI assistance to mark learner work; however, until now there has not been a clear set of instructions on how to do this effectively and safely. This guidance fills that gap, providing NCFE's customers with a clear set of rules about what they need to do before rolling out AI assistance to mark learner work, and how they can then build on that foundation with best practice ideas to continue improving it. 
No, this guidance only applies to centre-based marking of summative assessments on NCFE qualifications. Decisions about if and how to use AI assistance to mark formative assessments sit with centres, and NCFE examiners do not use AI assistance to mark external assessments.

No. Intellectual property (IP) concerns relate to AI model training, not to storing learner work for plagiarism detection or comparison with future submissions. Plagiarism checks do not create additional IP issues. 

No, because there are too many variations in how centres might implement AI-assisted marking (such as build, buy, or DIY solutions). However, the elements listed as “required” in NCFE’s guidance are a good starting point for a checklist. 

Yes. Centres must seek permission before allowing NCFE materials to be ingested into an AI model for marking or any other purpose. This ensures NCFE can set conditions of use and maintain quality and currency of materials. Please note that some content may also be subject to third-party copyright (such as the Department for Education). 

Yes, learners may use AI as an assistant to review their work and suggest improvements, provided they make the changes themselves and do not copy and paste AI-generated text. This is not permitted if the learning outcomes being assessed explicitly include the learner’s ability to perform the same task (such as reviewing and improving text). 

Not at present. NCFE is exploring and testing AI applications to build trust and evidence of what works, but we are not yet at the stage of publishing models, tools, or new products.