
Generative artificial intelligence (AI) in assessments

What is generative AI?

Generative artificial intelligence (AI) describes any type of AI that is used to create text, prose, formulae, code, images, video or audio.

AI outputs can be very human-like, potentially increasing the risk of plagiarism.

AI in session 2023-24

Our current position on the use of generative AI in session 2023-24 has been developed to ensure equity and fairness for all learners studying our qualifications.

Learners cannot submit AI outputs as their own work

Learners are not permitted to use generative AI tools to create outputs – for example text, prose, formulae, code, images, video, audio – that they then submit as their own work for assessment tasks that contribute towards an SQA qualification. These tasks include: exams, unit assessments, coursework, and portfolios. Doing so would constitute plagiarism and could result in awards being cancelled.

AI cannot be referenced as a source

Learners must not reference outputs from generative AI tools as a source for assessment tasks that contribute towards an SQA qualification. There are currently some significant issues regarding the reliability and validity of these outputs, which means referencing the tools could be inappropriate or disadvantageous to learners.

Rationale for our current position on AI


Learners studying towards SQA qualifications should use valid, reliable and authoritative sources of reference to support their work.

There is evidence that outputs from generative AI tools can be biased or incorrect, and can fabricate information. Outputs can also be inconsistent, even when the same prompts are used – making it difficult for assessors to authenticate the sources. For these reasons, outputs from generative AI tools are not currently considered valid or reliable.

Using outputs from generative AI tools as sources may not meet the referencing requirements of specific courses and could impact the number of marks a learner can achieve. For example, some SQA qualifications require sources to be recent and text output from generative AI tools can be difficult, or impossible, to date.

Age restrictions

An important factor, which could impact equity and fairness for learners, is the age restriction that the creators of generative AI tools have placed on their products.

Guidance for centres: authenticating learners’ work

With the availability of AI chatbots, which can quickly produce human-like text, it is important that you are aware of the appropriate authentication steps to take.

We have produced a guide to support centres in ensuring learners’ work is their own:

Authenticating learners’ work – good practice advice for centre staff

Future use of AI

As generative AI advances, the barriers and current flaws within this developing technology may be overcome. We understand the need to embrace the opportunities that new technology offers and will continue to keep our position under review.

Advances in generative AI may open up new opportunities for education and assessment. We must explore these opportunities while balancing the need to mitigate risks to the integrity of SQA qualifications and assessments, as well as ensuring equity and fairness for all learners.

SQA will continue its work on the use of generative AI tools in the assessment context. We will publish further guidance where appropriate.