AI In Education

AI is changing the way we live, work, and learn. So how does that impact education?

In education, AI can be used to personalize learning, provide feedback, automate (administrative) tasks and more. However, the use of AI in education requires careful consideration of ethical, legal and social implications.

So, what's the deal?

In recent times, generative AI has become widely available to the public. Large tech companies are rapidly releasing and upgrading their own AI models and systems; examples include OpenAI's ChatGPT and DALL-E image generator, with others like Microsoft and Google jumping on the bandwagon as well. These technologies are also being integrated into existing, contemporary software.

University essays and programming code can be generated at the press of a button, which without a doubt changes the way students work with information. We post regular updates on this website about the current technologies.

How do I incorporate it in my education?

Position of the UT

The University of Twente's perspective is that this technology will only continue to grow and have more impact. While measures like detection are possible, they will not be the answer to this change in education.

Executive Board UT

We must embrace AI technology carefully and strengthen the human factor in education to adapt and deal with the technology responsibly and ethically.

Executive Board UT

Examination boards have also formulated their recommendations and stance on the use of generative AI.

  • Explanatory note on Academic Misconduct

    “While embracing the technology, misuse of (generative) Artificial Intelligence applications could be considered fraud. The Student Charter describes what the university considers cheating or fraud. Cheating/fraud refers to any action or negligence on the part of a student that precludes an accurate assessment of the student’s knowledge, understanding and skills (Article 6.6 Student Charter).

    Article 6.6 paragraph 1 of the Student Charter states that (any form of) assistance, resources or devices (electronic or technological) other than the ones whose use the examiner or supervisor has permitted prior to the start of the study unit and/or exam or test, or whose use the student knew or ought to have known was not permitted during a test or exam is considered cheating/fraud. Generated Artificial Intelligence programs or applications are considered “assistance, resources or devices” as mentioned in the article referred to above. Consequently, under the current Student Charter, prior permission by examiner or supervisor is needed for the use of generated Artificial Intelligence. A written assignment, project, essay or thesis falls under the umbrella of test or exam.

    Article 6.6 paragraph 4 of the Student Charter states that plagiarism is a particular kind of cheating/fraud, which occurs when the student uses someone else’s work or previous work of their own, without correct referencing. This includes, but is not limited to using parts of another text (printed or digital) without referencing (also if minor changes have been made) or using software without referencing. This means that under the current Student Charter, the use of Artificial Intelligence needs correct referencing.

    The Examination Board decides whether cheating/fraud has occurred. The Examination Board of the educational programme drafts Rules & Regulations on cheating/fraud. These may include additional provisions to the Student Charter and specify what measures will be taken in cases of (suspected) cheating/fraud.”

(Dis)allowing usage of AI

Allowing or disallowing the different AI tools is largely a matter of choices made within programmes and courses. To promote transparency, the working group formulated this document to help teachers and students at the University of Twente.


“Are my assessment methods still reliable?” you might ask yourself. Generally speaking, fraudulent use of AI is only possible when no supervision is involved. Of course, students can still use AI themselves to prepare for a supervised assessment. To help you see the potential risk in your type of assessment, the Centre of Expertise in Learning & Teaching has drafted this graph.

If you want to ensure the reliability and authenticity of your assessment, there are some general guidelines to keep in mind.

  • Mix several types of assessment, especially if you currently rely on unsupervised assessment.
  • Try out your current assessment in one of the generative AI tools to gauge the quality of the output.
  • Explicitly state your position on the use of AI in assessment to your students.
Detecting ChatGPT 3.5

If you specifically want to check whether work has been (partially) done by ChatGPT 3.5, here are a couple of tips.

  • Text generated by ChatGPT can be overly correct in grammar.
  • Look for faults or missing information in references and sources.
    • To counter this, you can require recent sources (from 2021 onwards) and students' own experiences/examples.

As this topic is highly dynamic, some of the provided advice and guidelines might be overruled by new technological developments.

Who can support me?

The Technology Enhanced Learning & Teaching team can help you with questions or support requests regarding AI in education. You can reach out to them via email.

You can call me TELT
Technology Enhanced Learning & Teaching
Supporting you in the use of technology in education

New technologies provide educators with a lot of possibilities. The TELT team assists teaching staff in the effective use of new technologies in their courses.