This essay is based on a transcribed conversation with Lisa Morimoto, a senior lecturer in economics at the University of London, England. The following has been edited for length and clarity.
Students have always cheated.

I have been a lecturer for 18 years and have dealt with cheating throughout that time, but I've seen significant changes since AI tools became widely available in recent years.
There is definitely a positive side to AI. Access to information is much easier, and students can use these tools to improve their writing, spelling, and grammar, so I see fewer badly written essays.
However, I think some of my students use AI to generate essay content, drawing information from the internet rather than using class material to complete assignments.
AI is supposed to help us work more efficiently, but it's the reason my workload is skyrocketing. I have to spend a lot of time figuring out whether the work my students hand in is really their own writing.
I decided to take dramatic action and changed the way I assess students, to encourage them to be more creative and less reliant on AI. The world is changing, so universities cannot stand still.
Cheating is becoming harder to detect because of AI
I have been working at the University of London since 2012. My teaching focuses on ecological economics.
Initially, my teaching was exam-based, but students were anxious about one-off exams, and I found that the results didn't always reflect their ability.

Eventually, I switched to essays. Students chose their own topics and applied the theory they'd learned. That worked well until AI came along.
Cheating used to be easy to spot. One or two students might copy huge chunks of text from internet sources, leading to plagiarism cases. Even two or three years ago, inappropriate AI use was easy to detect because of telltale signs like a robotic writing style.
Now, more sophisticated AI technology is making detection difficult, and I think the scale of cheating is increasing.

I might read 100 essays and find that some are strikingly similar, using examples from the same case studies, which I never taught.
These examples are usually found on the internet, so I suspect students are using AI tools that pull them in. Some essays cite 20 references, none of which are from the reading list I set.
Students are allowed to use examples from internet sources in their work, but I worry that some used AI to generate essay content without ever reading or engaging with the original source.
I've started running students' work through AI-detection tools, but I know this technology has limitations.
AI tools are an easy option for students who feel pressured by the amount of work they have to do. With university fees increasing and many students working part-time, it makes sense that they'd use these tools to get the job done faster.
There is no clear-cut way to prove cheating
During the first lecture of my module, I tell students that they can use AI to check grammar and to better understand the literature, but that they cannot use it to generate responses to assignments.
SOAS has guidance on students' AI use that sets out a similar principle: AI must not be used to generate essays.
For the past year, I have sat on the university's academic misconduct panel, dealing with students across departments who have been flagged for inappropriate AI use.

Students point to these guidelines and say they used AI to support their learning, not to write their responses.
It can be difficult to make a decision: you cannot read an essay and be 100% sure whether it was AI-generated. It is also hard to draw the line between cheating and using AI to support learning.
Next year, I'm dramatically changing my assignment format
My colleagues and I talk about the negative and positive aspects of AI, but we acknowledge that we still have a lot to learn about the technology ourselves.
The university encourages lecturers to change their teaching and assessment practices, and at the departmental level we often discuss ways to improve things.
I send my two young children to schools with alternative, progressive education systems rather than mainstream UK state schools. Looking at how my kids are educated, I have been trying two alternative assessment methods this year. I had to go through a formal process with the university to approve them.
First, I ask students to select a topic and create a summary of what they learned in class. Second, they create blogs, translating their understanding of highly technical terms into a more accessible form.
My goal is to make the assignments more personal and creative, so that they are harder to outsource to AI.
Old assessment models based on memorizing facts and regurgitating them in exams are no longer useful; ChatGPT can easily produce a polished summary of that kind of information. Instead, educators need to help students develop soft skills, communication, and critical thinking.
In a statement to BI, a SOAS spokesperson said students are guided to use AI in a way that "maintains academic integrity." They said the university encourages assessments that are more difficult for AI to replicate and has put in place "robust mechanisms" to investigate AI misuse. "The use of AI is constantly evolving and we regularly review and update our policies to accommodate these changes," the spokesperson added.
Do you have a story to share about AI in education? Contact this reporter at ccheong@businessinsider.com.