For many students over the last five years, coping with college turmoil has meant finding ways to complete assignments without actually doing them. This practice is called contract cheating, and it has grown into a multi-million-dollar business.
With the rise of accessible artificial intelligence (AI) chatbots such as ChatGPT, students can now generate assignments to submit to class without human intervention. The output of AI chatbots is inconsistent, however, and can include irrelevant or misleading information.
Students' means of cheating are becoming more sophisticated, and institutions and instructors are rushing to create policies and precautions to address them. Some of these tools are designed and tested in our own backyard.
What is contract cheating?
Contract cheating is the act of entrusting school assignments to a third party, usually in exchange for a fee. This can be arranged through peers, websites or social media. Countless online hubs, sometimes called essay mills, sell work to students of various majors and levels — whether a solution to a homework problem or an entire essay — which students then submit under their own names, avoiding plagiarism flags.
In those cases, the third party involved is a human being. Submitting AI-generated work is still defined as contract cheating because the original work was outsourced and falsely presented as the student's own.
The practice is frowned upon in academia because its affordability and convenience lead students to opt out of engaging in class content.
Contract cheating and ghostwriting have existed in many forms for centuries. Think of writing song lyrics for someone else, or asking someone you know to do a task for you in exchange for pocket money. The Internet streamlined that process, however, and the emergence of freelance sites such as Chegg, Course Hero, and Fiverr quickly increased its popularity. These sites allow students to digitally connect with people, and now AI, who do their homework for a reward.
How will artificial intelligence affect contract cheating?
Paid contract cheating and AI-powered contract cheating are attractive to students for the same reason: both are difficult to detect as plagiarism.
“Teachers don’t have the ability to realize that something is copied from a source they don’t have. Because if you assume you’re creating a new work product, it doesn’t match anything,” said James Walden, Ph.D., professor of computer science. “But that’s a pretty big assumption.”
It is common for assignment contractors to receive requests for similar assignments, which can result in templated, standardized work being flagged as unoriginal. By contrast, each output of an AI chatbot is idiosyncratic, reducing its visibility to instructors and tools looking for cheating.
AI chatbots are already hampering some essay mills, and some contractors who make a living from the trade report declining business.
How is contract cheating detected and punished?
Software like TurnItIn is used to flag plagiarism in academic settings. It compares language patterns against repositories of other works, checks for correct citations, and determines with varying levels of accuracy whether a paper contains plagiarized material. Contract cheating (and now AI-generated work) typically goes unflagged, however, because the work is original, even though it wasn't created by the student.
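The weakness described above can be illustrated with a toy similarity check. This is not TurnItIn's actual algorithm, just a minimal sketch of the general fingerprint-matching idea: split texts into overlapping word n-grams and measure how many of a submission's n-grams appear in a known source. Purchased or AI-generated text shares almost no n-grams with any repository, so it scores near zero.

```python
def ngrams(text: str, n: int = 5) -> set:
    """Split text into a set of overlapping word n-grams (case-insensitive)."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_score(submission: str, source: str, n: int = 5) -> float:
    """Fraction of the submission's n-grams that also appear in the source."""
    sub = ngrams(submission, n)
    if not sub:
        return 0.0
    return len(sub & ngrams(source, n)) / len(sub)
```

A verbatim copy scores 1.0 against its source, while freshly written text scores near 0.0 against everything in the repository — which is exactly why original outsourced work evades this kind of detection.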
A new tool called Auth+, designed by Northern Kentucky-based education technology startup Sikanai, takes a different approach to verifying student authorship. The software connects to learning platforms such as Canvas and Blackboard and asks students a series of AI-generated questions to test their familiarity with the text.
“We ask questions about your writing style, your content, and your memory of what you’ve written,” said Barry Burkett, co-founder and executive director of Sikanai. “The choices you make and your answers give the instructor a score of how familiar you are with the work.”
Auth+ is being piloted at NKU's Haile College of Business and at other colleges and universities, including Georgetown College, the Institute of Corporate Accreditation of Pakistan and the University of Manitoba. Burkett said the software has been well received so far.
Currently, the technology only tests familiarity with written work, but as research, development, and funding grow, Burkett hopes to expand it to test familiarity with work in computer science, languages other than English, and other fields of study.
Burkett said the Auth+ model of cheating prevention maintains academic integrity and holds students accountable for their submissions without being invasive or accusatory. He argues that programs such as TurnItIn and LockDown Browser, which access and analyze student data, violate the trust between students and teachers.
Instructors differ on the ethics of software that tries to detect plagiarism and contract cheating. Dr. John Alberti, head of the English department, said many English teachers consider checking students' writing with tools like TurnItIn a betrayal of trust.
“It automatically puts all students under suspicion,” Alberti said.
Tools like Auth+, despite the lack of data collection, still cross the line of trust between students and instructors, Alberti argues.
“This is still a policing tool,” Alberti said.
According to Alberti, common practices in English studies for encouraging academic honesty without such tools include designing writing assignments around multiple drafts and workshop activities that emphasize the revision process.
In foundational courses, however, especially those in a student's field of study, ensuring that students master the fundamentals is essential to their competence as they progress through a university program and into the workforce. Walden fears AI chatbots are so capable of generating accurate basic code that it will be difficult to design assignments that prevent cheating in introductory programming classes. A front-end cheating-detection tool could be a necessary part of the submission process if that is what it takes to keep students from skipping the learning, Walden said.
Sometimes students engaging in contract cheating are easy to spot, Walden said. Work that contradicts the quality a student typically submits is a telltale sign.
When that happens, Walden says he searches popular contract-cheating sites such as Chegg and Freelancer for the assignment. Many contract-cheating sites disguise themselves as legitimate academic resources, so they usually have search functionality built in, he said. If he finds his own assignment in a site's database, he can check its transaction record to see whether the student purchased the assignment's solution.
However, AI chatbots present a new set of problems for detecting contract cheating. One solution being hastily pursued is subtle watermarking to identify work generated by a particular chatbot, Walden said.
Regardless of how contract cheating was carried out, asking a suspected student to describe the thought process behind a solution is usually a reliable check of whether they understand the assignment.
“Ask them about the assignment, especially ones that gave really professional results,” Walden said. “With code, it quickly becomes apparent when the solution cannot be explained.”
And as AI tools gradually seep into educational and professional norms, policies at the institutional and classroom level are likely to evolve to clarify their appropriate and inappropriate uses.