Academic Integrity and AI – Universities in a Dilemma



by Professor Stephen Wilhite

release date: Tuesday, June 6, 2023, 11:24 p.m.

In March of this year, three academics from Plymouth Marjon University published a paper titled “Chatting and Cheating: Ensuring Academic Integrity in the Age of ChatGPT” in the journal Innovations in Education and Teaching International. It was peer-reviewed by four other scholars and approved for publication. What the paper’s three co-authors did not disclose is that it was written by ChatGPT, not by them.

This case is a prime example of how AI-generated content poses a threat to traditional notions of authorship and intellectual property.

Over the years, many challenges to academic integrity have arisen. Essay mills that sell pre-written essays and other academic works to students have been around for a long time. At one point, even the calculator, introduced in the 1980s, was seen as a threat to academic integrity. In the 1990s, the commercialization of the Internet raised similar concerns.

Burden on educational institutions

But AI-generated content poses a threat on another level, and it is the greatest challenge to academic integrity today. The level of writing and the quality of submitted work are often so high that it is becoming increasingly difficult to prove whether a document or deliverable was written or produced by a machine. Even when the language, the quality of the grammar, and the sophistication of the content appear superior to what a student has produced on their own, the faculty member supervising the student’s work may believe the work was produced by AI yet be unable to prove it conclusively.

Artificial intelligence (AI) can be a valuable resource for students, but it also poses significant challenges to academic integrity. The use of AI tools leaves room for cheating, plagiarism, and fabrication. Institutions therefore have a responsibility to ensure that their students use AI ethically and in accordance with their academic integrity policies.

The use of AI-generated content in classrooms is leading universities to revise their academic integrity policies. For example, my own university, the American University of Ras Al Khaimah, currently advises students not to “use or attempt to use the equivalent of unauthorized assistance. Please note: instructors may prohibit the use of generative AI, including but not limited to generative AI tools such as OpenAI’s ChatGPT and Canva, when completing assignments. If such prohibitions are communicated by faculty, the incorporation of information from such sources into assignment submissions will be treated as a serious violation of expectations of academic integrity.”

At the same time, students should be prepared to use AI appropriately in the classroom. They need to understand not only the capabilities of AI tools but also their potential biases and errors. Students can use AI tools as aids to their research and writing, but they should understand that they cannot rely on them entirely. They should be aware of the need to cite sources properly and honestly, and they should feel comfortable asking their instructor for guidance when unsure how to use AI tools ethically.

What counts as academic misconduct?

Use of AI tools does not automatically amount to academic misconduct unless students have been clearly told that they are not permitted to use AI tools to complete an assignment. Whether the use of AI tools is acceptable depends on how the tools are used. For example, a student might use an app such as ChatGPT to generate a first draft and then revise and update it, since AI-generated output often contains factual errors; in that case, the student’s final draft should cite ChatGPT as a source.

Another reason to integrate AI into the academic learning environment is that AI tools are so widely used in industry that college graduates need to master these skills if they want to succeed in their careers. Therefore, it is important to teach students how to use artificial intelligence tools responsibly before they go out into the world.

Many in our academic community are concerned that as generative AI tools become more prevalent, it will become increasingly difficult to determine the intellectual and creative contributions of individual learners to the work they produce. Traditional citation of sources may be a poor basis for documenting a learner’s original contributions to the work.

Nevertheless, the integration of AI into other technology tools is accelerating. It is very likely that artificial intelligence will be built into everyday word-processing programs such as Microsoft Word and Google Docs, and humans and AI-based technology tools will increasingly co-author texts for different uses and contexts.

Rethinking evaluation

Recognizing the impact of the use of AI in academia as we enter this second era of digital technology, the World Economic Forum (WEF) has issued guidance for educators: “ChatGPT and cheating: 5 ways to change how students are graded” (https://www.weforum.org/agenda/2023/03/chatgpt-and-cheating-5-ways-to-change-how-students-are-graded/).

According to the WEF, these technologies offer opportunities for educators to rethink assessment practices and engage students in deeper, more meaningful learning that can promote critical thinking skills.

The report states: “We see the advent of ChatGPT as an opportunity to revolutionize the traditional approach to assessing students in schools and institutions of higher education, which has relied heavily on tests and written assignments focused on students’ basic recall, memory, and integration of content. For educational institutions, it is neither beneficial nor practical to ban AI or applications like ChatGPT outright.”

AI applications such as ChatGPT will bring major changes to modern education, including its assessment systems. Even as AI software becomes embedded in the classroom, degree-granting institutions must be able to defend the legitimacy of the degrees they award by ensuring that student learning outcomes reflect knowledge and skills embedded in the student’s own cognitive framework.

The WEF notes that for assessment to be effective, AI-assisted work must be structured in a way that allows the learner’s unique contribution to be identified. To foster student awareness of the situations in which AI tools can be used appropriately in the learning process, the WEF encourages students to seek opportunities to participate in setting the learning objectives and standards used to assess assignments. This recommendation relates to the most important aspect of formal teaching and learning programs: how students are graded and assessed.

Authentic assignments

Even if AI tools help students identify and organize important information and practice necessary skills, assignments and their assessment must be authentic to ensure that students are internalizing knowledge and skills. The Center for Innovative Teaching and Learning at Indiana University Bloomington describes authentic assignments as ones “where students need to apply what they have learned to new situations, and determine what information and skills are relevant and how they should be used.”

Examples of authentic assignments offered by the Center include: in business, creating and presenting a marketing plan for a fictitious company; in computer science, developing and demonstrating an app that solves a specific problem; in engineering, designing an improved battery for electric vehicles and presenting its features to a panel of experts. Even in these authentic assessments, however, it is important to structure the assessment so that supervisors can confidently determine that the work presented reflects knowledge and skills internalized by the student.

Even with the increased use of authentic assessment, written submissions will continue to be an important means for students to demonstrate their learning. Universities should therefore pursue multiple ways to reduce the likelihood of undisclosed AI use. For example, because AI software does not have access to a student’s personal experiences or to events in class, assignments can require students to connect those experiences or events to course concepts. Students can also be asked to combine a short written submission with oral questioning in class about that submission.

Universities can also require written work during class meetings and prohibit the possession of electronic devices during such writing exercises. In other words, universities can encourage “flipped” teaching, in which reading, lectures, and video-watching are done at home and writing about the content is done during class.

Additionally, if written assignments are completed outside of class, universities should collect an in-class sample of student writing as a “baseline” against which written assignments completed outside of class can be compared.

However, the continued evolution of AI software means that some AI tools will be able to mimic the writing style of individual students. Universities will therefore want to identify software for detecting AI-generated content and make it available to faculty, with the understanding that updating AI-detection software must be an ongoing process. Educators should also be aware of strategies they can employ themselves to identify AI-generated content. For example, careful evaluation of the cited references may reveal “suspicious” or fabricated references.
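As one illustrative sketch of the reference check described above, a short script can pull DOI-shaped identifiers out of a reference list and flag entries that have none, as candidates for manual verification. The function names, regex, and sample entries here are assumptions for illustration, not tools mentioned in this article; a real workflow would also look each DOI up in a registry such as Crossref.

```python
import re

# DOI format: "10." + a 4-9 digit registrant code + "/" + a suffix.
# This checks only the *shape* of a DOI; confirming that it resolves
# to the cited work would require a network lookup.
DOI_PATTERN = re.compile(r"\b10\.\d{4,9}/[-._;()/:A-Za-z0-9]+\b")


def extract_dois(reference_text: str) -> list[str]:
    """Return every DOI-shaped string found in a block of references."""
    return DOI_PATTERN.findall(reference_text)


def flag_suspicious_references(references: list[str]) -> list[str]:
    """Flag reference entries that contain no DOI-shaped identifier.

    A missing DOI does not prove fabrication (books and older papers
    often lack one), so flagged entries are candidates for manual
    checking, not verdicts.
    """
    return [ref for ref in references if not extract_dois(ref)]


# Invented sample entries; 10.1000/... is a placeholder prefix.
refs = [
    "Doe, J. (2023). A real-looking entry. doi:10.1000/example.123",
    "Smith, A. (2022). An entry with no identifier at all.",
]
print(flag_suspicious_references(refs))
```

The point of the design is that the script narrows the pile: a grader checks only the flagged entries by hand rather than every citation in every submission.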

The challenge for higher education institutions is to adapt the assessment of student learning to minimize the potential for AI-generated content to invalidate assessments, while ensuring that students can use AI tools honestly and effectively. Faculty and university leaders have, over the centuries, welcomed and promoted the generation of new knowledge and helped put it to greater purposes. I am confident they will meet the challenge of generative AI as well, proving themselves once again to be dedicated thought leaders.

(Professor Stephen Wilhite is Senior Vice Chancellor for Academic Affairs and Student Success at the American University of Ras Al Khaimah)

