A look at AI in the classroom: Cornell professor deploys AI platform to manage coursework and class organization


Various classes at Cornell University are beginning to integrate artificial intelligence to supplement student learning and improve efficiency.

Cornell University recently appointed Thorsten Joachims, a professor of computer science and information science, to a new position as director of artificial intelligence strategy, effective Jan. 1.

In a statement to The Sun, Joachims wrote that when he taught CS 3780, “Introduction to Machine Learning,” he allowed the use of generative AI on his projects. Instead of spending hours asking TAs for help debugging difficult code, students were able to ask the AI questions related to the course content.

“Debugging was less painful, and TAs were able to interact with students about interesting machine learning questions rather than why their code was throwing mysterious errors,” Joachims wrote.

Steven Jackson, vice provost for academic innovation and a professor of information science, wrote that chatbot tools allow students to get instant answers to simple questions, reducing the need to rely on office hours for trivial matters.

“We look forward to these experiments [with classes using chatbot tools] continuing and expanding as we see evidence that they enhance student learning and engagement,” Jackson said in a statement to The Sun.

As the use of AI to supplement instruction increases, the Cornell University GenAI Education Working Group, which includes faculty, staff and students, is discussing questions about AI in the classroom. The group, part of the University’s AI Advisory Board, aims to ensure students receive quality instruction, feedback and grading.

According to its website, the group pursues these goals through research on GenAI, implementation of a campus-wide program on critical AI literacy and collaboration with Cornell Information Technologies and researchers to evaluate new software.

One of these new AI platforms, HiTA, is used in departments including the language programs, information science and computer science, as well as across the University, including the SC Johnson College of Business and Weill Cornell Medical College, said René Kizilcec, a professor of information science.

According to its website, HiTA supports teaching and learning in higher education and operates differently from generative AI platforms such as ChatGPT, Microsoft Copilot and Google Gemini.

“[Generative AI] is designed to give answers and solve problems right away, which is not what you want in education, where learning is a process and where learners need to be reflective and engage intentionally,” Kizilcec said in an interview with The Sun.

Unlike these generative AI chatbots, HiTA does not directly solve students’ homework problems. According to Kizilcec, the purpose of the site is to provide hints and help students understand concepts.

Cornell University classes typically use Canvas as a site for course organization, presentations, homework, and exams. Classes such as INFO 4100: “Learning Analytics,” which Kizilcec teaches, use HiTA instead to provide course assistance and feedback to students.

In INFO 4100: “Learning Analytics,” course materials are published through HiTA, and the program grades homework assignments in parallel with “comprehensive verification by TAs,” according to the course syllabus. Students can also ask the HiTA chatbot for homework help.

HiTA is trained on the material of each course in which it is used so that it can assist students.

“This is very important because HiTA is based on course material and every course is slightly different,” Kizilcec said. “I don’t want to confuse students with terms or concepts that aren’t actually in the course.”

The software also allows instructors to monitor the AI. Kizilcec said he has the authority to review chatbot responses and supervise HiTA to ensure responses are not too technical or vague.

Kizilcec said HiTA teaches students as if “[he] had unlimited time to answer every question at every odd hour of the day.”

Cameron Pien ’27, a student in INFO 4100, said HiTA increases efficiency in class because students have access to support at all times. In previous classes that did not use the AI platform, Pien said, it took a long time for students to receive help with code.

“If you had to ask a human TA for help with your code, you would have to wait a long time,” Pien said. “It sometimes took me a while to understand how the computer interface worked.”

In addition to assisting students, the large language model is used to summarize the conversations students have had and identify the main themes that arose across them.

Kizilcec reads the summaries to understand common questions and misconceptions. Based on them, he brings topics up at TA meetings so that, if relevant, they can be addressed in section, Kizilcec said.

AI assistance is also being introduced to non-STEM courses. Another course that incorporates AI is SPAN 2090, “Intermediate Spanish I.” According to the website, this class uses Chitter Chatter, which can be used as a conversation partner to improve language fluency.

Elly Burlier ’29, a SPAN 2090 student, said the class allows students to earn extra credit by completing weekly assignments that include conversations with an AI. Chitter Chatter speaks to students in Spanish, and once their work is submitted, the AI gives them feedback on where to improve.

Burlier expressed ambivalence about taking advantage of the extra credit opportunity.

“I have a hard time with it because I am ethically very opposed to AI,” Burlier said. “I think [the] environmental impact and human dependence on it are problems, and I don’t like it being introduced so heavily into education.”

At the same time, Burlier hopes to benefit from the extra credit opportunity and the feedback Chitter Chatter provides.

“I also feel that this platform is actually very useful,” Burlier said. “It was great to be able to practice conversation [in] a lower-stakes way, since you’re just talking to a computer.”

Pien told The Sun that she has noticed varied and gradual acceptance of AI by professors.

“I feel like I’m noticing more diversity in how different professors are approaching AI policy,” Pien said. “We’re seeing a lot more flexibility in the way professors are taking it, and maybe thinking about it [as] more of a social necessity than a threat.”

Pien explained that while AI has been implemented smoothly in STEM classes, she would feel more negative if it were introduced in humanities courses.

“If we’re talking about very simple code where there’s a clear correct answer, [AI grading is] totally fine, but if this were introduced into a humanities class, I would feel more negative about the AI grading my work,” Pien said. In particular, she is wary of AI grading students’ writing because the grading criteria are not as clear-cut.

Kizilcec acknowledged that there is a fine line between when AI should and should not be used in education.

“I’m not advocating for everyone to use AI everywhere,” Kizilcec said. “What’s really important is to be very deliberate and intentional about how we incorporate AI into the work that we do.”

The increased use of AI in the classroom does not reduce the responsibility placed on teachers. They are still expected to take an active role in the classroom and engage with students.

“Instructors are ultimately responsible for the accuracy and validity of the grades they assign, and we take that responsibility seriously,” Jackson wrote. “We believe all of this work is an extension and enhancement of faculty and TA feedback, not a replacement for it.”

