AI chatbots like OpenAI's ChatGPT have been repeatedly shown to provide misinformation, hallucinating entirely fabricated sources and facts and confidently delivering wrong answers to questions. As a result, many educators view AI tools with skepticism. So of course, OpenAI and its competitors are targeting universities and promoting their services to students.
According to The New York Times, OpenAI is in the midst of a major push to make ChatGPT a fixture on university campuses, replacing many aspects of the college experience with AI alternatives. The company hopes that students will receive “personalized AI accounts” as soon as they arrive on campus, much like the school email addresses they already get, according to the report. It envisions ChatGPT acting as everything from a personal tutor to a teaching aide to a career assistant that helps students find work after graduation.
Though the education world initially greeted AI with mistrust and outright bans, some schools have already bought in. Schools such as the University of Maryland, Duke University, and California State University have all begun signing up for OpenAI's premium service, ChatGPT Edu, and integrating chatbots into different parts of the educational experience.
OpenAI isn't the only company aiming at higher education. Elon Musk's xAI has offered students free access to its chatbot Grok during exam season, and Google currently offers students its Gemini AI suite for free through the end of the 2025-26 academic year. But those offerings sit outside the actual infrastructure of higher education, which is what OpenAI is trying to run.
It's a shame that universities have chosen to embrace AI after initially taking a hardline stance against it over fears of cheating. If the goal is to learn and retain accurate information, there is already a fair amount of evidence that AI isn't all that beneficial. A study published earlier this year found that reliance on AI can erode critical thinking skills. Others have similarly found that people “offload” more difficult cognitive tasks and lean on AI as a shortcut. If the point of a university is to teach students how to think, AI undermines it.
And that's before you even get into the misinformation. To see how AI performs in a focused educational setting, researchers had various models answer questions about the material in patent law casebooks. The models all produced false information, hallucinated cases that didn't exist, and made errors. The researchers reported that OpenAI's GPT model gave responses that were “unacceptable” and “harmful to learning” about a quarter of the time. That's not ideal.
There are other harms to consider, too, given that OpenAI and other companies want chatbots to seep not just into classrooms but into every aspect of student life. Reliance on AI chatbots can negatively affect social skills. And the simple fact that universities are investing in AI means they aren't investing in things that foster more human interaction. A student who goes to meet a tutor, for example, engages in a social interaction that requires emotional intelligence to build trust and connection, ultimately strengthening their sense of community and belonging. A chatbot simply spits out answers, which may or may not be correct.
