Dive Brief:
- The majority of both students (86%) and teachers (85%) reported using artificial intelligence during the 2024-25 school year, according to survey data released Wednesday by the Center for Democracy & Technology.
- However, CDT said the rapid increase in the use of AI in educational settings is heightening risks for students. Its research shows that the more AI is used in schools, the more likely students are to report data breaches and ransomware attacks, sexual harassment and bullying, AI systems not working as designed, and troubling interactions between students and AI tools.
- In an Oct. 7 letter to U.S. Secretary of Education Linda McMahon, CDT and nine other education, civil rights, library and technology organizations pointed to these risks and urged the U.S. Department of Education to incorporate its July guidance on responsible AI use into the grant and research programs it manages for implementing AI in schools.
Dive Insight:
In the letter to McMahon, the groups said the findings demonstrate the need to address the risks that come with widespread use of AI in schools, especially as the Trump administration continues to prioritize AI in education. Still, school district leaders have raised concerns about how the Department of Education can successfully oversee AI implementation in schools given the closure of its Office of Educational Technology earlier this year.
For example, the CDT survey found that half of students say using AI in class makes them feel less connected to their teachers, and 38% say they feel more comfortable talking to an AI than to their parents. Yet only 11% of teachers reported receiving training on what to do if they believe a student is using AI in a way that could harm their well-being.
CDT highlighted concerns that some students are using AI tools to develop problematic relationships. In the 2024-25 school year, 42% of students said they or a friend used AI for mental health support, as a friend or companion, or as a way to escape from real life. About 19% also reported using AI to develop romantic relationships.
Children's media safety and mental health organizations have sounded the alarm over this trend and objected to the use of AI companions. Advocates say AI companions pose serious mental health risks to children and teens, especially those with conditions such as depression, anxiety, ADHD and bipolar disorder.
To address the issue of AI companions, one mental health expert recently recommended that schools offer digital literacy programs to educate students about these tools. The expert also advised schools to make clear to students that AI companions are not human and that important issues should be discussed with a trusted adult.
CDT also said AI could pose risks to schools' cyber defenses: the more teachers rely on AI for school-related tasks, the more likely they are to report a major data breach at their school.
CDT conducted an online survey between June and August 2025 among 1,030 students in grades 9-12, 806 teachers in grades 6-12, and 1,018 parents of children in grades 6-12.
Additionally, 36% of students surveyed reported that deepfakes — AI-generated videos, photos or audio recordings made to appear real — were an issue in their schools during the 2024-25 school year. Despite those concerns, less than a quarter of teachers said their school had announced a policy for handling deepfakes, particularly those depicting sexually explicit images shared without a person's consent.
In May, President Donald Trump signed the Take It Down Act into law, making it a crime to use AI to create deepfake images without the subject's consent.