Lack of guidelines and expertise makes AI difficult to use in schools

When Emily Musil reviews the report cards of her 11- and 13-year-old children, she sees typical categories like language arts, math, and social studies.

But she hopes that one day, a new metric will emerge to assess all children: artificial intelligence literacy.

“Yes, I think we can get there,” Musil said, noting how rapidly elementary school offerings have advanced, from typing lessons to computer literacy instruction to coding instruction. But for now, “as parents, we don’t know how deeply our children understand computing and AI tools. This needs to change.”

Musil is managing director of social innovation at the Milken Institute, a nonprofit think tank. She was the lead researcher on a report released in November focused on building the nation’s talent engine in the age of AI.

“What choices do you need to make if you value economic mobility?” she asks. “We are falling behind, because technology is advancing so rapidly and becoming connected to every job.”

The report called on K-12 educational institutions to emphasize AI literacy alongside critical thinking and decision-making skills. But making room for that in an already crowded curriculum, let alone tackling the subtleties of AI technology, may be difficult. Achieving it will require the concerted efforts of school systems, individual schools, and their leaders.

Lack of standards and expertise

Federal efforts on AI education began during the Obama administration and were recently reinvigorated by the Trump administration’s executive order Advancing Artificial Intelligence Education for America’s Youth. But when it comes to local implementation, it’s largely up to schools and administrators, and more than half of America’s schools and school districts, many of which are rural or classified as Title I, have no standards at all.

According to the report, 60 percent of U.S. schools or districts lack guidance on the use of generative AI. Many schools have told EdSurge that because the technology is changing so rapidly, decisions are often left to the teacher’s discretion.

The lack of standards may be compounded by a lack of expertise in AI, and in technology overall, in the classroom. For example, Milken’s report found that only 17 percent of today’s computer science teachers hold a computer science degree. The report does not delve into what these teachers majored in instead, but it notes that some teachers are asked to cover subjects outside their training as their scope of work grows.

The same phenomenon could occur with AI literacy curricula.

“If you’ve been a teacher for 20 years and all of a sudden you’re asked to teach medieval history, you might not be an expert in it, but you have to do something about it,” she says. “So you’re teaching students things you don’t necessarily have deep skills in.”

Collective action

The report lays out four specific focus areas for K-12 schools, including the ethical and critical use of AI tools; combining human cognition with the use of AI; and learning not only through screens, but also through dialogue with other people.

For students, “K-12 education is often the first place they encounter STEM and computing topics,” the report states. “K-12 has become an even more important point of intervention as an AI-driven workforce demands specialized skills earlier. Building future-proof curriculum and support systems can address gaps early and support student growth.”

Those are lofty goals. A related challenge is the attrition of female students from STEM fields. The report found that nearly half (49 percent) of elementary school computer science students are girls. That drops to 44 percent by middle school, 33 percent by high school, and about 20 percent by college graduation.

Milken’s report acknowledges that there are no easy, silver-bullet solutions for achieving these goals. It argues that federal efforts are necessary, and Musil suggested that employers and individual philanthropists can help fund schools, advocate for curriculum changes, and collaborate in ways that benefit both students and recruiting organizations.

“This report makes clear that this challenge is national and the solutions must be collective,” said Michael Ellison, co-founder and CEO of CodePath, a nonprofit focused on diversifying the technology industry. The organization helped produce the Milken Institute report. “Philanthropists, industry leaders, policymakers, and educators must all act to reimagine education and workforce systems for an AI-driven world.”

AI integration risks

However, there are also considerations when integrating rapidly changing technologies. A report released last month by the Center for Democracy and Technology found that the introduction of AI in schools is associated with an increased risk of worse outcomes for students. Half of the students surveyed said that using AI in class makes them feel less connected to their teachers.

“While many are touting the potential of AI to transform education, we cannot afford to let negative impacts on students get lost in the chaos,” said Elizabeth Laird, director of CDT’s Sociotechnical Equity Project, in a statement. “Our research shows that the use of AI in schools comes with real risks. By recognizing these risks, education leaders, policy makers, and communities can begin prevention and response efforts to ensure that the active use of AI is not overshadowed by harm to students.”

And the Department of Education warned about its unchecked use in a 2023 report titled “Artificial Intelligence and the Future of Teaching and Learning.”

“We especially urge leaders to avoid celebrating the magic of AI or focusing only on promising applications and outcomes, and instead ask critically how AI-enabled systems and tools work in educational environments,” the report said.

But Musil points out that regardless of whether a school has specific rules on AI integration, students will be using the technology in their free time, so it’s best to teach them how to avoid these negative consequences.

“My daughters are being told that using AI is cheating, but how they engage with AI will have a lot to do with determining their future,” she says. “When we hire, we want someone who understands when using AI is cheating, when it supports human thought, and when it replaces human thought.”
