A high schooler's science project could use AI to help prevent suicide

Machine Learning


For a local science fair, Siddhu Pachipara designed an app that uses AI to scan text for signs of suicide risk. He thinks it could one day help replace outdated diagnostic methods.

“Sometimes our writing patterns reflect what we think, but they don’t really extend this far,” he said.

The app brought him national recognition and a trip to Washington, D.C., where he spoke on behalf of his peers. It is one of many efforts underway to use AI to support young people's mental health and to better identify those at risk.

Experts say this type of AI, called natural language processing, has been around since the mid-1990s. And it is not a panacea. "Machine learning is helping us get better. The more data we have, the better we can make the system," said Matthew Nock, a Harvard University psychology professor who studies self-harm in young people. "But chatbots are no silver bullet."
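As a loose illustration of what this kind of natural language processing involves, here is a minimal sketch of a naive Bayes text classifier, one of the oldest techniques in the field. Everything here is invented for illustration; it bears no relation to Pachipara's actual model or its training data.

```python
# Toy naive Bayes text classifier (stdlib only). Texts and labels are
# invented; label 1 = flagged for follow-up, label 0 = not flagged.
import math
from collections import Counter

def train(samples):
    """samples: list of (text, label) pairs. Returns per-label word counts."""
    counts = {0: Counter(), 1: Counter()}
    totals = Counter()
    for text, label in samples:
        counts[label].update(text.lower().split())
        totals[label] += 1
    return counts, totals

def predict(counts, totals, text):
    """Naive Bayes with add-one smoothing; returns the more likely label."""
    vocab = set(counts[0]) | set(counts[1])
    best_label, best_score = None, float("-inf")
    for label in (0, 1):
        n = sum(counts[label].values())
        score = math.log(totals[label] / sum(totals.values()))  # prior
        for w in text.lower().split():
            score += math.log((counts[label][w] + 1) / (n + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

samples = [
    ("i feel hopeless and alone", 1),
    ("nothing matters anymore", 1),
    ("had a great day at school", 0),
    ("excited for the weekend", 0),
]
counts, totals = train(samples)
print(predict(counts, totals, "i feel so alone"))  # → 1
```

Real screening models are far more sophisticated, but the core idea is the same: word patterns in text are treated as statistical evidence for or against a label.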

Self-serve tools like Pachipara's could help fill gaps in care, said Nathan Demers, a Colorado psychologist who oversees mental health websites and apps. "When you walk into CVS, there's a blood pressure cuff," Demers said. "Maybe that's the first time someone realizes, 'Oh, I have high blood pressure. I never knew.'"

He hasn't seen Pachipara's app, but he theorizes that innovations like it could raise self-awareness of underlying mental health issues that might otherwise go unrecognized.

Building SuiSensor

Siddhu Pachipara (Chris Ayers Photography/Society for Science)

Pachipara set out to design an app that people could download to self-assess their suicide risk, use the results to advocate for the care they need, and connect with providers. After late nights of coding, he had built SuiSensor.

Using sample data from a medical study, based on diary entries by adults, Pachipara said SuiSensor could predict suicide risk with 98% accuracy. Though only a prototype, the app could also generate a contact list of local clinicians.

In the fall of his senior year of high school, Pachipara entered his research into the Regeneron Science Talent Search, an 81-year-old national science and math competition.

There, a panel of judges pressed his knowledge of psychology and general science with questions such as: "Explain how pasta is boiled. … Okay, now say you took that into space. What happens now?" "You come out of the panel beaten and bruised, but in a good way," he said.

He finished ninth overall in the competition and won $50,000 in prize money.

"His research suggests that the semantics of an individual's writing could be correlated with their psychological health and risk of suicide," the judges found. The app is not currently available for download, but Pachipara hopes to continue developing it as an undergraduate at MIT.

"I don't think we do enough of that, looking at [suicide intervention] from an innovation perspective," he said. "I think we've been stuck with the status quo for a long time."

Current AI Mental Health Applications

How does his invention fit into broader efforts to apply AI to mental health? Researchers who use machine learning on electronic health records to identify people at risk of suicide urge caution about what such models can and cannot do.

"Most of our predictions are false positives," one said. "Is there a cost to that? Does it do harm to tell someone they're at risk of suicide when really they're not?"
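The false-positive problem is largely base-rate arithmetic. The sketch below uses invented numbers, not figures from any real screening program, to show why even a highly accurate model applied to a rare outcome flags mostly people who are not at risk:

```python
# Illustrative base-rate arithmetic; all figures are invented assumptions.
population = 100_000
prevalence = 0.01         # assume 1% of the population is truly at risk
sensitivity = 0.98        # model detects 98% of true cases
specificity = 0.98        # model correctly clears 98% of non-cases

at_risk = population * prevalence
not_at_risk = population - at_risk

true_positives = at_risk * sensitivity              # 980 correct flags
false_positives = not_at_risk * (1 - specificity)   # ~1,980 false alarms

# Fraction of all flags that are genuine:
precision = true_positives / (true_positives + false_positives)
print(f"{precision:.0%} of flags are true positives")  # → 33%
```

Under these assumptions, two out of every three people flagged are not actually at risk, even though the model is "98% accurate." That is the cost the researchers quoted above are weighing.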

And data privacy expert Elizabeth Laird, who leads the Equity in Civic Technology project at the Center for Democracy & Technology (CDT), has concerns about adopting such tools, especially in schools, given the lack of research behind them.

"We are facing a mental health crisis, and we should do everything we can to prevent students from harming themselves," she said. But she remains skeptical, given the lack of independent evidence that these tools actually do that.

This focus on AI comes at a time when suicide rates (and risk) among young people are rising. Though the data lag, the Centers for Disease Control and Prevention (CDC) reports that suicide is the second leading cause of death among adolescents and young adults ages 10 to 24 in the United States.

Efforts like Pachipara's sit within a broad array of AI-powered tools that clinicians and non-professionals alike can use to track young people's mental health. Some schools use activity-monitoring software that scans devices for signs that a student is at risk of harming themselves or others. One concern, however, is that once these red flags surface, the information may be used to discipline students rather than support them, "and that that discipline falls along racial lines," Laird said.
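For illustration only, the crudest form of such activity monitoring is simple keyword matching. The word list below is invented, and commercial products are considerably more elaborate, but the sketch shows why blunt scanning produces the kinds of misfires Laird describes:

```python
# Hypothetical keyword-based flagging; the word list is invented and
# deliberately simplistic, to illustrate the approach rather than any product.
RISK_KEYWORDS = {"hopeless", "self-harm", "suicide"}

def flag(message: str) -> bool:
    """Return True if any risk keyword appears as a word in the message."""
    words = set(message.lower().split())
    return bool(words & RISK_KEYWORDS)

print(flag("I feel hopeless today"))  # → True
print(flag("See you at practice"))    # → False
```

A scanner this literal cannot tell a cry for help from a history essay that mentions suicide, which is one reason what happens after a flag matters as much as the flag itself.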

According to a survey shared by Laird, 70% of teachers whose schools use activity-monitoring software said it is used to discipline students. Schools can stay within the bounds of federal student records privacy law, Laird said, yet still fail to put safeguards in place that protect students from unintended consequences.

"Privacy discussions have moved from being just legally compliant to actually being ethical and just," she said. She pointed to survey data showing that nearly 1 in 3 LGBTQ+ students report having been outed, or knowing someone who has been outed, by activity-monitoring software.

Harvard University researcher Matthew Nock acknowledges AI's strength at crunching numbers. He uses machine learning, similar to Pachipara's, to analyze medical records. But he stresses that more experimentation is needed to vet computational assessments.

"A lot of this research is really well-intentioned, trying to use machine learning, artificial intelligence to improve people's mental health … but unless we do the research, we're not going to know if this is the right solution," he said.

More and more students and families are turning to schools for mental health support. Software that scans young people's words, and by extension their thoughts, is one approach to taking the pulse of youth mental health. But it can't replace human interaction, Nock said.


