New app uses AI to monitor early signs of mental health problems in young people

Emmanuel Akindele, founder and CEO of Blue Guardian, in Toronto on April 6. Christopher Kasaroff/The Globe and Mail

When Emmanuel Akindele was in high school, he was afraid to talk openly about his struggles with anxiety.

“I remember the first time I actually shared it with my educators. They just straight up laughed in my face,” he said. “It was pretty disappointing.”

Now an economics student at Western University, Akindele is the co-founder of Blue Guardian, a new app that uses artificial intelligence to detect early signs of mental health problems in young people. He hopes the technology he created with his fellow student Kyle LaCroix can provide the kind of help he didn’t find when he was younger.

Blue Guardian will launch in Ontario on May 1, coinciding with the start of Mental Health Week in Canada.

Akindele likens the technology to spell-checking software for mental health. Once they download the app, young people between the ages of 7 and 17 can have the AI monitor the text they type into their device. That content, whether social media posts, text messages or Google searches, is scanned by the AI for potential mental health cues.

Rather than focusing on specific words, the AI model the app uses looks at the nuances in speech patterns that distinguish those with a “healthy mind” from those struggling with mental health issues such as anxiety or depression. It’s trained to detect the differences, Akindele said.

Once the text data is collected, the app provides users with emotional insights such as ‘happy’, ‘sad’ and ‘neutral’. It may also flag potential signs of depression or anxiety if the AI detects them in the language the user types. If something is flagged, the app suggests resources, such as counselling services, based on the data it collects and the background information users provide about themselves.
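Blue Guardian has not published how its model works, but the flow described above, labelling stretches of typed text with an emotion and raising a flag when worrying signals accumulate, can be sketched roughly. Everything in the sketch below, the cue-word lists, the threshold and the function names, is a hypothetical stand-in for the app’s actual AI, which the article says relies on learned speech patterns rather than keywords.

```python
# Illustrative sketch only: Blue Guardian's real model and thresholds are not
# public. Labels mirror the insights named in the article ('happy', 'sad',
# 'neutral'); the word lists and flag threshold are made-up placeholders.
from collections import Counter

HAPPY_CUES = {"great", "excited", "love", "fun"}        # hypothetical
SAD_CUES = {"tired", "alone", "hopeless", "worthless"}  # hypothetical

def emotional_insight(text: str) -> str:
    """Map a snippet of typed text to one of the app's emotion labels."""
    counts = Counter(
        "happy" if w in HAPPY_CUES else "sad" if w in SAD_CUES else "neutral"
        for w in text.lower().split()
    )
    return counts.most_common(1)[0][0] if counts else "neutral"

def should_flag(recent_insights: list[str], threshold: float = 0.5) -> bool:
    """Flag possible depression or anxiety if 'sad' dominates recent text."""
    if not recent_insights:
        return False
    return recent_insights.count("sad") / len(recent_insights) >= threshold

# Example: a week of insights triggers a flag and a resource suggestion.
week = ["sad", "neutral", "sad", "sad", "happy", "sad"]
if should_flag(week):
    print("Potential sign detected - suggest counselling resources")
```

A production system would use a trained language model rather than keyword counts, which is presumably closer to the “speech patterns” approach Akindele describes.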

Children can then decide whether to share their emotional insights and flags with their parents by letting them scan a QR code available in the app, Akindele said.

On the app, both children and parents can view only the emotional insights and flags. All text collected by the app is encrypted and rendered inaccessible to everyone, including users and developers. After the encrypted text is processed to generate emotional insights, it is stored for about a week before being deleted, Akindele said.
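The retention scheme Akindele describes, encrypt immediately, process, then delete after about a week, could look roughly like the following sketch. It uses the real Fernet API from Python’s cryptography package, but the in-memory store, helper names and purge cadence are illustrative assumptions, not Blue Guardian’s implementation.

```python
# Sketch of the retention scheme described in the article: text is encrypted,
# retained only as ciphertext, then purged after about a week. Uses the real
# cryptography.fernet API; the storage layout is an assumption.
import time
from cryptography.fernet import Fernet

RETENTION_SECONDS = 7 * 24 * 3600  # "about a week", per the article

key = Fernet.generate_key()  # in practice, managed outside user reach
cipher = Fernet(key)

store = []  # hypothetical in-memory store of encrypted records

def ingest(text: str) -> None:
    """Encrypt raw text immediately; only ciphertext is retained."""
    store.append({"blob": cipher.encrypt(text.encode()), "ts": time.time()})

def purge_expired(now=None) -> None:
    """Delete encrypted records older than the retention window."""
    cutoff = (now or time.time()) - RETENTION_SECONDS
    store[:] = [rec for rec in store if rec["ts"] >= cutoff]

ingest("sample typed text")
purge_expired()  # would run on a schedule in a real system
```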

Carolyn McGregor, Research Chair in Artificial Intelligence for Health and Wellness at Ontario Tech University, says consent is key when it comes to technology meant to support young people’s mental health.

Under Ontario’s Health Care Consent Act, anyone capable of understanding the information relevant to a decision about their mental health treatment can legally make that decision without the consent of a parent or guardian, and can choose whether or not to involve their parents. Dr. McGregor says that is important to keep in mind if a child chooses to download the app onto their device.

Her concern is less about what the AI monitors on young people’s devices than about what it misses.

“If you’re just reading pure text, you’re missing out on all the genres of communication they use,” she said.

Many young people communicate through visuals such as memes and GIFs, Dr. McGregor said, but the technology won’t be able to understand those. Girls are more likely to communicate visually than boys, she said, and differences in sensitivity and in how emotional intelligence develops could introduce bias into the data the AI collects.

Misty Pratt, a parent of two children, ages 10 and 13, said the technology could help her monitor her children’s activities online. Her eldest now has a phone with TikTok on it. Pratt has her own account on the app, which she uses to share videos with her daughter and to monitor what her daughter posts, she said.

With her children’s consent, Pratt said she would consider downloading Blue Guardian onto their phones to better understand their mental health. She had been waiting nearly a year for one of her children to see a psychologist. If the app helps eliminate the need to seek professional help in the future, she said, she would welcome it.

“When it builds and builds and gets worse and worse, things can get really bad,” she said. “If we can give them the tools they need to … I hope it doesn’t get any worse.”


