Sydney team develops AI model to decode thoughts from brain waves

What would you do if you could control your mobile phone just by thinking about it?

And what if your phone could automatically strengthen your focus and memory? Or be used to read other people's minds?

It sounds like science fiction, but this technology, known as a brain-computer interface, is getting a boost from the emergence of artificial intelligence (AI).

Australian researchers at the University of Technology Sydney (UTS) are at the forefront of exploring how AI can be used to read our minds.

Here's how it works.

Using AI to read your mind

Illustration of a man with an electrode cap sitting in front of a computer, with a detector on the stand behind him.

The electrode cap is connected to an amplifier that reads brain waves and supplies data to the AI model. (ABC News: Sharon Gordon)

Postdoctoral researcher Daniel Leong sits in front of a computer at the GrapheneX-UTS centre, wearing what looks like a rubber swimming cap with a wire coming out of it.

The 128 electrodes in the cap detect and record the electrical impulses of Dr. Leong's brain cells.

This is known as electroencephalography, or EEG, a technique doctors use to diagnose brain conditions.

The UTS team uses it to read his thoughts.

A pioneering AI model developed by Dr. Leong, PhD student Charles (Jinzhao) Zhou and his supervisor Chin-Teng Lin uses deep learning to translate EEG brain signals into specific words.

Deep learning is a form of AI that uses artificial neural networks, modelled loosely on how the human brain works, to learn from data. In this case, the data is a large amount of EEG recordings.
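
The article does not describe the team's model architecture, so the following is only a rough sketch of the general approach: a small PyTorch-style convolutional network that maps a 128-channel EEG window to a word from a small closed vocabulary. All names, sizes and the vocabulary are illustrative assumptions, not details from the UTS work.

```python
# Minimal sketch of an EEG-to-word classifier; NOT the UTS team's actual model.
# Assumptions: 128 EEG channels, fixed-length windows, a small closed vocabulary.
import torch
import torch.nn as nn

VOCAB = ["jumping", "happy", "just", "me"]  # illustrative closed vocabulary

class EEGWordClassifier(nn.Module):
    def __init__(self, n_channels=128, n_samples=512, n_words=len(VOCAB)):
        super().__init__()
        # Temporal convolution learns frequency-like filters along each channel.
        self.temporal = nn.Conv2d(1, 16, kernel_size=(1, 25), padding=(0, 12))
        # Spatial convolution mixes information across the 128 electrodes.
        self.spatial = nn.Conv2d(16, 32, kernel_size=(n_channels, 1))
        self.pool = nn.AvgPool2d((1, 8))
        self.classify = nn.Linear(32 * (n_samples // 8), n_words)

    def forward(self, x):                 # x: (batch, channels, samples)
        x = x.unsqueeze(1)                # -> (batch, 1, channels, samples)
        x = torch.relu(self.temporal(x))
        x = torch.relu(self.spatial(x))   # -> (batch, 32, 1, samples)
        x = self.pool(x)
        x = x.flatten(1)
        return self.classify(x)           # unnormalized score per word

model = EEGWordClassifier()
scores = model(torch.randn(1, 128, 512))       # one fake EEG window
probs = torch.softmax(scores, dim=-1)          # probabilities over the vocabulary
```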

A man wearing an electrode cap, with an amplifier and computer screen, his mouth open.

Dr. Leong thinks of each word while silently mouthing it, which boosts activity in the areas of the brain involved in speech. (ABC News: Sharon Gordon)

Dr. Leong slowly and quietly reads the simple phrase “Jumping Happy Just Me” from the screen.

He also mouths the words, which helps them be detected by activating the parts of the brain involved in producing their sounds.

The AI model works almost instantly to decode the words and produce a probability ranking, based on what it has learned from the EEG recordings of 12 volunteers reading text.
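
As a hedged illustration of what a "probability ranking" over a small closed vocabulary might look like in code (the probabilities and vocabulary below are made up, not the team's output):

```python
# Illustrative only: turn per-word probabilities into a ranked list of candidates.
import torch

VOCAB = ["jumping", "happy", "just", "me"]       # illustrative closed vocabulary
probs = torch.tensor([0.18, 0.41, 0.14, 0.27])   # fake per-word probabilities

# Sort the vocabulary by probability to get a ranking of candidate words.
values, indices = torch.topk(probs, k=len(VOCAB))
ranking = [(VOCAB[int(i)], round(float(p), 2)) for p, i in zip(values, indices)]
print(ranking)   # [('happy', 0.41), ('me', 0.27), ('jumping', 0.18), ('just', 0.14)]
```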

At this stage, Professor Lin says, the AI model has been trained on a limited collection of words and sentences to make individual words easier to detect.

Behind the head of a person wearing an electrode cap, a computer screen shows a row of words.

The AI model detected individual words based on brain wave patterns. (ABC News: Sharon Gordon)

A second type of AI, a large language model, then matches the decoded words, corrects EEG decoding errors and builds them into sentences.

Large language models like ChatGPT are trained on huge text datasets to understand and generate human-like text.
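
The article does not say which language model the team uses or how it is called, so the sketch below only illustrates the idea of the second stage: `ask_llm` is a hypothetical placeholder for whatever model API is actually used, and the example output is invented.

```python
# Hedged illustration of the second stage: a language model cleans up the
# decoded words into a sentence. `ask_llm` is a hypothetical placeholder,
# not a real library call.
def build_correction_prompt(decoded_words):
    return (
        "These words were decoded from noisy EEG signals and may contain errors: "
        + " ".join(decoded_words) + "\n"
        "Reconstruct the most likely intended sentence."
    )

def decode_sentence(decoded_words, ask_llm):
    prompt = build_correction_prompt(decoded_words)
    return ask_llm(prompt)   # the LLM's best guess at the intended sentence

# Example run with a stub standing in for a real model:
print(decode_sentence(["jumping", "happy", "just", "me"],
                      ask_llm=lambda p: "I'm jumping happy, it's just me."))
```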

“I'm jumping to happiness, it's me alone” is the sentence the AI model came up with, based on nothing from Dr. Leong except his brain waves.

Side view of someone wearing an electrode cap, with a computer showing two sentences in the background.

The AI model came up with a predicted sentence based on Dr. Leong's brain waves, close to the original sentence he read. (ABC News: Sharon Gordon)

Like a lot of things AI does at the moment, it's not perfect.

The team is looking for more people to read text while wearing the EEG cap, to improve its AI models.

It is also trying to enable communication between two people using the AI model.

Brain-computer interfaces have been around for decades

Illustration of a brain with network patterns on a turquoise background.

The technology to read brain signals is steadily improving. (ABC News: Sharon Gordon)

Twenty years ago, a man with quadriplegia had a device implanted in his brain that allowed him to control a mouse cursor on a screen.

It was the first time a brain-computer interface had been used to restore function lost to paralysis.

Tech billionaire Elon Musk is working on a modern version of this implantable technology to restore autonomy to people with quadriplegia.

Non-invasive EEG brain-computer interfaces have the obvious advantages of being portable and not requiring surgery, but the signal is noisier because it is recorded from outside the brain.

Man wearing glasses and a brown jacket in a room with a computer screen behind him.

According to Chin-Teng Lin, AI models can pick out words from noisy electroencephalography signals. (ABC News: Warwick Ford)

“We can't be very accurate because we can't actually put it in the part of our brain that deciphers a word,” Professor Lin said.

“There's some confusion too, as the signals measured on the skull surface come from different sources and they mix together.”

That's where AI comes into play.

The AI amplifies and filters the brain signals to reduce noise and pick out the markers of speech.
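
The article does not describe the exact preprocessing, but a common first step with EEG (shown here as an assumed example, not the team's pipeline) is to band-pass filter each channel before any decoding:

```python
# Hedged example of EEG denoising, not the UTS pipeline: band-pass filter each
# channel to keep the frequency range typically analysed in EEG work.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 256                      # assumed sampling rate in Hz
LOW, HIGH = 0.5, 40.0         # assumed pass band in Hz

def bandpass(eeg, fs=FS, low=LOW, high=HIGH, order=4):
    """eeg: array of shape (n_channels, n_samples)."""
    b, a = butter(order, [low, high], btype="bandpass", fs=fs)
    return filtfilt(b, a, eeg, axis=-1)   # zero-phase filtering per channel

fake_eeg = np.random.randn(128, FS * 2)   # 2 seconds of fake 128-channel data
clean = bandpass(fake_eeg)
print(clean.shape)                        # (128, 512)
```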

Mohit Shivdasani is a bioelectronics expert at the University of New South Wales (UNSW).

Researchers have been searching for patterns in biological signals “forever,” he says, but AI can now recognize brain wave patterns that could not previously be identified.

He said AI can quickly personalize to an individual's brain wave patterns for completing a task, especially when used on implanted devices.

Man wearing a white lab coat in a laboratory, with a computer in the background.

Mohit Shivdasani says that AI has great potential to detect unknown brain wave patterns involved in cognitive function. (ABC News: Andrew Whiten)

“What AI can do is learn very quickly what patterns it is dealing with, and the patterns seen in one person may be completely different from those seen in another,” he said.

Professor Lin said his team is doing exactly that, using “neurofeedback” to improve its AI models.

“We call this technology a kind of AI-human collaborative learning, to help the AI learn better,” he said.

The team has achieved approximately 75% accuracy in converting thoughts to text, and Professor Lin said he is aiming for 90%, similar to what implanted devices achieve.

Medicine and the possibilities beyond

Man wearing a white lab coat looking at laboratory equipment while pipetting a solution into a beaker.

Dr. Shivdasani says AI mind-reading technology could be used for stroke rehabilitation and speech therapy for people with autism. (ABC News: Andrew Whiten)

Dr. Shivdasani said non-invasive EEG combined with mind-reading AI could potentially be used in hospitals to help rehabilitate stroke patients.

“One of the great things about the brain is its ability to heal itself, so you could see a scenario where a brain-machine interface is used during the rehabilitation stage, allowing the brain to keep working at and trying certain tasks,” he said.

If the brain cells regenerate, patients may no longer need the technology, he said.

Supporting speech therapy for people with autism is another potential use.

Such rehabilitation uses rely on a “closed loop” brain-computer interface, where real-time feedback is generated from the user's brain activity.
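
As a rough sketch of what “closed loop” means in software terms (an assumed structure for illustration, not any specific clinical system): the device repeatedly reads brain activity, decodes an intention, and immediately feeds the result back to the user so the brain can keep adapting.

```python
# Rough, purely illustrative sketch of a closed-loop brain-computer interface.
# read_eeg, decode_intent and give_feedback are hypothetical placeholders.
import time

def closed_loop(read_eeg, decode_intent, give_feedback, seconds=10, hz=4):
    """Repeatedly read brain activity, decode it, and feed the result back."""
    for _ in range(int(seconds * hz)):
        window = read_eeg()               # latest EEG window from the cap
        intent = decode_intent(window)    # e.g. an attempted movement or word
        give_feedback(intent)             # shown to the user in real time
        time.sleep(1.0 / hz)              # wait for the next cycle

# Example run with stubs standing in for real hardware and models:
closed_loop(read_eeg=lambda: None,
            decode_intent=lambda w: "attempted hand movement",
            give_feedback=print,
            seconds=1)
```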

Edging into the realm of science fiction is the possibility that this technology could strengthen our attention, memory, focus, and even emotional regulation.

“As scientists, we look at the medical condition and look at the functions affected by that condition. What is the need for a patient? We then tackle the unmet need through technology to restore that function,” Dr. Shivdasani said.

“The sky is the limit after that.”

Three men, one wearing an electrode cap, stand in front of a computer screen in a room.

The UTS team is working to perfect its thought-reading AI model. (ABC News: Warwick Ford)

Before you can operate your phone with your mind or communicate directly from brain to brain, the technology needs to become more “wearable.”

No one wants to walk around wearing a cap with wires coming out of it.

Professor Lin said the technology could be integrated into devices already on the market, such as augmented reality glasses.

Big Tech is already working on earphones with electrodes to measure brain signals.

Then there are “brain privacy” and other ethical considerations, Dr. Shivdasani said.

“We have the tools. What do we use them for? And how ethically do we use them?”


