AI systems now track intentions formed in the brain

A major breakthrough came from a clinical trial in which paralyzed volunteers used implanted devices to control keyboards, robotic limbs, and synthetic voices. One participant, Nancy Smith, regained the ability to play simple melodies with an implant that analyzes brain activity.

When she imagined pressing a piano key, the system sensed her intention before she consciously acted, making it feel as if the music played itself. Researchers say this happened because the implant captured preconscious planning signals in the posterior parietal cortex, a brain area involved in movement planning and attention.

Teams working in this field report that these signals are mixed: they carry information not only about movement but also about decision-making, inner speech, and moment-to-moment intentions. Several groups have already shown that they can decode snippets of inner speech and track how volunteers weigh card choices during a game of blackjack.

Implants and AI are converging into a new frontier. Developers at companies such as Synchron and academic teams at the California Institute of Technology have demonstrated that AI models trained on extensive neural recordings can identify subtle patterns previously dismissed as noise.

In an as-yet-unpublished study, Synchron researchers found that their system can detect a user's errors before the user notices them, showing how a BCI could intervene in real time. This creates a practical dilemma: a device that automatically corrects mistakes improves performance, but in doing so it acts on the user's behalf without their explicit knowledge or consent.

At the same time, consumer neurotechnology is advancing rapidly. Electroencephalography (EEG)-based headsets use AI to improve signal quality and provide feedback on focus, stress, and alertness. Although those recordings are much less accurate than those from implanted devices, they can still reveal how people respond to certain stimuli.
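The focus and alertness feedback such headsets offer is typically derived from the relative power of EEG frequency bands. As a purely illustrative sketch (not any vendor's actual algorithm), the fraction of signal power in the beta band (13-30 Hz), which is often read as a rough proxy for alertness, can be computed from a single EEG channel like this:

```python
# Illustrative sketch, not a product's algorithm: a simple "alertness"
# proxy from one EEG channel via relative band power. The band limits
# and the beta-ratio metric are conventional choices, assumed here.
import numpy as np

def band_power(signal, fs, lo, hi):
    """Power in the [lo, hi] Hz band, from a plain periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].sum()

def relative_beta(signal, fs=256):
    """Beta-band (13-30 Hz) power as a fraction of broadband (1-40 Hz) power."""
    total = band_power(signal, fs, 1, 40)
    return band_power(signal, fs, 13, 30) / total if total else 0.0

# Synthetic one-second trace: a strong 20 Hz (beta) component plus a
# weaker 10 Hz (alpha) component, in place of a real recording.
t = np.arange(256) / 256.0
eeg = np.sin(2 * np.pi * 20 * t) + 0.3 * np.sin(2 * np.pi * 10 * t)
print(round(relative_beta(eeg), 2))
```

Real headsets add artifact rejection and per-user calibration on top of this kind of band-power feature, which is part of what the article's AI-driven signal-quality improvements address.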

As clinical BCIs move closer to approval and consumer systems expand, experts say the central challenges are shifting. Earlier discussions focused on keeping brain data private. The focus now is on how these systems shape users' behavior, with AI deciphering signals that reflect preconscious intentions. Some fear that AI-assisted BCIs, especially those that propose or draft communications, will begin to influence not only how users express themselves but, ultimately, how they think.

Earlier, Qazinform News Agency reported that Baidu had announced its next-generation AI chip.
