Can AI learn like we do?



Schematic diagram of a machine learning model


Diagram comparing a common machine learning model (A) with Daruwalla's new design (B). In row A, input data must pass through every layer of the neural network before the AI model receives feedback, which takes more time and energy. In row B, the new design generates and incorporates feedback at each layer of the network.


Credit: Kyle Daruwalla/Cold Spring Harbor Laboratory

It reads, speaks, collates vast amounts of data, and suggests business decisions. Today's artificial intelligence may seem more human than ever before. But AI still has some significant shortcomings.

“ChatGPT and today's AI technologies are all great, but they are still very limited when it comes to interacting with the physical world. Even tasks like solving math problems or writing an essay require billions of training examples before a model gets good at them,” explains Kyle Daruwalla, a NeuroAI researcher at Cold Spring Harbor Laboratory (CSHL).

Daruwalla has been searching for new, unconventional ways to design AI that can overcome these computational obstacles—and he may have found one.

The key is how data moves. Currently, most of a modern computer's energy consumption comes from moving data around, which can take a very long time in an artificial neural network made up of billions of connections. So, in search of a solution, Daruwalla turned to one of the most computationally powerful and energy-efficient machines in existence: the human brain.

Daruwalla designed a new way for AI algorithms to move and process data more efficiently, based on how the human brain takes in new information. The design allows individual AI “neurons” to receive feedback and adjust on the fly, rather than waiting for the entire circuit to update at once. This way, data doesn't have to travel as far and is processed in real time.
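The general idea can be sketched in code. The toy below is an illustration of layer-local learning in general, not Daruwalla's actual design: each layer gets its own small readout and adjusts its weights immediately from a local error signal, instead of waiting for feedback to travel back through the whole network. The network sizes, the XOR task, and the local losses are all assumptions made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_layerwise(x, y, sizes, lr=0.5, steps=3000):
    """Greedy layer-local training: every layer updates on the spot from
    its own readout's error, with no global backward pass."""
    Ws = [rng.normal(0.0, 0.5, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
    Rs = [rng.normal(0.0, 0.5, (n, y.shape[1])) for n in sizes[1:]]  # local readouts
    for _ in range(steps):
        h = x
        for i in range(len(Ws)):
            z = np.tanh(h @ Ws[i])               # this layer's forward step
            err = z @ Rs[i] - y                  # local prediction error
            dz = (err @ Rs[i].T) * (1 - z**2)    # gradient stays inside the layer
            Ws[i] -= lr * h.T @ dz / len(x)      # feedback applied right here,
            Rs[i] -= lr * z.T @ err / len(x)     # not after a full round trip
            h = z
    return Ws, Rs

def predict(x, Ws, Rs):
    h = x
    for W in Ws:
        h = np.tanh(h @ W)
    return h @ Rs[-1]

# Toy task: XOR, which no single linear layer can solve.
x = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
Ws, Rs = train_layerwise(x, y, sizes=[2, 16, 16])
mse = float(np.mean((predict(x, Ws, Rs) - y) ** 2))
print(mse)
```

Because each weight update only needs activity and error signals that are locally available, no layer sits idle waiting for the rest of the circuit—the property the article attributes to the brain-inspired design.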

“In our brains, connections are constantly shifting and adjusting,” Daruwalla says. “It's not like we can just pause everything, adjust it, and come back to being ourselves.”

The new machine learning model provides evidence for an as-yet-unproven theory that links working memory with learning and academic performance. Working memory is the cognitive system that lets you stay focused on a task while recalling stored knowledge and experience.

“In neuroscience, there have been theories about how working memory circuits drive learning, but nothing as concrete as our rule that actually ties the two together. And that was one of the nice things we stumbled into here. The theory led to the rule that in order to regulate each synapse individually, you need working memory sitting right next to it,” Daruwalla says.
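One way to picture "working memory next to each synapse" is a small trace stored alongside every weight. The sketch below is only illustrative—it is not Daruwalla's published rule—and the decay constant, learning rate, and linear task are assumptions chosen for the toy.

```python
import numpy as np

rng = np.random.default_rng(1)

W_true = rng.normal(0.0, 0.5, (4, 3))   # hidden target mapping to learn
W = np.zeros((4, 3))                    # synaptic weights being trained
M = np.zeros_like(W)                    # one "working memory" trace per synapse
decay, lr = 0.9, 0.05

start_gap = float(np.linalg.norm(W - W_true))

for _ in range(500):
    pre = rng.random(4)                 # presynaptic activity
    err = pre @ W - pre @ W_true        # locally available error for this input
    # Each synapse remembers recent pre/error coactivity in its own trace...
    M = decay * M + (1 - decay) * np.outer(pre, err)
    # ...and its update is driven by that nearby memory, not a global signal.
    W -= lr * M

end_gap = float(np.linalg.norm(W - W_true))
print(end_gap < start_gap)
```

The point of the toy is locality: every quantity a synapse needs to regulate itself—its weight, its memory trace, the recent activity—lives beside it, echoing the claim that per-synapse regulation requires working memory close at hand.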

Daruwalla's design could usher in a new generation of AI that learns the way humans do. Not only would that make AI more efficient and accessible, it would mark something of a full-circle moment for NeuroAI. Long before ChatGPT uttered its first digital syllables, neuroscience had been providing AI with valuable data. Soon, it seems, AI may return the favor.





