It reads, speaks, collates vast amounts of data, and suggests business decisions. Today's artificial intelligence may seem more human than ever before. But AI still has some significant shortcomings.
“ChatGPT and current AI technologies are all great, but they are still very limited when it comes to interacting with the physical world. Even tasks like solving math problems or writing an essay require billions of training examples before they can get good at them,” explains Kyle Daruwalla, a NeuroAI researcher at Cold Spring Harbor Laboratory (CSHL).
Daruwalla has been searching for new, unconventional ways to design AI that can overcome these computational obstacles, and he may have found one.
The key is data movement. Moving data accounts for most of the energy consumed in modern computing, and in artificial neural networks with billions of connections, it can take a very long time.
So, to find a solution, Daruwalla turned for inspiration to one of the most computationally powerful and energy-efficient machines in existence: the human brain.
Daruwalla designed a new way for AI algorithms to move and process data more efficiently, based on the way the human brain takes in new information. The design allows individual AI “neurons” to receive feedback and adjust on the fly, rather than waiting for the entire circuit to update at the same time. This way, data doesn't have to travel as far and is processed in real time.
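As a rough illustration, here is a minimal Python sketch of that contrast: each layer adjusts its own weights immediately from signals available locally, instead of waiting for an error to propagate back through the whole network. The broadcast feedback signal, the variable names, and the exact update form are assumptions made for illustration; they are not taken from Daruwalla's paper.

    import numpy as np

    rng = np.random.default_rng(0)

    def local_layer_update(w, x, feedback, lr=0.01):
        """Adjust one layer's weights using only locally available signals:
        its own input, its own output, and a broadcast feedback value.
        No error signal has to travel back through the rest of the network."""
        y = np.tanh(w @ x)                   # this layer's activity
        w += lr * feedback * np.outer(y, x)  # Hebbian-style, feedback-modulated update
        return y

    # Data flows forward once, and each layer adjusts itself on the fly.
    x = rng.normal(size=8)               # an input sample
    w1 = 0.1 * rng.normal(size=(16, 8))  # hidden-layer weights
    w2 = 0.1 * rng.normal(size=(4, 16))  # output-layer weights
    feedback = 1.0                       # hypothetical reward-like signal broadcast to every layer

    h = local_layer_update(w1, x, feedback)
    y = local_layer_update(w2, h, feedback)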
“The connections in our brains are constantly shifting and adjusting,” Daruwalla says. “It's not like we can just pause everything, adjust it, and go back to being our old selves again.”
The findings have been published in the journal Frontiers in Computational Neuroscience.
A new machine learning model provides evidence for an as yet unproven theory that correlates working memory with learning and academic achievement. Working memory is a cognitive system that enables you to focus on a task while recalling stored knowledge and experiences.
“In neuroscience, there are theories about how working memory circuits could facilitate learning, but nothing as concrete as our rule, which actually ties the two together. That was one of the nice things we stumbled into here. The theory led to a rule in which adjusting each synapse individually required this working memory sitting alongside it,” Daruwalla says.
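To make the idea concrete, here is a minimal Python sketch of a Hebbian weight update gated by a working-memory-like trace that sits alongside each synapse. The decaying trace, the gating scheme, and all names here are illustrative assumptions, not the information-bottleneck rule actually derived in the paper.

    import numpy as np

    def wm_gated_hebbian_step(w, memory, x, lr=0.01, decay=0.9):
        """One learning step in which a per-synapse memory trace
        regulates a Hebbian weight update."""
        y = np.tanh(w @ x)
        # Working memory: a decaying running summary of recent pre/post activity.
        memory = decay * memory + (1 - decay) * np.outer(y, x)
        # Each synapse's update is scaled by its own memory trace.
        w = w + lr * memory * np.outer(y, x)
        return w, memory, y

    rng = np.random.default_rng(1)
    w = 0.1 * rng.normal(size=(4, 8))  # synaptic weights
    memory = np.zeros_like(w)          # working-memory trace, one value per synapse
    for _ in range(100):               # a stream of inputs, processed one at a time
        x = rng.normal(size=8)
        w, memory, _ = wm_gated_hebbian_step(w, memory, x)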
Daruwalla's design could usher in a new generation of AI that learns like humans do. Not only would it make AI more efficient and accessible, it would mark something of a full-circle moment for NeuroAI. Long before ChatGPT uttered its first digital syllables, neuroscience had been supplying AI with valuable data. Soon, AI may return the favor.
More information:
Kyle Daruwalla et al., “Information bottleneck-based Hebbian learning rule naturally ties working memory and synaptic updates,” Frontiers in Computational Neuroscience (2024). DOI: 10.3389/fncom.2024.1240348
Courtesy of Cold Spring Harbor Laboratory
Citation: Researchers develop new, more energy-efficient method for AI algorithms to process data (June 20, 2024). Retrieved June 23, 2024 from https://techxplore.com/news/2024-06-energy-efficient-ai-algorithms.html