Brain-inspired algorithms could significantly reduce AI’s energy usage

One of the big challenges facing artificial intelligence is the interplay between a computer's memory and its processing power. When an algorithm runs, data flows rapidly between these two components, and because AI models rely on vast amounts of data, that traffic creates bottlenecks.

New research published Monday in the journal Frontiers in Science by teams at Purdue University and Georgia Tech suggests a new approach: building computer architectures for AI models around brain-inspired algorithms. Designing algorithms this way, the researchers say, could reduce the energy costs associated with AI models.

“The size of language processing models has increased 5,000 times over the past four years,” Kaushik Roy, a professor of computer engineering at Purdue University and lead author of the study, said in a statement. “This incredibly rapid expansion makes it critical to make AI as efficient as possible. This means fundamentally rethinking the way computers are designed.”


Most modern computers are modeled on a 1945 design called the von Neumann architecture, which separates processing and memory. That separation is where the slowdown occurs. As more people around the world rely on data-intensive AI models, the mismatch between how fast processors can compute and how fast memory can feed them data is likely to matter even more.

IBM researchers highlighted the issue in a post earlier this year. Computer engineers call the problem the "memory wall."

Breaking down the memory wall

The memory wall refers to the growing disparity between processing power and memory performance: put simply, a computer's memory struggles to keep up with its processor. This is not a new problem. Two researchers at the University of Virginia coined the term in the 1990s.
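To see why this gap costs energy and not just time, consider a back-of-envelope sketch in Python. The picojoule figures below are illustrative assumptions in the rough range reported in the hardware literature, not numbers from the new paper:

```python
# Back-of-envelope sketch of why the memory wall wastes energy.
# The per-operation energy figures are illustrative assumptions in the
# rough range reported in the hardware literature, not from the paper.

PJ_PER_FP_ADD = 1.0       # ~1 pJ for one 32-bit floating-point add
PJ_PER_DRAM_READ = 640.0  # ~640 pJ to fetch one 32-bit value from DRAM

n_ops = 1_000_000_000  # a billion arithmetic steps, one operand fetch each

compute_mj = n_ops * PJ_PER_FP_ADD / 1e9   # picojoules -> millijoules
memory_mj = n_ops * PJ_PER_DRAM_READ / 1e9

print(f"arithmetic: {compute_mj:.0f} mJ, data movement: {memory_mj:.0f} mJ")
# Moving the data costs hundreds of times more energy than the math,
# which is why architectures that avoid the round trip are appealing.
```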


But as AI becomes more pervasive, the memory wall is robbing the computers that run AI models of time and energy. The researchers behind the new paper argue for trying computer architectures that integrate memory and processing instead.

The AI algorithm discussed in the paper, known as a spiking neural network, is inspired by how our brains function. A common criticism of these algorithms has been that they can be slow and inaccurate, but some computer scientists argue that they have improved significantly over the past few years.
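For a sense of what "spiking" means, here is a minimal sketch of a leaky integrate-and-fire neuron, the textbook building block of spiking networks. The constants and input values are generic illustrations, not the specific model from the paper:

```python
# Minimal leaky integrate-and-fire (LIF) neuron, the textbook building
# block of spiking neural networks. Constants are illustrative only.

LEAK = 0.9        # fraction of membrane potential retained each step
THRESHOLD = 1.0   # potential at which the neuron fires a spike

def lif_step(potential: float, input_current: float) -> tuple[float, bool]:
    """Advance the neuron one time step; return (new potential, spiked?)."""
    potential = LEAK * potential + input_current
    if potential >= THRESHOLD:
        return 0.0, True   # fire a spike, then reset
    return potential, False

# Unlike a conventional artificial neuron, which outputs a number at
# every step, a spiking neuron stays silent most of the time and emits
# a binary event only when accumulated input crosses the threshold;
# that sparsity is what makes SNNs attractive for low-energy hardware.
potential = 0.0
for t, current in enumerate([0.3, 0.3, 0.3, 0.3, 0.0, 0.6]):
    potential, spiked = lif_step(potential, current)
    print(f"t={t}: potential={potential:.2f}, spike={spiked}")
```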

The researchers suggest that AI models take advantage of a concept related to SNNs known as compute-in-memory, or CIM. The concept is still relatively new in the field of AI.

“CIM offers a promising solution to the memory wall problem by integrating computing functionality directly into the memory system,” the authors write in the paper's abstract.
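One common way to realize CIM is a resistive crossbar: a neural network's weights are stored as conductances inside the memory array itself, and a matrix-vector multiply happens in place as currents sum along the array's columns. Here is a plain-Python sketch of that idea, with made-up values; real CIM hardware does this in analog circuitry:

```python
# Sketch of the compute-in-memory idea behind a resistive crossbar:
# the weight matrix lives inside the memory array as conductances, and
# a matrix-vector product is computed in place. Input voltages drive
# the rows and column currents sum to the result (Ohm's and
# Kirchhoff's laws), so the weights never travel to a separate
# processor. Values are made up for illustration.

weights = [  # conductances stored in the memory array (the "weights")
    [0.2, 0.8, 0.1],
    [0.5, 0.3, 0.9],
]

def crossbar_matvec(g: list[list[float]], v: list[float]) -> list[float]:
    """Current on column j is sum_i v[i] * g[i][j]: the computation
    happens where the data is stored, with no weight movement."""
    n_cols = len(g[0])
    return [sum(v[i] * g[i][j] for i in range(len(g))) for j in range(n_cols)]

inputs = [1.0, 0.5]  # voltages applied to the rows
print(crossbar_matvec(weights, inputs))  # [0.45, 0.95, 0.55]
```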

Medical devices, transportation and drones are among the areas researchers think could benefit if computer processing and memory were integrated into a single system.

“AI is one of the most transformative technologies of the 21st century. But in order to move AI from data centers to the real world, its energy use must be dramatically reduced,” co-author Tanvi Sharma, a researcher at Purdue University, said in a statement.

"With less data transfer and more efficient processing, AI can be housed in smaller, more affordable devices with longer battery life," Sharma said.




