Accelerate learning with brain-inspired AI design

Machine Learning


Mick Bonner, assistant professor of cognitive science at Johns Hopkins University, and his team have demonstrated that the biologically inspired architectural design of an artificial intelligence system can simulate human brain activity, even before training. Published in Nature Machine Intelligence, this research challenges traditional AI development that prioritizes extensive deep learning and large-scale computational resources (often costing billions of dollars and thousands of megawatts of energy). The scientists modified three common network designs (transformers, fully connected networks, and convolutional networks) and found that fine-tuned convolutional neural networks produced activity patterns comparable to those of conventionally trained AI systems. This suggests that architectural design is a key factor in accelerating learning.

Biology-inspired AI architecture

Research from Johns Hopkins University shows that biologically inspired architectures can bring benefits to AI systems even before training begins. The scientists found that changing the AI architecture, particularly across transformers, fully connected networks, and convolutional networks, affects how closely a network's responses to images reflect brain activity in humans and primates. This challenges traditional approaches that rely on large datasets and computing power, and suggests that architectural design is a critical, but often overlooked, component of AI development.

When the researchers increased the number of artificial neurons in the fully connected networks and transformers, they observed little change in their responses. However, by fine-tuning the architecture of the convolutional neural network, they obtained activity patterns that closely simulated those of the human brain. These untrained convolutional networks performed on par with conventionally trained AI, which typically requires exposure to millions or billions of images, demonstrating the importance of architectural design over vast amounts of data.
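Comparisons of this kind between network activity and brain activity are commonly made with representational similarity analysis: build a dissimilarity matrix over stimuli for each system, then correlate the matrices. The article does not describe the study's actual analysis pipeline, so the sketch below is purely illustrative; all data, shapes, and function names are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

def rdm(responses):
    """Representational dissimilarity matrix: 1 - Pearson correlation
    between the response vectors evoked by each pair of stimuli
    (rows = stimuli, columns = units or voxels)."""
    return 1.0 - np.corrcoef(responses)

def rsa_score(model_responses, brain_responses):
    """Correlate the upper triangles of the two RDMs, a standard
    way to quantify how brain-like a model's activity is."""
    m, b = rdm(model_responses), rdm(brain_responses)
    iu = np.triu_indices_from(m, k=1)
    return np.corrcoef(m[iu], b[iu])[0, 1]

# Toy data: 20 "images", each evoking a response vector in an
# untrained random-weight model (50 units) and in a hypothetical
# brain recording (100 voxels).
images = rng.standard_normal((20, 64))
untrained_weights = rng.standard_normal((64, 50)) / np.sqrt(64)
model_responses = np.maximum(images @ untrained_weights, 0.0)  # ReLU
brain_responses = rng.standard_normal((20, 100))

print(rsa_score(model_responses, brain_responses))
```

A higher score means the untrained model groups and separates images more like the brain does; this is one way "activity patterns comparable to trained systems" can be measured.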

The findings suggest that starting with the “right blueprint”—one based on biological principles—could dramatically accelerate the learning of AI systems. The research team is currently focusing on developing biologically modeled learning algorithms with the aim of creating a new deep learning framework. This research supports the idea that evolution may have optimized the design of the brain to increase efficiency, and that AI could benefit from incorporating these principles.

AI network design comparison

Researchers at Johns Hopkins University compared three common AI network designs (transformers, fully connected networks, and convolutional networks) to understand how architecture affects performance. They modified these blueprints to create a number of artificial neural networks, tested their responses to images, and compared their activity to brain patterns in humans and primates. The findings show that increasing the number of artificial neurons in transformers and fully connected networks yields minimal changes, whereas modifying convolutional networks did change activity patterns.
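One reason changing a convolutional blueprint matters more than simply adding neurons is the inductive bias convolution builds in, such as translation equivariance, which holds even with random, untrained weights. The NumPy sketch below demonstrates that property; it is an illustration of the general principle, not the study's code.

```python
import numpy as np

rng = np.random.default_rng(1)

def conv2d_valid(image, kernel):
    """Naive 2-D 'valid' cross-correlation with a single filter."""
    kh, kw = kernel.shape
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = rng.standard_normal((12, 12))
kernel = rng.standard_normal((3, 3))   # untrained (random) weights

# Translation equivariance: shifting the input shifts the output.
shifted = np.roll(image, 2, axis=1)
a = conv2d_valid(image, kernel)
b = conv2d_valid(shifted, kernel)
print(np.allclose(a[:, :-2], b[:, 2:]))
```

A fully connected layer has no such guarantee: each output unit sees the whole image with unrelated weights, so a shifted input produces an arbitrarily different response. This structural difference, not the raw neuron count, is what the architectural comparison probes.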

Specifically, by fine-tuning the convolutional network architecture, the researchers were able to generate activity patterns comparable to those of conventionally trained AI systems, which typically require exposure to millions or billions of images. This suggests that architectural design, rather than just extensive training data, plays a key role in achieving brain-like AI capabilities, and it challenges current approaches of massive data input and resource allocation for AI development.

The findings of this study suggest that a well-designed architectural blueprint can provide a favorable starting point for AI learning. The researchers believe that incorporating insights from biology in parallel with an optimized architecture could dramatically accelerate the learning process. The team is currently focused on developing learning algorithms modeled on biological systems to inform new deep learning frameworks, based on their understanding of the effects of network design.

There may be good reasons why evolution converged on this design. Our research suggests that a more brain-like architectural design puts AI systems at a very advantageous starting point.

Mick Bonner

Accelerate AI learning through design

Johns Hopkins University researchers have discovered that biologically inspired architectural design could provide a powerful springboard for AI systems even before training, with the potential to accelerate learning. By focusing on network blueprints, the team challenged traditional approaches that rely on large datasets and extensive computing power (billions of dollars in cost and thousands of megawatts in energy use). Their work, published in Nature Machine Intelligence, suggests that the way AI is designed is critical, reflecting how humans learn with limited data.

Scientists modified three common AI network designs (transformers, fully connected networks, and convolutional networks) to build dozens of unique artificial neural networks. Increasing the number of artificial neurons in a transformer or fully connected network made little difference, but by fine-tuning the convolutional network, the researchers were able to generate brain-like activity patterns from untrained AI systems. These networks responded similarly to brain activity in humans and primates, even without exposure to millions of images.

This finding suggests that starting with the right architectural blueprint can dramatically accelerate AI learning. According to lead author Mick Bonner, large-scale data training is not the only key element: architectural design alone can bring an AI system much closer to brain-like activity. The team is currently building on these initial findings about the importance of design to develop biologically inspired learning algorithms that may inform new deep learning frameworks.


