A research team from Tohoku University and Future University Hakodate has demonstrated that living biological neurons can be trained to perform supervised temporal pattern learning tasks that were previously performed by artificial systems. By integrating cultured neuronal networks into a machine learning framework, the research team showed that these biological systems can generate complex time-series signals, a major advance in both neuroscience and bio-inspired computing.
The study was published online in the Proceedings of the National Academy of Sciences (PNAS) in March 2026 and highlights a new intersection between living neural systems and computational technologies. The finding suggests that biological neural networks (BNNs) may serve as a viable alternative or complement to existing machine learning models.
Artificial neural networks (ANNs) and spiking neural networks (SNNs) have long been used in machine learning and neuromorphic hardware. A framework known as reservoir computing has emerged as an efficient approach for processing time-dependent data by exploiting the dynamic properties of recurrently connected ANNs and SNNs.
In traditional ANN-based reservoir computing, methods such as first-order reduced and controlled error (FORCE) learning enable real-time adaptation by continuously adjusting the readout weights in response to the output error. These techniques allow artificial systems to generate a wide range of temporal patterns, including periodic and chaotic signals. However, whether a similar approach can be applied to biological neural networks had remained an open question.
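To make the idea concrete, here is a minimal sketch of FORCE learning on a small rate-based artificial reservoir, using the standard recursive-least-squares weight update. All parameters here (reservoir size, gain, time constants, the sine target) are illustrative assumptions, not the configuration used in the study, which applied FORCE to the readout of a biological network rather than a simulated one.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical rate-based reservoir (illustrative, not the paper's biological network)
N = 300                       # number of reservoir units
dt, tau = 0.001, 0.01         # integration step and unit time constant (s)
g = 1.5                       # recurrent gain; g > 1 gives rich, chaotic dynamics
J = g * rng.standard_normal((N, N)) / np.sqrt(N)  # fixed recurrent weights
w_fb = rng.uniform(-1.0, 1.0, N)                  # fixed output-feedback weights
w_out = np.zeros(N)                               # readout weights, trained online
P = np.eye(N)                 # running estimate of the inverse correlation matrix (RLS)

x = 0.5 * rng.standard_normal(N)  # reservoir state
r = np.tanh(x)                    # firing rates
z = 0.0                           # readout output

T = 5000
t_grid = np.arange(T) * dt
target = np.sin(2 * np.pi * t_grid)  # 1 s period sine wave as an example target

errs = []
for t in range(T):
    # Reservoir dynamics driven by recurrence and output feedback
    x += dt / tau * (-x + J @ r + w_fb * z)
    r = np.tanh(x)
    z = w_out @ r

    # FORCE step: recursive least squares keeps the output error small
    # from the very first updates, rather than letting it grow.
    err = z - target[t]
    Pr = P @ r
    k = Pr / (1.0 + r @ Pr)
    P -= np.outer(k, Pr)
    w_out -= err * k
    errs.append(abs(err))
```

The key design choice FORCE makes is to update the readout at every time step so the error stays small throughout training; in the study, the same principle was applied to a readout trained on signals recorded from cultured neurons.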
To address this gap, the researchers used cultured rat cortical neurons to build a biological neural network and incorporated it into a reservoir computing framework. By applying FORCE learning to optimize the system’s readout layer, the researchers were able to train a biological network to generate complex temporal signals comparable to those involved in motor control.
A key innovation in this study was the use of microfluidic devices to precisely guide neuron growth and control network connectivity. This approach allowed the researchers to create a modular network architecture that minimizes excessive synchronization, thereby facilitating the rich, high-dimensional dynamics required for effective reservoir computing.
Using this system, the BNN-based framework was able to generate a variety of time-series patterns, including sinusoids, triangle waves, square waves, and even chaotic trajectories such as the Lorenz attractor. In particular, the network demonstrated its flexibility by learning and stably reproducing sine waves with periods ranging from 4 to 30 seconds within the same system.
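The Lorenz attractor mentioned above is a standard chaotic benchmark target; a sketch of how such a target trajectory can be generated (using the conventional parameter values sigma = 10, rho = 28, beta = 8/3, with simple Euler integration) is shown below. The step size and initial condition are illustrative choices, not values taken from the study.

```python
import numpy as np

def lorenz(T=10000, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Euler integration of the Lorenz system with standard chaotic parameters."""
    xyz = np.empty((T, 3))
    xyz[0] = (1.0, 1.0, 1.0)   # arbitrary initial condition near the attractor
    for t in range(T - 1):
        x, y, z = xyz[t]
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        xyz[t + 1] = xyz[t] + dt * np.array([dx, dy, dz])
    return xyz

traj = lorenz()  # a (10000, 3) trajectory usable as a training target
```

Periodic targets such as the 4-to-30-second sine waves reported in the study are simpler still: a single `np.sin` call over a time grid at the chosen period.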
“This study shows that networks of living neurons are not only biologically meaningful systems, but also have the potential to serve as new computational resources,” said Professor Hideaki Yamamoto of Tohoku University. “By bridging neuroscience and machine learning, we are paving the way for new forms of computing that exploit the unique dynamics of biological systems.”
Looking ahead, the research team aims to improve the stability of signal generation after training. Future work will focus on reducing the feedback delay and improving the FORCE learning algorithm. In parallel, this platform could be extended to microphysiological systems for studying drug responses and modeling neurological disorders, further expanding its impact in both scientific and medical fields.
