Researchers enhance spiking neural networks with spike agreement-dependent plasticity, surpassing STDP accuracy

Machine Learning


Spiking neural networks offer a more biologically realistic and energy-efficient route to artificial intelligence, but developing effective learning rules for them remains an open problem. Saptarshi Beja and colleagues from the University of Manchester, together with collaborators, present Spike Agreement-Dependent Plasticity (SADP), a new learning rule that focuses on the overall agreement between the firing patterns of connected neurons rather than the exact timing of individual spikes. The approach, built on correlation measures, not only improves pattern-recognition accuracy on image datasets such as MNIST and Fashion-MNIST, but also delivers a significant speed advantage over traditional learning rules. By bridging the gap between biological realism and computational efficiency, SADP represents a promising step towards scalable, practical spiking neural networks for advanced machine-learning applications.

Instead of relying on the exact timing between individual pre- and post-synaptic spikes, SADP considers the agreement across entire spike trains, yielding a more robust and scalable approach to learning. The method uses statistical agreement metrics to determine synaptic weight updates, capturing the collective behavior of neuronal groups even in noisy conditions. Because the SADP update rule runs in linear time, it greatly improves computational efficiency and lends itself to practical hardware implementation.
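As a rough illustration, an agreement-driven weight update can be sketched in a few lines of Python. This is a minimal toy, not the paper's exact rule: the match-fraction metric, learning rate, and chance baseline below are all assumptions standing in for the statistical agreement metrics the article describes.

```python
def spike_agreement(pre, post):
    """Fraction of time bins where pre- and post-synaptic activity match.

    `pre` and `post` are equal-length binary spike trains (1 = spike).
    A simple match fraction stands in here for the paper's agreement
    metric (hypothetical simplification).
    """
    assert len(pre) == len(post)
    matches = sum(1 for a, b in zip(pre, post) if a == b)
    return matches / len(pre)


def sadp_update(w, pre, post, lr=0.01, baseline=0.5):
    """Potentiate when agreement exceeds chance level, depress otherwise."""
    return w + lr * (spike_agreement(pre, post) - baseline)


# Identical trains agree perfectly, so the weight increases;
# fully disagreeing trains drive it down.
w_up = sadp_update(0.5, [1, 0, 1, 1], [1, 0, 1, 1])
w_down = sadp_update(0.5, [1, 0, 1, 0], [0, 1, 0, 1])
```

Note that one pass over the spike trains suffices, which is where the linear time complexity mentioned above comes from.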

The researchers observed that traditional STDP models, which focus on pairwise spike interactions, often fail to capture the complexity of biological learning. The team therefore pioneered a shift toward a more global plasticity model, recognizing neuronal synchrony, the coordinated firing of groups of neurons, as a key driver of synaptic change. Rather than demanding a strict causal relationship between pre- and post-synaptic spikes, SADP quantifies how well their spike trains align, trading strict causality for correlational sensitivity. Population-level correlation metrics drive the synaptic weight updates, the rule admits an efficient bitwise-logic implementation, and experimental data from an iontronic organic memtransistor device were used to derive a spline-based kernel that further improves performance. Experiments on image-classification benchmarks show that SADP surpasses classical STDP in both accuracy and runtime while remaining robust to variations in spike timing and background noise.
The team shows that the SADP update rule runs in linear time, a marked improvement over STDP's quadratic complexity, making it particularly well suited to efficient hardware implementation.
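The bitwise-logic implementation mentioned above can be sketched as follows. Packing each spike train into the bits of an integer lets an XNOR plus a popcount count matching time bins in a single linear pass; the packing scheme here is an illustrative assumption, not the paper's exact circuit.

```python
def packed_agreement(pre_bits, post_bits, num_bins):
    """Agreement between two spike trains packed into integers.

    Bit i of `pre_bits`/`post_bits` holds the spike (1) or silence (0)
    in time bin i. XNOR marks the bins where the two trains agree, and
    the popcount of that result gives the match total in one pass,
    i.e. linear in the number of bins.
    """
    mask = (1 << num_bins) - 1
    matches = (~(pre_bits ^ post_bits)) & mask  # XNOR: 1 where bins agree
    return bin(matches).count("1") / num_bins


# Train [1, 0, 1, 1] packs to 0b1101 (bin 0 in the least-significant bit).
full_match = packed_agreement(0b1101, 0b1101, 4)   # identical trains
half_match = packed_agreement(0b1100, 0b1001, 4)   # agree in 2 of 4 bins
```

By contrast, pairwise STDP must weight every pre/post spike pair by its timing difference, which is what produces the quadratic cost the article contrasts against.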

Experiments on the image-classification benchmarks MNIST and Fashion-MNIST reveal that SADP consistently outperforms classical STDP in both accuracy and runtime across a range of network configurations and encoding schemes; because it captures a broader pattern of neural activity, it is also less susceptible to noise. The researchers additionally created a spline-based kernel that incorporates device-specific data from an experimental iontronic organic memtransistor, enhancing SADP's performance and its compatibility with emerging neuromorphic hardware. The team demonstrates that SADP achieves competitive accuracy and faster training than both Hebbian and STDP learning rules, especially when spline or linear kernels are paired with rate-coding schemes, and that it provides a pathway for integrating advances in physical devices and algorithms. These findings establish SADP as a promising, efficient alternative to existing SNN training methods, balancing theoretical performance with practical implementation. Although the current work focuses on shallow networks, the authors acknowledge that extending SADP to deeper architectures is a challenge and will require mechanisms to maintain signal integrity between layers. Future research will explore adaptive kernel learning, reward-modulated variants of SADP, and attention-based agreement mechanisms to enable deeper, more flexible learning architectures.
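A device-derived kernel of the kind described above can be approximated by interpolating measured points. The sketch below uses plain piecewise-linear interpolation as a stand-in for the paper's spline fit, and the knot values mapping an agreement score to a weight change are invented for illustration, not taken from the memtransistor data.

```python
from bisect import bisect_right


def piecewise_linear_kernel(knots, values):
    """Build a kernel from measured (x, y) points by linear interpolation.

    `knots` must be sorted. Inputs outside the measured range are
    clamped to the endpoint values. A spline fit, as used in the paper,
    would smooth between the same points; linear segments keep this
    sketch dependency-free.
    """
    def kernel(x):
        if x <= knots[0]:
            return values[0]
        if x >= knots[-1]:
            return values[-1]
        i = bisect_right(knots, x) - 1
        t = (x - knots[i]) / (knots[i + 1] - knots[i])
        return values[i] + t * (values[i + 1] - values[i])
    return kernel


# Hypothetical mapping: low agreement depresses, high agreement potentiates.
dw = piecewise_linear_kernel([0.0, 0.5, 1.0], [-0.02, 0.0, 0.03])
```

In this setup the weight update for a synapse would be `dw(agreement_score)`, so swapping kernels (linear, spline, device-derived) changes the plasticity curve without touching the rest of the learning loop.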

👉Details
🗞 Spike Agreement-Dependent Plasticity: A Scalable Bio-Inspired Learning Paradigm for Spiking Neural Networks
🧠arxiv: https://arxiv.org/abs/2508.16216


