Machine learning enables high-precision Rényi entanglement entropy estimation on large 3D lattices

Machine Learning


Calculating quantum entanglement in complex systems typically relies on intricate mathematical techniques and remains a formidable challenge for physicists. Andrea Bulgarelli, Elia Cellini and Karl Jansen, together with colleagues from institutions such as the University of Bonn, the University of Edinburgh and the Cyprus Institute, demonstrate powerful new approaches based on deep learning. Their work significantly improves on traditional Monte Carlo simulations, enabling remarkably accurate calculations of entanglement in three-dimensional systems, even for very large and complex lattices. This advance not only enhances our ability to study fundamental aspects of quantum field theory, but also introduces a new way to investigate lattice defects using flow-based sampling techniques, opening new avenues for research in this field.

The team develops new methods to quantify and characterize quantum correlations, focusing on scenarios where traditional calculations become impractical. By employing advanced numerical techniques, they extend the scope of entanglement calculations to larger systems and more realistic physical conditions, bridging the gap between theoretical predictions and experimental observations. They introduce a new methodology to efficiently calculate the Rényi entropy, a key measure of entanglement (defined below), investigate the relationship between entanglement and other physical properties such as energy transport and thermal conductivity, reveal how quantum correlations influence the behavior of complex quantum materials, and pave the way for new quantum technologies.
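For context, the order-n Rényi entropy is a standard quantity defined from the reduced density matrix of a subsystem; via the usual replica trick it becomes a ratio of partition functions, which is what lattice Monte Carlo and flow-based samplers actually estimate. The following is the textbook definition in generic notation (not necessarily the conventions used in the paper):

```latex
% Order-n Rényi entropy of a subsystem A with reduced density matrix \rho_A
S_n(A) = \frac{1}{1-n} \ln \operatorname{Tr} \rho_A^{\,n}

% Replica trick: \operatorname{Tr}\rho_A^{\,n} is a ratio of partition functions,
% where Z_n(A) is defined on an n-sheeted replica geometry cut along A
S_n(A) = \frac{1}{1-n} \ln \frac{Z_n(A)}{Z^{\,n}}
```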

Sampling entanglement entropy with stochastic normalizing flows

In this work, the researchers introduce a new approach to computing entanglement entropy in quantum field theory that uses machine learning techniques known as normalizing flows (NFs) and stochastic normalizing flows (SNFs). Their core innovation is to use SNFs to efficiently sample the configurations involved in computing entanglement entropy, rather than relying on traditional equilibrium simulations. By employing a deep generative model, the team outperformed standard Monte Carlo algorithms, in particular by taking advantage of a new defect-coupling layer (a sketch of such a layer is given below). This construction focuses the model on the relevant parts of the lattice, reduces training time, and shows that the model can learn fundamental local geometric features. While acknowledging that other methods may become more competitive for theories with significantly more degrees of freedom, the authors suggest that integrating defect-coupling layers into those methods could further improve performance, and propose future work on other entanglement-related quantities and a wider range of physical settings.
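The defect-coupling layer mentioned above restricts the learned transformation to the degrees of freedom near the cut that separates the replicas. Below is a minimal sketch of what such a masked affine coupling layer could look like for a scalar field, written in PyTorch; the class name, masking scheme, and network size are assumptions for illustration, not the paper's implementation.

```python
import math

import torch
import torch.nn as nn


class DefectCouplingLayer(nn.Module):
    """Affine coupling layer that only transforms lattice sites lying within
    `width` sites of the entangling cut (the 'defect') along the first lattice
    direction; all other sites pass through unchanged. Illustrative sketch,
    not the authors' actual architecture."""

    def __init__(self, lattice_shape, cut_position, width, parity):
        super().__init__()
        L = lattice_shape[0]
        coords = torch.stack(torch.meshgrid(
            *[torch.arange(n) for n in lattice_shape], indexing="ij"))
        # Checkerboard mask: alternate which sites are updated vs. conditioned on.
        checker = (coords.sum(dim=0) % 2 == parity).float()
        # Defect mask: 1 within `width` sites of the cut (periodic distance).
        dist = torch.minimum((coords[0] - cut_position).abs(),
                             L - (coords[0] - cut_position).abs())
        near_defect = (dist <= width).float()
        self.register_buffer("active", checker * near_defect)  # sites to update
        self.register_buffer("frozen", 1.0 - self.active)      # conditioning sites
        # Small network producing per-site log-scale and shift from frozen sites.
        n_sites = math.prod(lattice_shape)
        self.net = nn.Sequential(nn.Linear(n_sites, 256), nn.GELU(),
                                 nn.Linear(256, 2 * n_sites))

    def forward(self, phi):
        # phi: (batch, *lattice_shape) scalar field configurations
        batch = phi.shape[0]
        frozen_part = (phi * self.frozen).reshape(batch, -1)
        s, t = self.net(frozen_part).chunk(2, dim=-1)
        s = s.reshape(phi.shape) * self.active  # zero outside the defect region
        t = t.reshape(phi.shape) * self.active
        phi_out = phi * torch.exp(s) + t
        # Sites far from the defect are untouched and contribute nothing
        # to the log-Jacobian of the transformation.
        log_det = s.reshape(batch, -1).sum(dim=-1)
        return phi_out, log_det
```

In a stochastic normalizing flow, deterministic layers of this kind are interleaved with stochastic Monte Carlo updates; restricting the learned transformation to the defect region is what keeps the number of trainable parameters, and hence the training time, small.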

Entanglement calculations realized with normalizing flows

Researchers have achieved a breakthrough in computing quantum entanglement with deep generative models, in particular normalizing flows (NFs). The new approach significantly outperforms standard Monte Carlo algorithms and allows accurate estimation of Rényi entropies in three dimensions, even for very large lattices. The core innovation lies in a new method for studying lattice defects with flow-based sampling, which allows direct calculation of partition-function ratios, a key step in quantifying entanglement (see the estimator sketch below). Importantly, the team reduced the computational load by concentrating the transformation on a small subset of lattice degrees of freedom near the “defect”, achieving significant efficiency improvements compared to other flow-based samplers such as non-equilibrium Monte Carlo (NEMC) and stochastic normalizing flows (SNFs).
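“Direct calculation of partition-function ratios” is usually implemented as a reweighting estimator: configurations sampled from the theory without the defect are pushed through the flow and reweighted by the action difference plus the flow's log-Jacobian. The sketch below follows that assumption; the function names, arguments, and exact estimator form are illustrative, not taken from the paper.

```python
import math

import torch


def log_partition_ratio(flow, base_samples, action_defect, action_base):
    """Estimate log(Z_defect / Z_base) from configurations sampled in the
    defect-free ('base') theory. Illustrative sketch, not the paper's code.

    flow          : invertible map phi -> (phi', log|det J|), e.g. a stack of
                    defect-coupling layers as sketched above
    base_samples  : tensor of configurations phi ~ exp(-S_base) / Z_base
    action_defect : callable returning S_defect(phi') per configuration
    action_base   : callable returning S_base(phi) per configuration
    """
    phi = base_samples
    phi_prime, log_det_J = flow(phi)  # push samples toward the defect theory
    # Importance weights: w = exp(-S_defect(phi') + S_base(phi) + log|det J|),
    # whose expectation over base samples equals Z_defect / Z_base.
    log_w = -action_defect(phi_prime) + action_base(phi) + log_det_J
    # Log of the sample mean of the weights, computed in a numerically stable way.
    return torch.logsumexp(log_w, dim=0) - math.log(phi.shape[0])
```

The variance of such an estimator is governed by how well the flow maps the base theory onto the defect theory, which is why concentrating the transformation on the degrees of freedom near the defect pays off.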


