Machine learning outpaces supercomputers in simulating galaxy evolution and supernova explosions

Machine Learning


Researchers use AI to speed up processing times when simulating galaxy evolution

Comparison of ML with isolated SN simulations. Top: results from the direct numerical simulation. Bottom: results using the surrogate SN feedback model. Snapshot slices at t = 0, 10⁵, and 4×10⁵ years are shown from left to right. Credit: The Astrophysical Journal (2025). DOI: 10.3847/1538-4357/add689

Researchers have used machine learning to dramatically speed up the simulation of galaxy evolution and supernova explosions. The approach could help us understand the origins of our own galaxy and of the elements in the Milky Way that are essential for life.

The results are published in The Astrophysical Journal.

The team was led by Keiya Hirashima of the RIKEN Center for Interdisciplinary Theoretical and Mathematical Sciences (iTHEMS) in Japan, along with colleagues from the Max Planck Institute for Astrophysics (MPA) and the Flatiron Institute.

Understanding how galaxies form is a central problem for astrophysicists. We know that powerful events such as supernovae can drive galaxy evolution, but we cannot simply look at the night sky and watch it happen.

Instead, scientists rely on numerical simulations, built on vast amounts of data collected by telescopes and other instruments that measure aspects of interstellar space. The simulations must account for the complex interplay of gravity, fluid dynamics, and astrophysical chemistry.

In addition, the simulations must have high temporal resolution. In other words, the time between each 3D snapshot of the evolving galaxy must be short enough not to miss important events. For example, a timescale of just a few hundred years is needed to capture the early stages of a supernova shell's expansion, roughly 1,000 times finer than the resolution that suffices for a typical simulation of interstellar space.
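A quick back-of-the-envelope sketch shows why this matters for cost. The numbers below are illustrative, not taken from the paper: they only show the ratio implied by a few-hundred-year timestep near a young supernova remnant versus a much coarser step that is adequate for quiescent gas.

```python
# Illustrative back-of-the-envelope estimate (hypothetical numbers):
# gas near a young supernova remnant needs timesteps of order 100 years,
# while ordinary interstellar gas can be advanced in far larger steps.
dt_sn = 100.0        # yr, timestep needed to resolve early shell expansion
dt_typical = 1.0e5   # yr, timestep adequate for quiescent interstellar gas

# The whole simulation must march at the pace of its most demanding region,
# so one supernova forces ~1,000x more timesteps on everything.
extra_steps = dt_typical / dt_sn
print(extra_steps)  # -> 1000.0
```

This global slowdown from a handful of small, violent regions is exactly the bottleneck the surrogate model is meant to remove.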

In fact, a typical supercomputer takes one to two years to run a simulation of even a relatively small galaxy at the appropriate time resolution.







This animation shows the evolution of a simulated galaxy over 200 million years. The conventional simulation and the machine learning model produce very similar results, but the AI model runs four times faster, completing a large-scale simulation in months rather than years. Credit: RIKEN

Overcoming this time-step bottleneck was the main goal of the new research. By incorporating AI into a data-driven model, the research group was able to match the output of previously modeled dwarf galaxies while obtaining results much more quickly.

“With the AI model, simulations are about four times faster than standard numerical simulations,” says Hirashima.

“This corresponds to reducing the computational time from a few years to several months. Critically, our AI-assisted simulations were able to reproduce the dynamics important for capturing galaxy evolution and matter cycling, such as star formation and galactic outflows.”

Like most machine learning models, the researchers' new model is trained on one dataset so that it can predict outcomes for new data. In this case, the model incorporates a neural network and was trained on 300 simulations of an isolated supernova expanding through a molecular cloud with a mass of one million suns.

After training, the model could predict the gas density, temperature, and 3D velocities 100,000 years after a supernova explosion. Compared with direct numerical simulations, such as those performed by supercomputers, the new model produced similar structures and star formation histories while running about four times faster.
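The surrogate idea can be sketched in a few lines. This is a minimal illustration only: the network architecture, input features, and weights below are hypothetical (the weights are random and untrained), not the paper's actual model. It shows the shape of the mapping, from the local gas state at the moment of explosion to a predicted state 10⁵ years later, which replaces thousands of short hydrodynamic timesteps with a single network evaluation; in the actual work, the weights would be fitted to the 300 training simulations.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(x, layers):
    """Forward pass through a small fully connected network with tanh hidden layers."""
    for W, b in layers[:-1]:
        x = np.tanh(x @ W + b)
    W, b = layers[-1]
    return x @ W + b  # linear output layer

# Hypothetical feature vector per gas cell: [log density, log temperature, vx, vy, vz]
n_in, n_hidden, n_out = 5, 32, 5
layers = [
    (rng.normal(0.0, 0.1, (n_in, n_hidden)), np.zeros(n_hidden)),
    (rng.normal(0.0, 0.1, (n_hidden, n_hidden)), np.zeros(n_hidden)),
    (rng.normal(0.0, 0.1, (n_hidden, n_out)), np.zeros(n_out)),
]

# State of one gas cell near the explosion at t = 0 ...
state_t0 = np.array([2.0, 1.5, 0.0, 0.0, 0.0])
# ... and the surrogate's one-shot prediction for t = 1e5 yr.
state_t1e5 = mlp_forward(state_t0, layers)
print(state_t1e5.shape)  # (5,)
```

The design point is that the expensive physics is paid for once, during training, after which each supernova event costs only a cheap forward pass inside the larger galaxy simulation.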

According to Hirashima, “Our AI-assisted framework will enable high-resolution star-by-star simulations of massive galaxies such as the Milky Way, with the aim of predicting the origin of the solar system and of the elements essential for the birth of life.”

The lab is now using the new framework to run a Milky Way-sized galaxy simulation.

More information:
Keiya Hirashima et al, ASURA-FDPS-ML: Star-by-star Galaxy Simulations Accelerated by Surrogate Modeling, The Astrophysical Journal (2025). DOI: 10.3847/1538-4357/add689

Citation: Machine learning outpaces supercomputers in simulating galaxy evolution coupled with supernova explosions (July 3, 2025), retrieved July 6, 2025 from https://phys.org/news/2025-07-machine-cupcessspaces-supercomputers-simulating-galaxy.htmll

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.




