

Machine learning outpaces supercomputers for simulating galaxy evolution coupled with supernova explosions

Researchers use AI to speed up the processing time when simulating galaxy evolution
Comparison between simulations and ML for an isolated SN. Upper: the numerical simulation results. Lower: the numerical simulation results with our surrogate SN feedback model. Cross sections of snapshots at t = 0, 105, and 4 × 105 yr are given from left to right. Credit: The Astrophysical Journal (2025). DOI: 10.3847/1538-4357/add689

Researchers have used machine learning to dramatically speed up the processing time when simulating galaxy evolution coupled with supernova explosion. This approach could help us understand the origins of our own galaxy, particularly the elements essential for life in the Milky Way.

The findings are in The Astrophysical Journal.

The team was led by Keiya Hirashima at the RIKEN Center for Interdisciplinary Theoretical and Mathematical Sciences (iTHEMS) in Japan, along with colleagues from the Max Planck Institute for Astrophysics (MPA) and the Flatiron Institute.

Understanding how galaxies form is a central problem for astrophysicists. Although we know that powerful events like supernovae can drive galaxy evolution, we cannot simply look to the night sky and see it happen.

Scientists rely on numerical simulations that are based on large amounts of data collected from telescopes and other devices that measure aspects of interstellar space. Simulations must account for gravity and hydrodynamics, as well as other complex aspects of astrophysical thermochemistry.

On top of this, they must have a high temporal resolution, meaning that the time between each 3D snapshot of the evolving galaxy must be small enough so that critical events are not missed. For example, capturing the initial phase of supernova shell expansion requires a timescale of mere hundreds of years, which is 1,000 times smaller than typical simulations of interstellar space can achieve.

In fact, a typical supercomputer takes one to two years to carry out a simulation of a relatively small galaxy at the proper temporal resolution.

This animation shows the simulated evolution of a galaxy over 200 million years. While the simulations look very similar with and without the machine learning model, the AI model ran four times as fast, completing a large-scale simulation in a matter of months rather than years. Credit: RIKEN

Getting over this timestep bottleneck was the main goal of the new study. By incorporating AI into their data-driven model, the research group was able to match the output of a previously modeled dwarf galaxy but got the result much more quickly.

"When we use our AI model, the simulation is about four times faster than a standard numerical simulation," says Hirashima.

"This corresponds to a reduction of several months to half a year's worth of computation time. Critically, our AI-assisted simulation was able to reproduce the dynamics important for capturing matter cycles, including star formation and galactic outflows."

Like most machine learning models, the researchers' new model is trained on one set of data and then predicts outcomes for new data. In this case, the model incorporated a neural network and was trained on 300 simulations of an isolated supernova in a molecular cloud with a mass of one million suns.

After training, the model could predict the density, temperature, and 3D velocity of the gas 100,000 years after a supernova explosion. Compared with direct numerical simulations, such as those performed on supercomputers, the new model yielded similar structures and evolution history but required only a quarter of the computation time.
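The surrogate-modeling idea described above can be sketched in a few lines. The code below is a toy illustration, not the authors' ASURA-FDPS-ML framework: it uses synthetic data in place of real simulation snapshots, and a single least-squares fit stands in for neural-network training. The variable names and the linear mapping are assumptions made for the sketch.

```python
import numpy as np

# Toy surrogate model: learn a mapping from pre-supernova gas
# properties to post-supernova properties using snapshots from
# many training "simulations". Here the data are synthetic.
rng = np.random.default_rng(0)

# Inputs per gas cell: (density, temperature, vx, vy, vz) before
# the explosion; targets: the same quantities 100,000 yr later,
# generated here by an assumed linear rule plus noise.
n_samples, n_features = 300, 5
X = rng.normal(size=(n_samples, n_features))
true_W = rng.normal(size=(n_features, n_features))
Y = X @ true_W + 0.01 * rng.normal(size=(n_samples, n_features))

# "Training": one least-squares solve stands in for fitting a
# neural network to the simulation outputs.
W_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)

# "Inference" is a cheap matrix multiply -- replacing the expensive
# high-resolution simulation step is the whole point of a surrogate.
Y_pred = X @ W_hat
rel_err = np.linalg.norm(Y_pred - Y) / np.linalg.norm(Y)
print(f"relative training error: {rel_err:.4f}")
```

In the real framework a neural network replaces the linear fit, so the surrogate can capture the highly nonlinear shock expansion; the workflow, though, is the same: train once on expensive simulations, then predict cheaply inside the galaxy-scale run.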

According to Hirashima, "our AI-assisted framework will allow high-resolution star-by-star simulations of heavy galaxies, such as the Milky Way, with the goal of predicting the origin of the solar system and the elements essential for the birth of life."

Currently, the lab is using the new framework to run a Milky Way-sized galaxy simulation.

More information: Keiya Hirashima et al, ASURA-FDPS-ML: Star-by-star Galaxy Simulations Accelerated by Surrogate Modeling for Supernova Feedback, The Astrophysical Journal (2025). DOI: 10.3847/1538-4357/add689

Journal information: The Astrophysical Journal

Provided by RIKEN

Citation: Machine learning outpaces supercomputers for simulating galaxy evolution coupled with supernova explosion (2025, July 3) retrieved 4 July 2025 from /news/2025-07-machine-outpaces-supercomputers-simulating-galaxy.html
