Researchers at the RIKEN Center for Interdisciplinary Theoretical and Mathematical Sciences (iTHEMS) in Japan have achieved a significant milestone by creating a hyper-realistic simulation of the Milky Way Galaxy. Collaborating with colleagues from the University of Tokyo and the Universitat de Barcelona, the team successfully modeled more than 100 billion stars across a timespan of 10,000 years, surpassing previous simulations in both scale and speed.
This groundbreaking simulation is notable for representing 100 times more individual stars than earlier efforts, while also being produced 100 times faster. The researchers utilized a combination of 7 million CPU cores, advanced machine learning algorithms, and sophisticated numerical simulations to achieve this unprecedented level of detail. The findings were detailed in a paper titled “The First Star-by-star N-body/Hydrodynamics Simulation of Our Galaxy Coupling with a Surrogate Model,” published in the *Proceedings of the International Conference for High Performance Computing, Networking, Storage and Analysis* (SC ’25).
Advancements in Astrophysics and Supercomputing
The simulation’s ability to capture the dynamics of individual stars allows scientists to test theories regarding galactic formation, structure, and evolution. For decades, astronomers have grappled with the complexity of accurately modeling galaxies due to the numerous forces at play, including gravity, fluid dynamics, supernovae, and the influence of supermassive black holes (SMBHs). Traditionally, the mass limit for simulations has been around one billion solar masses, representing less than 1% of the Milky Way’s stellar population.
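Part of what makes star-by-star modeling so demanding is that direct gravity calculation scales as O(N²) in the number of bodies: every star pulls on every other. The toy sketch below (a generic direct-summation illustration in arbitrary units, not the team's code) shows the pairwise computation whose cost explodes at galactic scales; for 100 billion stars it would imply on the order of 10²² pair interactions per timestep, which is why production codes rely on tree or fast-multipole approximations.

```python
import numpy as np

def gravitational_accel(pos, mass, G=1.0, softening=1e-3):
    """Direct-summation gravitational acceleration for N bodies: O(N^2)."""
    # pos: (N, 3) positions; mass: (N,) masses
    diff = pos[None, :, :] - pos[:, None, :]        # (N, N, 3): r_j - r_i
    dist2 = (diff ** 2).sum(axis=-1) + softening**2  # softened squared distances
    inv_d3 = dist2 ** -1.5
    np.fill_diagonal(inv_d3, 0.0)                    # exclude self-interaction
    # a_i = G * sum_j m_j (r_j - r_i) / |r_j - r_i|^3
    return G * (diff * (mass[None, :, None] * inv_d3[:, :, None])).sum(axis=1)

rng = np.random.default_rng(0)
N = 1000                                 # already N^2 = 10^6 pair terms
pos = rng.standard_normal((N, 3))
mass = np.full(N, 1.0 / N)
acc = gravitational_accel(pos, mass)
print(acc.shape)                         # (1000, 3)
```

A quick sanity check on such a kernel is momentum conservation: by Newton's third law, the mass-weighted accelerations should sum to zero.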
Currently, state-of-the-art supercomputers require approximately 315 hours (more than 13 days) to simulate just 1 million years of galactic evolution, a mere 0.007% of the Milky Way's estimated age of 13.61 billion years. As a result, only large-scale events can be reproduced accurately, and simply adding more computing power does not close the gap.
To overcome these limitations, the research team employed an innovative approach built around a machine-learning surrogate model, an AI-driven shortcut that avoids burdening the primary simulation with extra computation. Trained on high-resolution simulations of supernovae, the model can predict the impact of these explosions on the surrounding gas and dust for up to 100,000 years after each event. By integrating this AI model with the physical simulation, the researchers achieved a simultaneous representation of both large-scale galactic dynamics and small-scale stellar phenomena.
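The team's actual surrogate is a deep-learning model trained on detailed supernova simulations, but the underlying idea, replacing an expensive high-resolution computation with a cheap learned predictor fitted to its outputs, can be sketched in miniature. Everything below is a hypothetical stand-in: the "expensive" function, the power-law form, and the log-space linear fit are illustrative only.

```python
import numpy as np

def expensive_supernova_sim(ambient_density):
    """Stand-in for a costly high-resolution run: maps local gas density
    to momentum injected into the surroundings (hypothetical power law)."""
    return 3.0e5 * ambient_density ** -0.14

# 1. Generate training data from the "expensive" model (done offline, once).
rng = np.random.default_rng(42)
density = rng.uniform(0.1, 100.0, 500)
momentum = expensive_supernova_sim(density)

# 2. Fit a cheap surrogate; here, ordinary least squares in log-log space.
X = np.column_stack([np.log(density), np.ones_like(density)])
coef, *_ = np.linalg.lstsq(X, np.log(momentum), rcond=None)

def surrogate(ambient_density):
    """Fast replacement the main simulation queries at each supernova."""
    return np.exp(coef[1] + coef[0] * np.log(ambient_density))

# The surrogate reproduces the expensive model at a fraction of the cost.
print(surrogate(10.0), expensive_supernova_sim(10.0))
```

The payoff is that the main simulation only ever pays the cost of the cheap predictor at runtime; the expensive computation is amortized into a one-time training phase.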
Efficiency and Future Applications
The performance of the new simulation method was validated through extensive testing on the Fugaku and Miyabi supercomputer systems. The results indicated that the approach can simulate galaxies containing over 100 billion stars at star-by-star resolution, modeling 1 million years of galactic evolution in just 2.78 hours. At this rate, an entire 1 billion years of galactic history could be simulated in approximately 115 days.
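The throughput figures quoted above (315 hours versus 2.78 hours per million simulated years) can be checked with back-of-envelope arithmetic:

```python
# Back-of-envelope check of the timings quoted in the article.
HOURS_PER_DAY = 24

old_hours_per_Myr = 315.0   # conventional approach
new_hours_per_Myr = 2.78    # surrogate-accelerated approach

# Speedup of the new method over the old one.
speedup = old_hours_per_Myr / new_hours_per_Myr
print(f"speedup: ~{speedup:.0f}x")            # ~113x

# Projecting 1 Gyr (1000 Myr) of evolution with the new method.
days_for_Gyr = new_hours_per_Myr * 1000 / HOURS_PER_DAY
print(f"1 Gyr in ~{days_for_Gyr:.1f} days")   # ~115.8 days
```

The projected 115.8 days matches the article's "approximately 115 days" figure, and the roughly 100-fold speedup is consistent with the claim of producing the simulation 100 times faster than earlier efforts.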
These advancements provide astronomers with a powerful tool for further investigating theories about galactic evolution and the formation of the universe. Additionally, the implementation of surrogate AI models demonstrates potential benefits for other complex simulations in various fields, such as meteorology, ocean dynamics, and climate science.
In summary, the successful creation of this hyper-realistic simulation represents a pivotal moment in astrophysics and computational science, paving the way for enhanced understanding of our galaxy and beyond.
