An international team of astronomers recently conducted the largest cosmological computer simulation to date, providing insights into the evolution of both ordinary and dark matter in our universe. Known as the FLAMINGO simulations, these calculations follow the evolution of all the components of the universe (ordinary matter, dark matter, and dark energy) according to the laws of physics. As the simulations run, virtual galaxies and clusters of galaxies emerge. Three papers have been published in the Monthly Notices of the Royal Astronomical Society: one detailing the methods, one presenting the simulations, and one analyzing how well the simulations reproduce the large-scale structure of the universe.
Observatories such as ESA’s Euclid space telescope and NASA’s James Webb Space Telescope amass vast quantities of data on stars, quasars, and galaxies. Simulations like FLAMINGO play a crucial role in interpreting these data scientifically by bridging the gap between the predictions of our theories of the universe and the observations. According to the theory, certain “cosmological parameters” (six, in the simplest version of the theory) set the properties of our entire universe. The values of these parameters can be measured very precisely in several ways. One such method relies on the cosmic microwave background (CMB), the faint afterglow that remains from the early stages of the universe. However, these values do not align with measurements obtained by other techniques based on gravitational lensing, the bending of light by the gravity of intervening matter such as galaxies and galaxy clusters. These inconsistencies, known as “tensions,” could challenge the standard model of cosmology, the Λ cold dark matter (ΛCDM) model. By providing insights into potential biases in the measurements, the computer simulations may help identify the cause of these tensions. If no explanation for the tensions is found, it could spell trouble for the theory.
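For concreteness, the six base parameters of the simplest (flat) ΛCDM model can be written out explicitly. The sketch below lists them with approximate best-fit values from Planck’s CMB measurements; the exact numbers depend on the dataset combination used, and some analyses trade the Hubble constant for an angular-scale parameter.

```python
# The six base parameters of the simplest (flat) Lambda-CDM model,
# with approximate best-fit values from Planck CMB data (illustrative;
# exact values depend on the analysis).
lcdm_parameters = {
    "omega_b_h2":  0.0224,  # physical baryon (ordinary matter) density
    "omega_c_h2":  0.120,   # physical cold dark matter density
    "H0":          67.4,    # expansion rate today [km/s/Mpc]
    "tau":         0.054,   # optical depth to reionization
    "ln_1e10_As":  3.045,   # amplitude of primordial fluctuations
    "n_s":         0.965,   # tilt of the primordial power spectrum
}

# Derived total matter fraction, with h = H0 / 100:
h = lcdm_parameters["H0"] / 100.0
omega_m = (lcdm_parameters["omega_b_h2"] + lcdm_parameters["omega_c_h2"]) / h**2
print(f"Omega_m = {omega_m:.2f}")  # ~0.31: matter is ~31% of the cosmic budget
```

Within the model, everything else (the age of the universe, the abundance of galaxy clusters, and so on) follows from these few numbers, which is why measuring them consistently across different probes matters so much.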
Currently, the computer simulations used for comparison with observations account only for cold dark matter. However, as research leader Joop Schaye of Leiden University points out, although dark matter dominates gravity, the contribution of ordinary matter cannot be ignored, because it might mitigate the disagreement between models and observations. Initial results suggest that accurate predictions require the inclusion of both neutrinos and ordinary matter, but these simulations have not yet resolved the tensions between different cosmological observations. Simulating ordinary (baryonic) matter is far more challenging and computationally demanding. While ordinary matter constitutes only 16% of the matter in the universe, it is subject not only to gravity but also to gas pressure, and winds driven by active black holes and supernovae can expel it from galaxies into intergalactic space. Predicting the strength of these intergalactic winds, which depends on explosions in the interstellar medium, presents a significant challenge. Additionally, the contribution of neutrinos, subatomic particles with a small but unknown mass, had not yet been simulated.
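The 16% figure follows directly from the measured density parameters: the baryon fraction is the ratio of the ordinary-matter density to the total matter density. A quick check using the approximate Planck values above (neglecting the small neutrino contribution):

```python
omega_b_h2 = 0.0224  # physical baryon density
omega_c_h2 = 0.120   # physical cold dark matter density

# Fraction of the total matter budget in ordinary (baryonic) matter:
f_baryon = omega_b_h2 / (omega_b_h2 + omega_c_h2)
print(f"baryon fraction = {f_baryon:.1%}")  # ~15.7%, i.e. roughly 16%
```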
To address these complexities, the researchers conducted a series of computer simulations that tracked the formation of structure in dark matter, ordinary matter, and neutrinos. Ph.D. student Roi Kugel of Leiden University explains that machine learning was employed to calibrate the effect of galactic winds: the predictions of many simulations of different volumes were compared against observations of galaxy masses and of the gas distribution in galaxy clusters. The team then simulated the model that best fits the calibration observations across a range of cosmic volumes and resolutions, and also varied the model’s parameters in simulations of slightly smaller, yet still sizeable, volumes. The largest simulation uses 300 billion resolution elements (particles with the mass of a small galaxy) in a cubic volume ten billion light-years on a side, and is touted as the largest cosmological computer simulation with ordinary matter ever completed. Making this simulation possible required the development of a new code, SWIFT, which distributes the computational work efficiently across 30,000 CPUs.
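To make the calibration step concrete, the sketch below shows the general emulator idea using a Gaussian process: train on a handful of expensive simulations, then predict the observable for untried subgrid parameters almost for free. The parameter layout, training data, and “observed” value here are hypothetical stand-ins; this is a schematic of the class of technique described, not the FLAMINGO team’s actual pipeline.

```python
# Schematic emulator-based calibration: a Gaussian process is trained on
# a small set of simulations run with different subgrid "wind" parameters,
# then predicts the observable (e.g. a galaxy stellar mass function value)
# for untried parameter values, so the best match to observations can be
# found without rerunning a simulation for every trial.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Hypothetical training set: each row is one calibration simulation,
# columns are subgrid parameters (e.g. wind energy, AGN heating).
theta_train = rng.uniform(0.0, 1.0, size=(32, 2))

# Hypothetical summary statistic measured from each training simulation.
y_train = np.sin(3.0 * theta_train[:, 0]) + 0.5 * theta_train[:, 1]

emulator = GaussianProcessRegressor(kernel=RBF(length_scale=0.3))
emulator.fit(theta_train, y_train)

# Scan a dense grid of parameter values and pick the one whose emulated
# prediction lies closest to the observed value.
grid = np.stack(np.meshgrid(np.linspace(0, 1, 100),
                            np.linspace(0, 1, 100)), axis=-1).reshape(-1, 2)
pred, sigma = emulator.predict(grid, return_std=True)
observed = 0.8  # hypothetical data point
best = grid[np.argmin((pred - observed) ** 2)]
print("best-fit subgrid parameters:", best)
```

The payoff is speed: each training point costs a full simulation, but once trained the emulator answers in milliseconds, so the wind parameters can be tuned against observations exhaustively.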
The FLAMINGO simulations open a virtual window into the universe, enabling a better understanding of cosmological observations. The abundance of virtual data not only creates new opportunities for theoretical discoveries but also provides a testbed for new data-analysis techniques, including machine learning. By employing machine learning, astronomers can make predictions for random virtual universes and compare them with observations of large-scale structure. This allows the cosmological parameters, and their corresponding uncertainties, to be measured precisely while taking the impact of galactic winds into account.

Further information on the FLAMINGO project can be found in the following papers in the Monthly Notices of the Royal Astronomical Society:
Joop Schaye et al., “The FLAMINGO project: cosmological hydrodynamical simulations for large-scale structure and galaxy cluster surveys,” DOI: 10.1093/mnras/stad2419.
Roi Kugel et al., “FLAMINGO: Calibrating large cosmological hydrodynamical simulations with machine learning,” DOI: 10.1093/mnras/stad2540.
Ian G. McCarthy et al., “The FLAMINGO project: revisiting the S8 tension and the role of baryonic physics,” DOI: 10.1093/mnras/stad3107.