Looking for cracks in the standard cosmological model
New computer simulations follow the formation of galaxies and the cosmic large-scale structure with unprecedented statistical precision
An international team of astrophysicists led by researchers from the Max Planck Institute for Astrophysics in Germany, Harvard University in the USA, and Durham University in the UK has presented an ambitious attempt to jointly simulate the formation of galaxies and cosmic large-scale structure throughout staggeringly large swaths of space. Their simulations now also take the ghostly neutrino particles into account and could help to constrain their mass. First results of their “MillenniumTNG” project have just been published in a series of 10 articles in the journal Monthly Notices of the Royal Astronomical Society. The new calculations help to subject the standard cosmological model to precision tests and to unravel the full power of upcoming new cosmological observations.
Over the past decades, cosmologists have gotten used to the perplexing conjecture that the universe’s matter content is dominated by enigmatic dark matter and that an even stranger dark-energy field acts as some kind of anti-gravity to accelerate the expansion of today’s cosmos. Ordinary baryonic matter makes up less than 5% of the cosmic mix, but this source material forms the basis for the stars and planets of galaxies like our own Milky Way. This seemingly strange cosmological model is known as LCDM. It provides a stubbornly successful description of a wide range of observational data, from the cosmic microwave background radiation – the relic heat left behind by the hot Big Bang – to the “cosmic web”, where galaxies are arranged along an intricate network of dark matter filaments. However, the real physical nature of dark matter and dark energy is still not understood, prompting astrophysicists to search for cracks in the LCDM theory. Identifying tensions with observational data could lead to a better understanding of these fundamental puzzles about our Universe. Such sensitive tests require two things: powerful new observational data and more detailed predictions of what the LCDM model actually implies.
Scientists at the Max Planck Institute for Astrophysics (MPA), together with an international team of researchers at Harvard University and Durham University, as well as York University in Canada and the Donostia International Physics Center in Spain, have now managed to take a decisive step forward on the latter challenge. Building on their previous successes with the “Millennium” and “IllustrisTNG” projects, they developed a new suite of simulation models dubbed “MillenniumTNG”, which traces the physics of cosmic structure formation with considerably higher statistical accuracy than was possible with previous calculations.
Large simulations including new physical details
The team utilized the advanced cosmological code GADGET-4, custom-built for this purpose, to compute the largest high-resolution dark matter simulations to date, covering a region nearly 10 billion light-years across. In addition, they employed the moving-mesh hydrodynamical code AREPO to follow the processes of galaxy formation directly, in volumes still so large that they can be considered representative of the universe as a whole. Comparing the two types of simulations allows a precise assessment of how baryonic processes related to supernova explosions and supermassive black holes affect the total matter distribution. Accurate knowledge of this distribution is key for correctly interpreting upcoming observations, such as so-called weak gravitational lensing effects, which respond to matter irrespective of whether it is dark or baryonic.
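To illustrate the idea behind such a comparison, the following minimal Python sketch (using toy synthetic density fields rather than actual simulation output, with purely illustrative grid and box sizes) measures the ratio of the matter power spectra of a “full physics” field and a matched dark-matter-only field; deviations of this ratio from unity quantify the baryonic impact on the matter distribution.

    # Illustrative sketch only (not MillenniumTNG analysis code): quantify baryonic
    # effects via the ratio of the matter power spectra of a "full physics" density
    # field and a matched dark-matter-only field. The fields below are synthetic
    # stand-ins; in practice they would be deposited from simulation particles and
    # gas cells onto a grid.
    import numpy as np

    def power_spectrum(delta, box_size, n_bins=20):
        """Spherically averaged power spectrum of an overdensity grid."""
        n = delta.shape[0]
        delta_k = np.fft.rfftn(delta) * (box_size / n) ** 3
        power = np.abs(delta_k) ** 2 / box_size ** 3
        kf = 2.0 * np.pi / box_size                      # fundamental mode of the box
        kx = np.fft.fftfreq(n, d=1.0 / n) * kf
        kz = np.fft.rfftfreq(n, d=1.0 / n) * kf
        kmag = np.sqrt(kx[:, None, None] ** 2 + kx[None, :, None] ** 2 + kz[None, None, :] ** 2)
        edges = np.linspace(kf, kmag.max(), n_bins + 1)
        index = np.digitize(kmag.ravel(), edges)
        pk = np.array([power.ravel()[index == i].mean() for i in range(1, n_bins + 1)])
        return 0.5 * (edges[1:] + edges[:-1]), pk

    rng = np.random.default_rng(42)
    n, box = 64, 500.0                                   # toy grid and box size (assumed)
    delta_dmo = rng.normal(size=(n, n, n))               # stand-in for the dark-matter-only field
    delta_hydro = 0.98 * delta_dmo + 0.02 * rng.normal(size=(n, n, n))   # stand-in for full physics

    k, pk_dmo = power_spectrum(delta_dmo, box)
    _, pk_hydro = power_spectrum(delta_hydro, box)
    print(np.round(pk_hydro / pk_dmo, 3))                # deviations from 1 measure the baryonic impact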
Furthermore, the team included massive neutrinos in their simulations, for the first time in calculations big enough to allow meaningful cosmological mock observations. Previous cosmological simulations had usually omitted them for simplicity, because they make up at most 1-2% of the dark matter mass and because their nearly relativistic velocities mostly prevent them from clumping together. Now, however, upcoming cosmological surveys (such as those of the recently launched Euclid satellite of the European Space Agency) will reach a precision that allows a detection of the associated percent-level effects. This raises the tantalizing prospect of constraining the neutrino mass itself, a profound open question in particle physics, so the stakes are high.
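A simple back-of-the-envelope estimate shows why these are percent-level effects. Using the standard relation Omega_nu h^2 = sum(m_nu) / 93.14 eV together with illustrative, assumed values for the neutrino mass sum and the cosmological parameters:

    # Back-of-the-envelope estimate with assumed (illustrative) values, using the
    # standard relation Omega_nu * h^2 = sum(m_nu) / 93.14 eV.
    h = 0.68          # dimensionless Hubble parameter (assumed)
    omega_m = 0.31    # total matter density parameter (assumed)
    sum_m_nu = 0.1    # neutrino mass sum in eV (assumed)

    f_nu = (sum_m_nu / 93.14) / h ** 2 / omega_m
    print(f"neutrino fraction of the matter density: {f_nu:.1%}")        # about 0.7%

    # Common linear-theory rule of thumb: small-scale matter clustering is
    # suppressed by roughly 8 * f_nu, i.e. a few percent for this mass sum.
    print(f"approximate small-scale power suppression: {8 * f_nu:.1%}")  # about 6%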
For the groundbreaking MillenniumTNG simulations, the researchers made efficient use of two extremely powerful supercomputers, the SuperMUC-NG machine at the Leibniz Supercomputing Centre (LRZ) in Garching, and the Cosma8 machine hosted by Durham University on behalf of the UK’s DiRAC High-Performance Computing facility. More than 120,000 computer cores toiled away for nearly two months at SuperMUC-NG, using computing time awarded by the German Gauss Centre for Supercomputing (GCS), to produce the most comprehensive hydrodynamical simulation model to date. MillenniumTNG tracks the formation of about one hundred million galaxies in a region of the universe around 2400 million light-years across (see Figure 1). This calculation is about 15 times bigger than the previous record holder in this category, the TNG300 model of the IllustrisTNG project.
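The quoted factor of roughly 15 can be cross-checked from the box sizes alone, assuming a side length of about 300 Mpc (roughly 1000 million light-years) for TNG300:

    # Rough cross-check of the quoted factor; the TNG300 side length is an assumption.
    mtng_side = 2400.0    # MillenniumTNG hydro box side in millions of light-years (from the text)
    tng300_side = 985.0   # TNG300 box side in millions of light-years (assumed, ~300 Mpc)
    print((mtng_side / tng300_side) ** 3)   # roughly 14.5, i.e. about 15 times the volume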
Using Cosma8, the team computed an even bigger volume of the universe, filled with more than a trillion dark matter particles and more than 10 billion particles for tracking massive neutrinos (see Figure 2). Even though this simulation did not follow the baryonic matter directly, its galaxy content can be accurately predicted with a semi-analytic model that is calibrated against the ‘full physics’ calculation of the project. This procedure yields a detailed distribution of galaxies and matter in a volume that for the first time is large enough to be representative of the universe as a whole, putting comparisons to upcoming observational surveys on a sound statistical basis.
Theoretical predictions for cosmology
The first results of the MillenniumTNG project reveal a wealth of new theoretical predictions that reinforce the importance of computer simulations in modern cosmology. The team has written and submitted ten introductory scientific papers for the project. Eight of them have just appeared simultaneously in the journal MNRAS; the remaining two will follow shortly.
One of the studies examined the shapes of galaxies. Nearby galaxies have a subtle tendency to orient their shapes in similar directions instead of pointing randomly, an effect called “intrinsic galaxy alignments”. This poorly understood effect distorts inferences based on weak gravitational lensing, because lensing produces a very similar statistical alignment signal of its own. The MillenniumTNG project could, for the first time, measure intrinsic alignments with very high signal-to-noise directly from the shapes of the simulated galaxies, out to distances of several hundred million light-years. “Perhaps our determination of the intrinsic alignment of galaxy orientations can help to resolve the current discrepancy between the amplitude of matter clustering inferred from weak lensing and from the cosmic microwave background”, says PhD student Ana Maria Delgado, first author of this study by the MillenniumTNG team. Using these results, astronomers will be able to correct for this important systematic effect much better.
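As a toy illustration of what such an alignment measurement involves (this is not the estimator used in the MillenniumTNG analysis, and the galaxies below are randomly oriented), one can average, over galaxy pairs, the angle between a galaxy’s projected major axis and the direction towards its neighbour:

    # Toy sketch of a position-shape alignment statistic (not the MillenniumTNG
    # pipeline): for each galaxy pair closer than r_max, take the angle between one
    # galaxy's projected major axis and the direction to its neighbour and average
    # cos(2*theta). Zero means random orientations; a positive value means shapes
    # preferentially point towards neighbouring galaxies.
    import numpy as np

    rng = np.random.default_rng(1)
    n_gal = 2000
    pos = rng.uniform(0.0, 100.0, size=(n_gal, 2))   # projected positions (toy units)
    phi = rng.uniform(0.0, np.pi, size=n_gal)        # major-axis position angles (random here)

    r_max = 10.0
    signal, pairs = 0.0, 0
    for i in range(n_gal):
        d = pos - pos[i]
        r = np.hypot(d[:, 0], d[:, 1])
        sel = (r > 0.0) & (r < r_max)
        theta = np.arctan2(d[sel, 1], d[sel, 0]) - phi[i]   # axis relative to separation direction
        signal += np.cos(2.0 * theta).sum()
        pairs += sel.sum()

    print(signal / pairs)   # consistent with zero for the randomly oriented toy galaxies

In the actual project, the corresponding statistics are built from the shapes of the simulated galaxies themselves, which is what yields the high signal-to-noise mentioned above.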
Another timely result concerns the recent discovery with the James Webb Space Telescope of a population of very massive galaxies in the young universe. The masses of these galaxies are unexpectedly large for such a brief time after the Big Bang, seemingly defying theoretical expectations. Dr. Rahul Kannan analyzed the predictions of MillenniumTNG for this early epoch. While the simulations agree with the observations out to redshift z=10 (when the universe was less than 500 million years old), he confirmed that the new JWST results at even higher redshift will, if they hold up, be in conflict with the simulation predictions. “Perhaps star formation is much more efficient shortly after the Big Bang than at later times, or maybe massive stars are formed in higher proportions back then, making these galaxies unusually bright”, explains Dr. Kannan.
Other papers from the team’s initial analysis focus on the clustering signals of galaxies. For example, MPA PhD student Monica Barrera produced extremely large and highly realistic mock catalogues of galaxies on the past “lightcone” of a fiducial observer (see Figure 3). In such a catalogue, galaxies that are more distant are also automatically younger, reflecting the travel time of the light that reaches our telescopes. Using these virtual observations, she studied the so-called baryonic acoustic oscillation (BAO) feature (which provides a cosmologically important standard ruler) in the projected two-point correlation function of galaxies. Her results showed that measuring these BAOs is a fairly tricky endeavour that can be significantly influenced by so-called cosmic variance effects – even when extremely large volumes are studied in observational surveys. While in simulations one can observe the modelled universe from different vantage points to recover the correct statistical ensemble average, this is unfortunately not readily possible for the real Universe. “The MillenniumTNG simulations are so big and contain so many galaxies, more than 1 billion in the biggest calculation, that it was really hard to study them”, says Monica Barrera. “Analysis scripts that work just fine for smaller simulations tend to take forever for MillenniumTNG.”
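For readers curious what such a clustering measurement looks like in practice, here is a minimal, self-contained sketch (not the project’s analysis code) of the widely used Landy-Szalay estimator for the two-point correlation function, applied to toy, unclustered data; BAO analyses search for a characteristic bump in this statistic at separations of a few hundred million light-years.

    # Minimal sketch (not the project's code) of the Landy-Szalay estimator for the
    # two-point correlation function, xi(r) = (DD - 2*DR + RR) / RR, where DD, DR and
    # RR are normalised pair counts for the data and a random comparison catalogue.
    import numpy as np
    from scipy.spatial import cKDTree

    def pair_counts(tree_a, tree_b, edges):
        """Differential pair counts between two point sets in separation bins."""
        cumulative = tree_a.count_neighbors(tree_b, edges)
        return np.diff(cumulative).astype(float)

    rng = np.random.default_rng(0)
    box = 1000.0
    data = rng.uniform(0.0, box, size=(5000, 3))     # toy "galaxies" (unclustered here)
    rand = rng.uniform(0.0, box, size=(20000, 3))    # random comparison catalogue

    edges = np.linspace(10.0, 200.0, 20)             # separation bins (toy units)
    dtree, rtree = cKDTree(data), cKDTree(rand)

    dd = pair_counts(dtree, dtree, edges) / len(data) ** 2
    dr = pair_counts(dtree, rtree, edges) / (len(data) * len(rand))
    rr = pair_counts(rtree, rtree, edges) / len(rand) ** 2

    xi = (dd - 2.0 * dr + rr) / rr
    print(np.round(xi, 3))   # consistent with zero for this unclustered toy catalogue

Pair counting of this kind scales steeply with catalogue size, which hints at why analysis scripts that work fine for smaller simulations become painfully slow for a catalogue of more than a billion galaxies.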
Analyzing cosmological data
The flurry of first results from the MillenniumTNG simulations makes it clear that they will be of great help in designing better strategies for the analysis of upcoming cosmological data. The team’s principal investigator, Prof. Volker Springel from MPA, argues that “MillenniumTNG combines recent advances in simulating galaxy formation with the field of cosmic large-scale structure, allowing an improved theoretical modelling of the connection of galaxies to the dark matter backbone of the Universe. This may well prove instrumental for progress on key questions in cosmology, such as how the mass of neutrinos can be best constrained with large-scale structure data.” The MillenniumTNG simulations produced more than 3 petabytes of simulation data, forming a rich asset for further research that will keep the participating scientists busy for many years to come.