A research team from the Kavli Institute for the Physics and Mathematics of the Universe has developed a simulation-based method for analysing the three-dimensional distribution of galaxies in the Universe.
By applying a machine-learning technique based on neural networks to enormous amounts of simulation data on the formation of cosmic structures in the Universe, the team has developed a fast and efficient software program that makes theoretical predictions about structure formation. By comparing these model predictions to actual observational datasets, the team succeeded in accurately measuring cosmological parameters, reports a study in Physical Review D.
When the biggest galaxy survey to date, the Sloan Digital Sky Survey (SDSS), created a three-dimensional map of the Universe from the observed distribution of galaxies, it became clear that the distribution had distinctive characteristics. Galaxies would clump together or spread out in filaments, and in some places there were voids containing no galaxies at all. These features demonstrate that galaxies did not evolve uniformly; they formed under the influence of their local environment.
Observing the three-dimensional map of galaxies
Researchers generally agree that this non-uniform distribution of galaxies is caused by the gravitational influence of the distribution of ‘invisible’ dark matter, the mysterious matter that no one has yet directly observed. By studying the three-dimensional map of galaxies in detail, the research team was able to constrain fundamental quantities, such as the amount of dark matter in the Universe.
Recently, N-body simulations have been widely used to recreate the formation of cosmic structures in the Universe. These simulations represent the initial inhomogeneities at high redshift with a large number of N-body particles, which effectively stand in for dark matter, and then follow how the dark matter distribution evolves over time by computing the gravitational forces between the particles in an expanding Universe. However, such simulations are expensive, typically taking tens of hours on a supercomputer, even for a single cosmological model.
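The core of the N-body approach described above can be sketched in a few lines: every particle feels the gravitational pull of every other particle, and the system is advanced in small time steps. This is a minimal toy illustration of the idea, not the team's actual pipeline; the constants, the softening length, and the direct O(N²) summation (real codes use far faster approximations) are all illustrative assumptions.

```python
# Toy direct-summation N-body step (illustrative only; real cosmological
# codes use tree or particle-mesh methods and comoving coordinates).
G = 1.0           # gravitational constant in arbitrary code units (assumption)
SOFTENING = 0.05  # softening length to avoid divergent forces at r -> 0

def accelerations(positions, masses):
    """O(N^2) pairwise gravitational accelerations in 3D."""
    n = len(positions)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = [positions[j][k] - positions[i][k] for k in range(3)]
            r2 = sum(d * d for d in dx) + SOFTENING ** 2
            inv_r3 = r2 ** -1.5
            for k in range(3):
                acc[i][k] += G * masses[j] * dx[k] * inv_r3
    return acc

def leapfrog_step(positions, velocities, masses, dt):
    """Advance the system by one kick-drift-kick leapfrog step."""
    acc = accelerations(positions, masses)
    for i in range(len(positions)):
        for k in range(3):
            velocities[i][k] += 0.5 * dt * acc[i][k]  # half kick
            positions[i][k] += dt * velocities[i][k]  # drift
    acc = accelerations(positions, masses)
    for i in range(len(positions)):
        for k in range(3):
            velocities[i][k] += 0.5 * dt * acc[i][k]  # half kick
    return positions, velocities
```

Repeating this step millions of times for billions of particles is what makes each simulation so expensive, which is the bottleneck the emulator described below is designed to avoid.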
This team of researchers was led by Project Researcher Yosuke Kobayashi, a former scientist at the Kavli Institute for the Physics and Mathematics of the Universe (Kavli IPMU) – who is currently a Postdoctoral Research Associate at The University of Arizona – and Professor Masahiro Takada, Kavli IPMU, as well as Kavli IPMU Visiting Scientists Takahiro Nishimichi and Hironao Miyatake.
Generating theoretical calculations of the power spectrum
Scientists combined machine learning with numerical simulation data generated on the supercomputer ‘ATERUI II’ at the National Astronomical Observatory of Japan (NAOJ) to produce theoretical calculations of the power spectrum. The power spectrum is the most fundamental quantity measured by galaxy surveys; it tells researchers statistically how galaxies are distributed in the Universe.
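The power spectrum measures how much structure a density field contains on each length scale. A minimal one-dimensional illustration of the idea, using a plain discrete Fourier transform, is sketched below; in practice this analysis is done in three dimensions with FFTs on the galaxy density field, and everything here is a toy assumption rather than the survey's actual estimator.

```python
import cmath
import math

def power_spectrum(density):
    """P(k) = |delta_k|^2 / N for each Fourier mode k of the overdensity."""
    n = len(density)
    mean = sum(density) / n
    delta = [d / mean - 1.0 for d in density]  # overdensity field
    spectrum = []
    for k in range(n // 2 + 1):
        mode = sum(delta[x] * cmath.exp(-2j * cmath.pi * k * x / n)
                   for x in range(n))
        spectrum.append(abs(mode) ** 2 / n)
    return spectrum

# A field with a single sinusoidal wave puts all its power in one mode:
field = [1.0 + 0.5 * math.cos(2 * math.pi * 3 * x / 16) for x in range(16)]
spec = power_spectrum(field)
```

A clustered galaxy distribution shows excess power on the scales where galaxies preferentially clump, which is why the power spectrum is such a compact statistical summary of a 3D map.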
Computing the power spectrum across the full range of candidate cosmological models would ordinarily require running millions of N-body simulations. Instead, the research team used machine learning to teach their program to calculate the power spectrum at the same level of accuracy as a simulation, even for cosmological models for which no simulation had been run. This kind of trained surrogate is known as an emulator and is already used in computational fields outside astronomy.
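The emulator idea can be illustrated with a toy example: run the expensive model only at a few training points, then answer new queries by fast interpolation. The `expensive_model` below is an analytic stand-in for an N-body calculation (the team's actual emulator is a neural network trained on simulation outputs), and all function names, parameter choices, and the simple linear interpolation are illustrative assumptions.

```python
def expensive_model(omega_m, k):
    """Stand-in for a simulated power spectrum P(k | Omega_m) (toy shape)."""
    return omega_m ** 2 / (1.0 + (k / omega_m) ** 2)

def build_emulator(param_grid, k_values):
    """Precompute the model on a parameter grid (the slow, offline step)."""
    table = {p: [expensive_model(p, k) for k in k_values] for p in param_grid}
    def emulate(omega_m):
        """Fast online prediction by linear interpolation in Omega_m."""
        grid = sorted(param_grid)
        if omega_m <= grid[0]:
            return table[grid[0]]
        for lo, hi in zip(grid, grid[1:]):
            if omega_m <= hi:
                w = (omega_m - lo) / (hi - lo)
                return [(1 - w) * a + w * b
                        for a, b in zip(table[lo], table[hi])]
        return table[grid[-1]]
    return emulate

k_values = [0.01, 0.1, 1.0]
emulate = build_emulator([0.2, 0.3, 0.4], k_values)
prediction = emulate(0.31)  # instant; no new simulation needed
```

The payoff is that a parameter inference, which must evaluate the model at thousands or millions of trial parameter values, only ever pays the simulation cost once, during training.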
“By combining machine learning with numerical simulations, which cost a lot, we have been able to analyse data from astronomical observations with high precision. These emulators have been used in cosmology studies before, but hardly anyone has been able to account for the numerous other effects that would compromise cosmological parameter results using real galaxy survey data. Our emulator does, and it has been able to analyse real observational data. This study has opened up a new frontier in large-scale structure data analysis,” explained lead author Kobayashi.
Galaxy bias uncertainty
However, to apply the emulator to actual galaxy survey data, the team had to account for ‘galaxy bias’ uncertainty. This causes ambiguity because researchers cannot accurately predict where galaxies form in the Universe, owing to the complicated physics inherent in galaxy formation. To overcome this obstacle, the research team focused on simulating the distribution of dark matter ‘halos’, regions of high dark matter density where galaxies are likely to form. By introducing a sufficient number of ‘nuisance’ parameters to absorb the galaxy bias uncertainty, the scientists succeeded in making flexible model predictions for a given cosmological model.
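The nuisance-parameter idea can be sketched with a toy one-parameter bias model: the predicted galaxy power spectrum is treated as a scaled version of the halo prediction, P_gal(k) = b² P_halo(k), and the bias b is fitted jointly with the cosmological parameter, then discarded. This is a deliberately simplified illustration, the actual analysis uses several nuisance parameters and a full likelihood; the functions, parameter ranges, and brute-force grid search below are all assumptions for demonstration.

```python
def p_halo(amplitude, k):
    """Toy halo power spectrum controlled by one 'cosmological' amplitude."""
    return amplitude / (1.0 + (k / amplitude) ** 2)

def chi2(observed, k_values, amplitude, bias, sigma=0.05):
    """Goodness of fit of the biased model b^2 * P_halo to the data."""
    return sum(((obs - bias ** 2 * p_halo(amplitude, k)) / sigma) ** 2
               for obs, k in zip(observed, k_values))

def fit(observed, k_values):
    """Brute-force joint fit over (amplitude, bias); bias is the nuisance."""
    best = None
    for a100 in range(50, 151):        # amplitude in [0.50, 1.50]
        for b100 in range(100, 301):   # bias in [1.00, 3.00]
            a, b = a100 / 100.0, b100 / 100.0
            c = chi2(observed, k_values, a, b)
            if best is None or c < best[0]:
                best = (c, a, b)
    return best[1], best[2]  # report cosmology; bias is marginalised away

# Mock 'observed' galaxy spectrum built with a known truth:
k_values = [0.05, 0.1, 0.2, 0.4]
truth_amp, truth_bias = 1.0, 2.0
observed = [truth_bias ** 2 * p_halo(truth_amp, k) for k in k_values]
amp_fit, bias_fit = fit(observed, k_values)
```

Because the bias changes only the overall amplitude while the cosmological parameter also changes the spectrum's shape, the two can be disentangled, which is what lets the cosmology be measured despite not knowing exactly where galaxies form.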
Following this, the research team compared the model prediction to an SDSS dataset and successfully measured cosmological parameters to high precision. As an independent analysis, the result confirms that only approximately 30% of the Universe’s total energy comes from matter (mainly dark matter), with the remaining 70% attributable to the dark energy driving the accelerated expansion of the Universe.
Measuring the ‘clumpiness’ of the Universe
Additionally, the scientists succeeded in measuring the clumpiness of matter in the Universe, whereas the conventional method of analysing 3D galaxy maps could not determine the matter fraction and the clumpiness simultaneously. The precision of the team’s parameter measurements exceeds that of previous galaxy survey analyses. These results demonstrate the effectiveness of the emulator developed in this study.
Next, the scientists intend to continue probing the mass of dark matter and the nature of dark energy by applying their emulator to galaxy maps that will be captured by the Prime Focus Spectrograph, an instrument currently under development, led by the Kavli IPMU, to be mounted on NAOJ’s Subaru Telescope.