Carnegie Mellon University: Weak Gravitational Lensing Tests the Cosmological Model
An international team of cosmologists and astrophysicists, led by Princeton University and the astronomical communities of Japan and Taiwan, and including researchers from Carnegie Mellon University, used precision measurements of the universe’s matter distribution to find that the universe is slightly less “clumpy” than the standard cosmological model predicts. Their findings, which could lead to a better understanding of dark matter, use the Year 3 data from the Hyper Suprime-Cam survey and are presented in a series of five papers available on arXiv.
Dark energy and dark matter together make up 95% of the universe. Because dark matter can’t be seen, it can’t be measured directly. Instead, researchers must infer its properties by measuring its effects on visible objects like galaxies and stars.
One way this is done is by measuring a phenomenon called weak gravitational lensing. As the universe has expanded since the Big Bang, dark matter and galaxies have been drawn together by gravity, resulting in a clumpy distribution of matter throughout the universe. These clumps of matter exert a gravitational pull that bends light as it travels from distant galaxies toward Earth. As a result, when galaxies are observed by telescopes, the resulting images are slightly distorted. By measuring these distortions, researchers can learn more about the distribution of matter in the universe and the nature of dark matter and dark energy.
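To make the statistics concrete, here is a minimal toy sketch (illustrative numbers only, not the collaboration’s pipeline) of why so many galaxies are needed: each galaxy’s intrinsic shape is random noise much larger than the lensing distortion, so the coherent signal only emerges after averaging over very many shapes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative numbers only: a coherent lensing shear of ~1% hidden in
# per-galaxy "shape noise" of ~26%, typical orders of magnitude for
# weak lensing.
true_shear = 0.01
shape_noise = 0.26
n_gal = 1_000_000

observed = true_shear + rng.normal(0.0, shape_noise, n_gal)

# One galaxy tells us almost nothing, but the average over the sample
# recovers the tiny coherent distortion.
estimate = observed.mean()
uncertainty = shape_noise / np.sqrt(n_gal)
print(f"recovered shear: {estimate:.5f} +/- {uncertainty:.5f}")
```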
The new papers use data from the Hyper Suprime-Cam (HSC) sky survey, a wide-field imaging survey carried out with Japan’s 8.2-meter Subaru Telescope on the summit of Maunakea in Hawaii. The data set includes measurements of 25 million galaxies as they appeared billions of years ago. With measurements from so many galaxies, the researchers were able to perform a very precise analysis of weak gravitational lensing using a combination of sophisticated computer simulations and observations from the HSC.
They found the value for the clumpiness of the universe’s dark matter, a number referred to as S8, to be 0.78. While this number aligns with what other recent gravitational lensing surveys have found, it does not align with the value of 0.83 derived from the cosmic microwave background (CMB), the radiation emitted in the earliest days of the universe.
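For reference, S8 folds together the amplitude of matter clustering (σ8) and the cosmic matter density (Ωm); the standard definition is:

```latex
S_8 \equiv \sigma_8 \sqrt{\Omega_{\mathrm{m}} / 0.3}
```

Weak lensing measurements are most sensitive to this particular combination of parameters, which is why lensing surveys report it.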
The results suggest that the difference between these two numbers may not be coincidental. It could indicate an unrecognized error in one of the two measurements, or that the standard cosmological model, called the Lambda Cold Dark Matter model, might be incomplete.
“The HSC weak lensing group has done a meticulous job of ensuring that our weak lensing results are robust, and there is about a 5% chance that the results disagree with the CMB only by chance,” said Rachel Mandelbaum, professor of physics and member of the McWilliams Center for Cosmology at Carnegie Mellon, and a member of the HSC collaboration. “It will be important to confirm this result with future data sets that can make the measurement even more precisely and to continue to refine our understanding of potential systematic biases. But this result is a tantalizing hint of potential physics beyond the Lambda Cold Dark Matter cosmological paradigm.”
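That “about a 5% chance” corresponds to roughly a two-sigma tension. A back-of-the-envelope check of what such a statement means, using purely illustrative error bars (the actual uncertainties are reported in the papers, not in this article):

```python
from math import erf, sqrt

# Illustrative error bars only -- the real uncertainties are reported in
# the HSC Year 3 papers, not in this article.
s8_lensing, sigma_lensing = 0.78, 0.025
s8_cmb, sigma_cmb = 0.83, 0.005

# Gaussian significance of the difference between the two measurements.
z = abs(s8_cmb - s8_lensing) / sqrt(sigma_lensing**2 + sigma_cmb**2)
p_chance = 1 - erf(z / sqrt(2))  # two-sided probability of a chance fluke
print(f"tension: {z:.1f} sigma, chance coincidence ~ {p_chance:.0%}")
```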
Three different analysis techniques were used on the HSC weak gravitational lensing data. The development and validation of the data catalog were led by Xiangchong Li while he was a doctoral student at the University of Tokyo. The analyses were blinded: the researchers could not compare results with each other, or even view their own results, until they had finished all of their sanity checks on the analysis. After unblinding, they were ecstatic to see that all of the methods yielded the same conclusions about S8.
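Conceptually, catalog blinding can be as simple as the sketch below: a calibration offset known to no analyst is baked into the shape catalog, so nobody can tell how their S8 compares with the CMB value until the offset is removed at the end. This is a schematic illustration, not the HSC implementation.

```python
import numpy as np

rng = np.random.default_rng()  # seed known only to the "blinder"

# Hidden multiplicative offset applied to every galaxy shear. Analysts
# run their full pipelines on the blinded catalog without knowing it.
hidden_m = rng.uniform(-0.05, 0.05)

def blind(shear):
    """Return the blinded version of a shear catalog (numpy array)."""
    return shear * (1.0 + hidden_m)

def unblind(blinded_shear):
    """Applied only once, after all validation checks are frozen."""
    return blinded_shear / (1.0 + hidden_m)
```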
Li, who is now a Carnegie Mellon postdoctoral fellow working with Mandelbaum, led the real space analysis. This analysis established how the images of galaxies have been lensed by matter, including dark matter, by measuring correlations between galaxy shapes at different distances, and therefore at different points in cosmic time.
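In practice, a real-space cosmic shear measurement boils down to computing shear-shear correlation functions in bins of angular separation. Below is a minimal sketch using TreeCorr, a package widely used for such measurements (the article does not say which software HSC used), run on a randomly generated stand-in catalog:

```python
import numpy as np
import treecorr

# Stand-in catalog: random positions and noise-dominated shapes. A real
# analysis would load measured positions and shears from the HSC catalog.
rng = np.random.default_rng(1)
n = 100_000
cat = treecorr.Catalog(
    ra=rng.uniform(0, 10, n), dec=rng.uniform(-5, 5, n),
    g1=rng.normal(0, 0.26, n), g2=rng.normal(0, 0.26, n),
    ra_units='deg', dec_units='deg')

# Shear-shear correlations xi_plus / xi_minus in angular bins, the basic
# observables of a real-space cosmic shear analysis.
gg = treecorr.GGCorrelation(min_sep=1.0, max_sep=100.0, nbins=15,
                            sep_units='arcmin')
gg.process(cat)
print(gg.meanr)  # bin centers (arcmin)
print(gg.xip)    # xi_plus
print(gg.xim)    # xi_minus
```

In a real analysis, xi_plus and xi_minus would be measured between tomographic redshift bins and compared with theoretical predictions for different cosmological parameters.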
Other papers used a Fourier space analysis, which maps galaxy shapes and measures the power spectrum of the dark matter density field in Fourier space, and a 3x2pt analysis, which combines three kinds of two-point correlations (shape with shape, shape with galaxy position, and position with position) to constrain cosmological parameters, pairing the galaxy shape data collected by HSC with the density distribution of foreground galaxies from the Baryon Oscillation Spectroscopic Survey (BOSS).
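The Fourier space counterpart estimates an angular power spectrum rather than a correlation function. A toy flat-sky sketch of the mechanics (a real analysis must also handle survey masks, noise bias, and E/B-mode separation, none of which this sketch attempts):

```python
import numpy as np

# Toy shear-like map: pure noise here, just to show the mechanics.
npix, pix_arcmin = 256, 1.0
field = np.random.default_rng(2).normal(size=(npix, npix))

# 2D power spectrum from the FFT of the map.
power2d = np.abs(np.fft.fftshift(np.fft.fft2(field)))**2 / field.size

# Convert pixel frequencies to multipole ell and average in rings.
freq = np.fft.fftshift(np.fft.fftfreq(npix, d=np.radians(pix_arcmin / 60.0)))
kx, ky = np.meshgrid(freq, freq)
ell = 2.0 * np.pi * np.hypot(kx, ky)

bins = np.linspace(0.0, ell.max(), 16)
idx = np.digitize(ell.ravel(), bins)
cl = [power2d.ravel()[idx == i].mean() for i in range(1, len(bins))]
```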
“Real space and Fourier space analyses are sensitive to the information of the matter distribution at different scales, and they have different responses to systematic errors. Doing two independent blinded analyses is an important test to validate the robustness of the cosmology constraint,” said Li. “3x2pt analysis includes observables from BOSS galaxy density distribution, which provides independent information to the measurement.”
Much of the analysis of the HSC data relied on methods developed by Tianqing Zhang, a physics graduate student at Carnegie Mellon. One is a statistically principled method to propagate the uncertainty in the redshift (or distance) measurements of the HSC galaxies. The other establishes the impact of the point spread function, which describes the combined effect of atmospheric turbulence, telescope optics, and the detector on weak lensing observations.
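The article does not spell out the redshift method, but a common baseline for propagating redshift uncertainty, sketched below under that assumption, is to let the whole source redshift distribution n(z) shift by a nuisance parameter and marginalize over it; Zhang’s statistically principled treatment is more sophisticated than this.

```python
import numpy as np

z = np.linspace(0.0, 3.0, 300)
n_z = z**2 * np.exp(-((z / 0.8) ** 1.5))  # toy source redshift distribution
n_z /= np.trapz(n_z, z)                   # normalize to unit area

def shifted_nz(delta_z):
    """n(z) with the whole distribution shifted by the nuisance parameter."""
    return np.interp(z - delta_z, z, n_z, left=0.0, right=0.0)

# In a likelihood analysis, delta_z is given a prior and sampled alongside
# the cosmological parameters, so redshift uncertainty widens the final
# S8 error bar instead of silently biasing it.
```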
“Our work is the ‘last line of defense’ to shield the cosmological results from the impact of the variables included in the point spread function,” said Zhang. “Although the HSC enjoys some of the best atmospheric conditions on planet Earth and is equipped with a state-of-the-art optical and detector system, this problem is still a big challenge.”
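To see why the point spread function matters so much, consider a toy illustration: blurring an elongated galaxy with a round PSF makes it look rounder, which would masquerade as a change in shear if left uncorrected. (Toy second-moment shapes; real pipelines are far more careful.)

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def e1(img):
    """First ellipticity component from the second moments of an image."""
    y, x = np.indices(img.shape, dtype=float)
    w = img / img.sum()
    xc, yc = (x * w).sum(), (y * w).sum()
    qxx = ((x - xc) ** 2 * w).sum()
    qyy = ((y - yc) ** 2 * w).sum()
    return (qxx - qyy) / (qxx + qyy)

y, x = np.indices((64, 64)) - 31.5
galaxy = np.exp(-(x**2 / (2 * 4.0**2) + y**2 / (2 * 2.0**2)))  # elongated
blurred = gaussian_filter(galaxy, sigma=3.0)  # round PSF, e.g. seeing

# Blurring dilutes the measured ellipticity (0.60 -> ~0.32 here), so
# lensing pipelines must model the PSF and undo its effect.
print(f"true e1 = {e1(galaxy):.2f}, observed e1 = {e1(blurred):.2f}")
```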