PhD Defense: “Towards an optimal analysis of the large-scale structure to understand the nature of dark energy” by Martin Karcher. The defense will take place in English.
Abstract:
At the end of the 20th century, the observation of the accelerated expansion of the Universe marked a pivotal point in our understanding of the cosmos. With the observational evidence for dark energy in the form of a cosmological constant, a treasure hunt began to unravel its true nature. In this work, I studied how to optimally analyse the large-scale distribution of galaxies in the Universe, governed by the underlying theory of gravity in the context of the cosmological model, in order to shed light on dark energy and elucidate its elusive nature.

In preparation for the precise measurements of the spatial distribution of galaxies with the Euclid satellite, I analysed several state-of-the-art models of the galaxy two-point correlation function (2PCF) in redshift space, using a dedicated set of large simulations and metrics sensitive to both the accuracy and the precision of the recovered cosmological parameters. The analysis was carried out with both the template-fitting and full-shape fitting approaches. In both cases, the purely perturbative model based on the effective field theory (EFT) formalism is not able to account for the very small scales at a redshift of z ≈ 1. Instead, models that describe the effect of non-coherent motions of galaxies within overdensities non-perturbatively prove superior. I also found that the Gaussian streaming model, augmented with effective counterterms, reaches a similarly good performance. Given the statistical precision expected from Euclid, of up to a few percent on the main cosmological parameters, I demonstrated that two of the models maintain an accuracy below this level.

I further investigated extended statistics that are more sensitive to potential modified gravity (MG) signatures, as they incorporate information beyond the standard 2PCF. In particular, I studied marked correlation functions, which weight the objects entering the 2PCF by a mark function. The screening mechanism in MG, which is necessary to recover general relativity (GR) on astrophysical scales, imprints a fundamental environmental dependence that the mark function can exploit. I studied the non-trivial propagation of discreteness effects into the estimation of weighted statistics, and I designed and implemented a methodology that accurately recovers the true signal, as validated on high-density simulations. I proposed a new mark whose efficiency relies on introducing anti-correlation between high- and low-density regions, together with marks based on the large-scale environment of galaxies and on the tidal tensor. Overall, I found the most effective mark function to be the one based on the local density that emphasises anti-correlation between high- and low-density regions. It uniquely produces significant deviations between MG and GR on scales up to 80 h^{-1}Mpc, both in real and redshift space. This makes marked correlation functions with optimal marks very promising for detecting MG features in the large-scale structure, even in the case of weak modifications to GR.
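As a brief illustration of the weighted statistic described above, a marked correlation function is commonly defined as the ratio of mark-weighted to unweighted pair counts; the sketch below uses standard notation (marks m_i with mean mark \bar{m}, number of pairs n_p(r) at separation r, weighted correlation W, unweighted correlation \xi) rather than the specific conventions of the thesis.

% Marked correlation function M(r): each pair (i, j) at separation r is
% weighted by the product of its marks, normalised by the squared mean mark
% and the number of pairs; equivalently, it is the ratio of the weighted (W)
% and unweighted (\xi) two-point correlation functions.
\mathcal{M}(r) \;=\; \frac{1}{n_p(r)\,\bar{m}^{2}} \sum_{\substack{i \neq j \\ |\mathbf{r}_i - \mathbf{r}_j| \approx r}} m_i\, m_j \;=\; \frac{1 + W(r)}{1 + \xi(r)}

With a density-dependent mark, deviations of \mathcal{M}(r) from unity trace the environmental dependence imprinted by screening, which is the property exploited in the thesis to separate MG from GR.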
Jury:
Guilaine Lagache (LAM) – President
Enrique Gaztañaga (University of Portsmouth) – Reviewer
Yann Rasera (Université Paris Cité) – Reviewer
Sandrine Codis (AIM, Paris) – Examiner
Martin White (UC Berkeley) – Examiner
Supervisors:
Stéphane Arnouts (LAM) – Thesis director
Julien Bel (CPT) – Co-director
Sylvain de la Torre (LAM) – Co-supervisor