Optimizing Large-Scale Structure Data Analysis With The Theoretical Error Likelihood

Physical Review D (2021)

Citations: 29 | Views: 3
Abstract
An important aspect of large-scale structure data analysis is the presence of non-negligible theoretical uncertainties, which become increasingly important on small scales. We show how to incorporate these uncertainties in realistic power spectrum likelihoods through an appropriate change of the fitting model and the covariance matrix. The inclusion of the theoretical error has several advantages over the standard practice of using a sharp momentum cut k_max. First, the theoretical error covariance gradually suppresses the information from short scales as the employed theoretical model becomes less reliable. This allows one to avoid laborious measurements of k_max, which are an essential part of standard analyses. Second, the theoretical error likelihood gives unbiased constraints with reliable error bars that are not artificially shrunk by overfitting. In realistic settings, the theoretical error likelihood yields essentially the same parameter constraints as the standard analysis with an appropriately selected k_max, thereby effectively optimizing the choice of k_max. We demonstrate these points using large-volume N-body data for the clustering of matter and galaxies in real and redshift space. In passing, we validate the effective field theory description of redshift-space distortions and show that the use of the one-parameter phenomenological Gaussian damping model for fingers-of-God causes significant biases in parameter recovery.
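The core construction can be illustrated with a short sketch. The snippet below is a minimal toy example, not the paper's actual pipeline: it builds a Gaussian power spectrum likelihood in which a correlated theoretical-error covariance is added to the data covariance, so that small-scale bins are gradually down-weighted instead of being cut at a sharp k_max. The envelope parameters (E0, k_NL, n), the correlation length Delta_k, all function names, and the mock inputs are illustrative assumptions.

```python
# Minimal sketch of a theoretical-error likelihood for a power spectrum,
# assuming the error is a Gaussian random field with an envelope E(k) that
# grows on small scales and a finite correlation length Delta_k.
# All parameter values below are illustrative, not the paper's calibration.
import numpy as np

def theory_error_covariance(k, E0=0.01, k_NL=0.45, n=3.0, Delta_k=0.1):
    """Correlated theoretical-error covariance:
    C^e_ij = E(k_i) E(k_j) exp(-(k_i - k_j)^2 / (2 Delta_k^2)).
    """
    E = E0 * (k / k_NL) ** n                 # envelope: grows toward small scales
    dk = k[:, None] - k[None, :]
    rho = np.exp(-dk**2 / (2.0 * Delta_k**2))  # smooth correlation in k
    return np.outer(E, E) * rho

def log_likelihood(data, model, k, C_data):
    """Gaussian log-likelihood with total covariance C_data + C_theory.
    Adding C_theory softly suppresses information from bins where the
    theoretical model is unreliable, replacing a sharp k_max cut.
    """
    C_tot = C_data + theory_error_covariance(k)
    r = data - model
    chi2 = r @ np.linalg.solve(C_tot, r)
    _, logdet = np.linalg.slogdet(C_tot)
    return -0.5 * (chi2 + logdet)

# Toy usage with mock inputs (purely for demonstration)
k = np.linspace(0.01, 0.5, 50)
C_data = np.diag((0.02 * (1.0 + k)) ** 2)    # toy diagonal data covariance
data = 1.0 / (1.0 + k**2)                    # toy "measured" spectrum
model = data + 0.005 * k**3                  # toy theory with small-scale error
print(log_likelihood(data, model, k, C_data))
```

Because the envelope E(k) rises with k, the added covariance dominates the data covariance on small scales, which is what makes the resulting constraints insensitive to the exact choice of the fitting range.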
Keywords
large-scale structure, theoretical error likelihood, data