Noise cleaning the precision matrix of short time series.

Physical Review E (2023)

Abstract
We present a comparison of algorithms for inferring covariance and precision matrices from small data sets of real vectors, with the typical length and dimension of human brain activity time series retrieved by functional magnetic resonance imaging (fMRI). Assuming a Gaussian model underlying the neural activity, the problem consists of denoising the empirically observed matrices to obtain better estimators of the (unknown) true precision and covariance matrices. We consider several standard noise-cleaning algorithms and compare them on two types of data sets. The first type consists of synthetic time series sampled from a generative Gaussian model, in which we vary the ratio q of dimensions to samples and the strength of off-diagonal correlations. The second type consists of time series of fMRI brain activity of human subjects at rest. The reliability of each algorithm is assessed in terms of test-set likelihood and, in the case of synthetic data, of the distance from the true precision matrix. We observe that the so-called optimal rotationally invariant estimator, based on random matrix theory, leads to a significantly lower distance from the true precision matrix in synthetic data and to a higher test likelihood in natural fMRI data. We propose a variant of the optimal rotationally invariant estimator in which one of its parameters is optimized by cross-validation; in the severe undersampling regime (large q) typical of fMRI series, it outperforms all the other estimators. We furthermore propose a simple algorithm based on iterative likelihood gradient ascent, which yields very accurate estimates on weakly correlated synthetic data sets.
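A minimal sketch of the evaluation protocol described above, not the paper's exact pipeline: short zero-mean Gaussian time series are sampled from a known precision matrix, a covariance/precision estimate is obtained from a training set with a standard noise-cleaning estimator (Ledoit-Wolf linear shrinkage is used here only as a stand-in for the rotationally invariant estimators studied in the paper), and the estimate is scored by test-set log-likelihood and by its distance from the true precision matrix. All dimensions and the construction of the "true" precision matrix are assumptions made for illustration.

```python
import numpy as np
from sklearn.covariance import LedoitWolf, EmpiricalCovariance

rng = np.random.default_rng(0)

# Dimensions chosen to mimic the undersampled regime (q = n / t not small).
n, t_train, t_test = 100, 150, 150

# Assumed ground truth: a symmetric positive-definite precision matrix J.
A = rng.normal(scale=0.1, size=(n, n))
J_true = np.eye(n) + A @ A.T
C_true = np.linalg.inv(J_true)          # true covariance

# Sample training and test time series (rows = time points).
L = np.linalg.cholesky(C_true)
X_train = rng.normal(size=(t_train, n)) @ L.T
X_test = rng.normal(size=(t_test, n)) @ L.T

def test_log_likelihood(precision, X):
    """Average Gaussian log-likelihood of X under a zero-mean model."""
    _, logdet = np.linalg.slogdet(precision)
    quad = np.einsum('ti,ij,tj->t', X, precision, X).mean()
    return 0.5 * (logdet - quad - X.shape[1] * np.log(2 * np.pi))

for name, est in [("empirical", EmpiricalCovariance(assume_centered=True)),
                  ("Ledoit-Wolf", LedoitWolf(assume_centered=True))]:
    est.fit(X_train)
    J_hat = est.precision_
    dist = np.linalg.norm(J_hat - J_true) / np.linalg.norm(J_true)
    print(f"{name:12s}  test log-lik = {test_log_likelihood(J_hat, X_test):8.3f}"
          f"  rel. distance to true precision = {dist:.3f}")
```

In this sketch the noise-cleaned estimator should give both a higher test-set likelihood and a smaller distance from the true precision matrix than the plain empirical estimator, which is the comparison criterion used in the abstract.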
Keywords
precision matrix, noise