A Bayesian Bootstrap for Mixture Models
Bayesian Analysis (2024)
Department of Statistics and Data Sciences, The University of Texas at Austin
Abstract
This paper proposes a new nonparametric Bayesian bootstrap for mixture models, built by extending the traditional Bayesian bootstrap. We first reinterpret the Bayesian bootstrap, which uses the Pólya-urn scheme, as a gradient ascent algorithm with an associated one-step solver. The key is then to use the same basic mechanism as the Bayesian bootstrap, switching from a point-mass kernel to a continuous kernel. Just as the Bayesian bootstrap works solely from the empirical distribution function, the new Bayesian bootstrap for mixture models works off the nonparametric maximum likelihood estimator of the mixing distribution. From a theoretical perspective, we prove the convergence and exchangeability of the sample sequences generated by the algorithm, and we illustrate our results with different models and settings and with some real data.
Key words
Mixture Models, Nonparametric Bayesian, Gaussian Mixture Models, Bayesian Methods, Bayesian Inference
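
The paper's mixture-model algorithm is not reproduced here, but the traditional Bayesian bootstrap it builds on can be sketched in a few lines: each posterior draw reweights the observed data with Dirichlet(1, ..., 1) weights, which is the limiting form of the Pólya-urn scheme mentioned in the abstract. The function name, toy data, and the choice of the mean as the target functional below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def bayesian_bootstrap_means(x, n_draws=1000, rng=rng):
    """Classical Bayesian bootstrap (Rubin, 1981): each posterior draw
    reweights the observed data with Dirichlet(1, ..., 1) weights.
    The mean is used as the target functional; any functional of the
    weighted empirical distribution could be substituted."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # One set of Dirichlet(1, ..., 1) weights per posterior draw, shape (n_draws, n)
    w = rng.dirichlet(np.ones(n), size=n_draws)
    # Weighted means, one per draw: posterior sample for the mean functional
    return w @ x

# Toy usage: posterior uncertainty about the mean of a small simulated sample
x = rng.normal(loc=2.0, scale=1.0, size=50)
draws = bayesian_bootstrap_means(x, n_draws=2000)
print(draws.mean(), np.quantile(draws, [0.025, 0.975]))
```

The paper's contribution, as described in the abstract, replaces the point-mass kernel implicit in this scheme with a continuous kernel and works off the nonparametric maximum likelihood estimator of the mixing distribution rather than the empirical distribution function.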