Robust Bayesian non-parametric dictionary learning with heterogeneous Gaussian noise.

Computer Vision and Image Understanding (2016)

Highlights
- A robust Bayesian dictionary learning framework incorporating the Student's t prior is proposed.
- The framework preserves the flexibility of the non-parametric model.
- A hybrid inference algorithm handles the conjugate and non-conjugate parts of the model.
- Improved performance over the previous BNP model and the prevailing parametric methods when denoising heterogeneous noise.

Abstract
Bayesian non-parametric dictionary learning has become popular in computer vision applications because it can determine the dictionary size automatically. A common assumption of this modelling approach is to place Gaussian priors on both the dictionary matrix and the weighting matrix. Although this simple treatment has a number of merits, such as conjugate priors and easy inference, it may not reflect reality, since a digital image may contain heterogeneous noise. In this paper, we consider a general noise model for Bayesian non-parametric dictionary learning that can adapt to images with heterogeneous Gaussian noise. To this end, we adopt Student's t distributions as priors on the heterogeneous noise for both the dictionary matrix and the weighting matrix. As an infinite Gaussian scale mixture, the Student's t distribution not only retains properties similar to the Gaussian but also tolerates different scales of noise. We propose an approximate inference algorithm, combining Gibbs sampling and empirical Bayes, to estimate the posterior distribution of the parameters. The experimental results show that the proposed model clearly outperforms its counterpart with Gaussian priors, as well as the prevailing parametric methods, in denoising images with heterogeneous noise.
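The modelling ingredient the abstract relies on is the Gaussian scale-mixture view of the Student's t distribution: a t-distributed variable can be generated by first drawing a per-element precision scale from a Gamma distribution and then sampling a Gaussian with that rescaled precision, which is what lets the prior tolerate different noise levels across image regions. Below is a minimal sketch of this construction only; the parameterisation (mean mu, precision lam, degrees of freedom nu) and the NumPy-based sampler are illustrative assumptions, not the paper's code.

import numpy as np

def sample_student_t(mu, lam, nu, size, rng=None):
    """Draw Student's t samples via the Gaussian scale-mixture construction.

    Assumed parameterisation (illustrative): mean mu, precision lam,
    degrees of freedom nu. A mixing variable eta ~ Gamma(nu/2, rate=nu/2)
    rescales the Gaussian precision per element, giving heavier tails than
    a single Gaussian and hence tolerance to heterogeneous noise scales.
    """
    rng = rng or np.random.default_rng()
    # Gamma mixing scales (NumPy uses the shape/scale parameterisation).
    eta = rng.gamma(shape=nu / 2.0, scale=2.0 / nu, size=size)
    # Conditional Gaussian with precision lam * eta, i.e. std = 1/sqrt(lam*eta).
    return rng.normal(loc=mu, scale=1.0 / np.sqrt(lam * eta))

# Sanity check: a small nu yields noticeably heavier tails than N(0, 1).
samples = sample_student_t(mu=0.0, lam=1.0, nu=3.0, size=100_000)
print(np.mean(np.abs(samples) > 3.0))

In the paper's setting this heavy-tailed prior is placed on the noise associated with both the dictionary and the weighting matrices, while the per-element Gamma scales are the non-conjugate part that the hybrid Gibbs/empirical-Bayes scheme has to handle.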
Keywords
Heterogeneous noise, Bayesian nonparametric, Dictionary learning