Construction of Dependent Dirichlet Processes based on Poisson Processes.

NIPS, pp. 1396-1404, 2010

Cited by: 102 | Views: 194

Abstract

We present a novel method for constructing dependent Dirichlet processes. The approach exploits the intrinsic relationship between Dirichlet and Poisson processes in order to create a Markov chain of Dirichlet processes suitable for use as a prior over evolving mixture models. The method allows for the creation, removal, and location variation of component models.

Introduction
  • As the cornerstone of Bayesian nonparametric modeling, Dirichlet processes (DP) [22] have been applied to a wide variety of inference and estimation problems [3, 10, 20] with Dirichlet process mixtures (DPMs) [15, 17] being one of the most successful.
  • The traditional DPM model assumes that each sample is generated independently from the same DP.
  • This assumption is limiting when samples come from multiple, mutually dependent DPs. HDPs [23] partially address this modeling aspect by providing a way to construct multiple DPs that implicitly depend on each other via a common parent.
  • Their hierarchical structure, however, may not be appropriate for some problems.
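The DP mixture prior discussed above can be illustrated with Sethuraman's stick-breaking construction [19]. The sketch below is a minimal, truncated illustration; the function names, truncation level, and parameter values are assumptions for demonstration, not code from the paper.

```python
import numpy as np

def stick_breaking_dp(alpha, base_sampler, n_atoms=100, seed=None):
    """Truncated draw G ~ DP(alpha, H) via stick-breaking [19]:
    pi_k = beta_k * prod_{j<k}(1 - beta_j), with beta_k ~ Beta(1, alpha)
    and atom locations drawn i.i.d. from the base measure H."""
    rng = np.random.default_rng(seed)
    betas = rng.beta(1.0, alpha, size=n_atoms)
    # Mass remaining on the stick before each break
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas)[:-1]))
    weights = betas * remaining
    atoms = np.array([base_sampler(rng) for _ in range(n_atoms)])
    return atoms, weights

# A DPM then draws each observation's mean from G; here H = N(0, 1).
atoms, weights = stick_breaking_dp(alpha=2.0, base_sampler=lambda r: r.normal(), seed=0)
rng = np.random.default_rng(1)
obs_means = rng.choice(atoms, size=5, p=weights / weights.sum())
```

Because the beta proportions concentrate mass on early atoms, a modest truncation level captures nearly all of the probability for moderate alpha.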
Highlights
  • Motivated by the relationship between Poisson and Dirichlet processes, we develop a new approach for constructing dependent Dirichlet processes (DDPs)
  • We develop a Gibbs sampling procedure based on the construction of DDPs introduced above
  • We developed a principled framework for constructing dependent Dirichlet processes
  • In contrast to most DP-based approaches, our construction is motivated by the intrinsic relation between Dirichlet processes and compound Poisson processes
  • The simulations on synthetic data and the experiments on modeling people flows and paper topics clearly demonstrate that the proposed method is effective in estimating mixture models that evolve over time
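The Gibbs sampling procedure mentioned above builds on standard collapsed sampling for DP mixtures [15]. As a point of reference, here is a minimal collapsed Gibbs sweep for a *static* 1-D Gaussian DPM under the Chinese restaurant process representation; all names and parameter values are illustrative assumptions, and the paper's actual sampler additionally handles the dependent, time-evolving case.

```python
import numpy as np

def crp_gibbs_sweep(x, z, alpha, tau2=1.0, sigma2=0.25, seed=None):
    """One collapsed Gibbs sweep for a static DPM of 1-D Gaussians:
    component means have a conjugate N(0, tau2) prior; observation
    variance sigma2 is known. Each point is reassigned given all others,
    with probability proportional to n_k (existing) or alpha (new)."""
    rng = np.random.default_rng(seed)
    z = z.copy()
    for i in range(len(x)):
        z[i] = -1                                  # remove point i
        labels, counts = np.unique(z[z >= 0], return_counts=True)
        logp = []
        for lab, n_k in zip(labels, counts):       # existing components
            mem = x[z == lab]
            post_var = 1.0 / (1.0 / tau2 + n_k / sigma2)
            post_mean = post_var * mem.sum() / sigma2
            var = post_var + sigma2                # posterior predictive variance
            logp.append(np.log(n_k) - 0.5 * np.log(var)
                        - 0.5 * (x[i] - post_mean) ** 2 / var)
        var0 = tau2 + sigma2                       # brand-new component
        logp.append(np.log(alpha) - 0.5 * np.log(var0) - 0.5 * x[i] ** 2 / var0)
        logp = np.asarray(logp)
        p = np.exp(logp - logp.max())
        p /= p.sum()
        k = rng.choice(len(p), p=p)
        z[i] = labels[k] if k < len(labels) else z.max() + 1
    return z

# Smoke test: two well-separated clusters, all points initially in one component.
rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(-3.0, 0.3, 10), rng.normal(3.0, 0.3, 10)])
z1 = crp_gibbs_sweep(x, np.zeros(20, dtype=int), alpha=1.0, seed=3)
```

The conjugate prior lets the component means be integrated out, so the sampler only tracks assignments, which is what makes the sweep "collapsed".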
Results
  • The authors present experimental results on both synthetic and real data.
  • The authors compare the method with dynamic FMM in modeling mixtures of Gaussians whose number and centers evolve over time.
  • 6.1 Simulations on Synthetic Data.
  • The data for simulations were synthesized as follows.
  • The authors initialized the model with two Gaussian components, and added new components following a temporal Poisson process (about one per 20 phases).
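The synthetic-data setup can be paraphrased in code. The generator below follows the description (two initial Gaussian components, Poisson births at roughly one per 20 phases, diffusing centers), but every numeric value other than those two is an assumed placeholder, since the excerpt does not give the exact settings.

```python
import numpy as np

def synthesize_evolving_mixture(n_phases=60, birth_rate=1 / 20.0,
                                n_per_phase=50, diffusion_var=0.01, seed=0):
    """Synthetic data in the spirit of Sec. 6.1: start with two Gaussian
    components; new components arrive as a temporal Poisson process;
    existing centers diffuse via a Gaussian random walk."""
    rng = np.random.default_rng(seed)
    centers = [rng.normal(0.0, 3.0, size=2) for _ in range(2)]  # two initial components
    data = []
    for _ in range(n_phases):
        # Poisson births: birth_rate expected new components per phase
        for _ in range(rng.poisson(birth_rate)):
            centers.append(rng.normal(0.0, 3.0, size=2))
        # Diffuse every existing center
        centers = [c + rng.normal(0.0, np.sqrt(diffusion_var), size=2)
                   for c in centers]
        # Draw this phase's observations uniformly across current components
        ks = rng.integers(len(centers), size=n_per_phase)
        data.append(np.array([rng.normal(centers[k], 0.3) for k in ks]))
    return data, centers

data, centers = synthesize_evolving_mixture()
```

Varying `diffusion_var` controls how fast component centers drift between phases, which is the quantity the simulation study appears to sweep.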
Conclusion
  • The authors developed a principled framework for constructing dependent Dirichlet processes.
  • In contrast to most DP-based approaches, the construction is motivated by the intrinsic relation between Dirichlet processes and compound Poisson processes.
  • The authors discussed three operations: superposition, subsampling, and point transition, which produce DPs depending on others.
  • The authors further combined these operations to derive a Markov chain of DPs, leading to a prior of mixture models that allows creation, removal, and location variation of component models under a unified formulation.
  • The simulations on synthetic data and the experiments on modeling people flows and paper topics clearly demonstrate that the proposed method is effective in estimating mixture models that evolve over time
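The three operations above can be sketched on a finite atom representation of a DP draw (locations plus weights). These functions are illustrative reconstructions from the bullet descriptions, not the paper's exact definitions; in particular the mixing weight `c` in superposition and the Gaussian transition kernel are assumptions.

```python
import numpy as np

def superpose(atoms1, w1, atoms2, w2, c=0.5):
    """Superposition: combine two random measures (c mixes their masses)."""
    return np.concatenate([atoms1, atoms2]), np.concatenate([c * w1, (1 - c) * w2])

def subsample(atoms, w, q, rng):
    """Subsampling: keep each atom independently with probability q, renormalize."""
    keep = rng.random(len(atoms)) < q
    kept = w[keep]
    return atoms[keep], kept / kept.sum()

def point_transition(atoms, w, step_var, rng):
    """Point transition: move each atom via a Markov kernel (Gaussian walk here)."""
    return atoms + rng.normal(0.0, np.sqrt(step_var), size=atoms.shape), w

# Chaining the operations yields the next measure in a Markov chain of DPs:
# superposition creates components, subsampling removes them, and point
# transition varies their locations.
rng = np.random.default_rng(0)
a, w = superpose(np.array([0.0, 1.0]), np.array([0.5, 0.5]),
                 np.array([2.0]), np.array([1.0]))
a, w = subsample(a, w, q=0.9, rng=rng)
a, w = point_transition(a, w, step_var=0.01, rng=rng)
```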
References
  • [1] A. Ahmed and E. Xing. Dynamic Non-Parametric Mixture Models and The Recurrent Chinese Restaurant Process: with Applications to Evolutionary Clustering. In Proc. of SDM'08, 2008.
  • [2] F. R. Bach and M. I. Jordan. Learning Spectral Clustering. In Proc. of NIPS'03, 2003.
  • [3] J. Boyd-Graber and D. M. Blei. Syntactic Topic Models. In Proc. of NIPS'08, 2008.
  • [4] F. Caron, M. Davy, and A. Doucet. Generalized Polya Urn for Time-varying Dirichlet Process Mixtures. In Proc. of UAI'07, 2007.
  • [5] Y. Chung and D. B. Dunson. The Local Dirichlet Process. Annals of the Institute of Statistical Mathematics, 2009.
  • [7] J. E. Griffin and M. F. J. Steel. Order-Based Dependent Dirichlet Processes. Journal of the American Statistical Association, 101(473):179–194, March 2006.
  • [8] J. E. Griffin and M. F. J. Steel. Time-Dependent Stick-Breaking Processes. Technical report, 2009.
  • [9] J. F. C. Kingman. Poisson Processes. Oxford University Press, 1993.
  • [10] J. J. Kivinen, E. B. Sudderth, and M. I. Jordan. Learning Multiscale Representations of Natural Scenes Using Dirichlet Processes. In Proc. of ICCV'07, 2007.
  • [11] D. Lin, E. Grimson, and J. Fisher. Learning Visual Flows: A Lie Algebraic Approach. In Proc. of
  • [12] S. N. MacEachern. Dependent Nonparametric Processes. In Proceedings of the Section on Bayesian Statistical Science, 1999.
  • [13] M. Meila. Comparing Clusterings - An Axiomatic View. In Proc. of ICML'05, 2005.
  • [14] P. Muller, F. Quintana, and G. Rosner. A Method for Combining Inference across Related Nonparametric Bayesian Models. J. R. Statist. Soc. B, 66(3):735–749, August 2004.
  • [15] R. M. Neal. Markov Chain Sampling Methods for Dirichlet Process Mixture Models. Journal of Computational and Graphical Statistics, 9(2):249–265, 2000.
  • [16] V. Rao and Y. W. Teh. Spatial Normalized Gamma Processes. In Proc. of NIPS'09, 2009.
  • [17] C. E. Rasmussen. The Infinite Gaussian Mixture Model. In Proc. of NIPS'00, 2000.
  • [18] L. Ren, D. B. Dunson, and L. Carin. The Dynamic Hierarchical Dirichlet Process. In Proc. of ICML'08, 2008.
  • [19] J. Sethuraman. A Constructive Definition of Dirichlet Priors. Statistica Sinica, 4(2):639–650, 1994.
  • [20] K.-A. Sohn and E. Xing. Hidden Markov Dirichlet Process: Modeling Genetic Recombination in Open Ancestral Space. In Proc. of NIPS'07, 2007.
  • [21] N. Srebro and S. Roweis. Time-Varying Topic Models using Dependent Dirichlet Processes, 2005.
  • [22] Y. W. Teh. Dirichlet Process, 2007.
  • [23] Y. W. Teh, M. I. Jordan, M. J. Beal, and D. M. Blei. Hierarchical Dirichlet Processes. Journal of the American Statistical Association, 101(476):1566–1581, 2006.
  • [24] X. Zhu and J. Lafferty. Time-Sensitive Dirichlet Process Mixture Models, 2005.