High-Performance Semi-Supervised Learning using Discriminatively Constrained Generative Models
ICML (2010)
Abstract
We develop a semi-supervised learning method that constrains the posterior distribution of latent variables under a generative model to satisfy a rich set of feature expectation constraints estimated with labeled data. This approach encourages the generative model to discover latent structure that is relevant to a prediction task. We estimate parameters with a coordinate ascent algorithm, one step of which involves training a discriminative log-linear model with an embedded generative model. This hybrid model can be used for test time prediction. Unlike other high-performance semi-supervised methods, the proposed algorithm converges to a stationary point of a single objective function, and affords additional flexibility, for example to use different latent and output spaces. We conduct experiments on three sequence labeling tasks, achieving the best reported results on two of them, and showing promising results on CoNLL03 NER.
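The central operation the abstract describes is constraining a model posterior to satisfy feature expectation constraints estimated from labeled data. A minimal, hypothetical sketch of that idea (not the authors' code; the toy posterior, feature, and learning-rate choices here are illustrative assumptions) is the KL projection q(z) ∝ p(z|x) exp(λ·f(z)), with the multiplier λ fit by dual gradient ascent so that E_q[f] matches a target expectation:

```python
# Hypothetical illustration: KL-project a posterior p(z|x) onto the set
# of distributions whose expected feature value equals a target estimated
# from labeled data. Not the paper's implementation.
import numpy as np

def project_posterior(p, f, target, lr=0.5, iters=500):
    """Return q(z) proportional to p(z) * exp(lam * f(z)) with E_q[f] ~ target.

    p      : model posterior over K latent values, shape (K,)
    f      : feature value for each latent value, shape (K,)
    target : desired expectation of f (e.g. measured on labeled data)
    """
    lam = 0.0
    for _ in range(iters):
        q = p * np.exp(lam * f)
        q /= q.sum()
        # Gradient of the (concave) dual objective lam*target - log Z(lam).
        lam += lr * (target - q @ f)
    return q, lam

# Toy posterior over 3 latent states; a binary feature fires on states 1, 2.
p = np.array([0.7, 0.2, 0.1])
f = np.array([0.0, 1.0, 1.0])
q, lam = project_posterior(p, f, target=0.6)
# q now puts ~0.6 total mass on states 1 and 2 while staying as close
# as possible (in KL divergence) to p.
```

In the paper's coordinate ascent, a step like this would alternate with re-estimating the generative model's parameters; the sketch shows only the constraint-projection half on a single discrete posterior.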
Keywords
semi-supervised learning, log-linear model, satisfiability, latent variable, objective function