Bayesian Active Learning With Non-Persistent Noise

IEEE Transactions on Information Theory (2015)

Abstract
We consider the problem of noisy Bayesian active learning, where we are given a finite set of functions H, a sample space X, and a label set L. One of the functions in H assigns labels to samples in X. The goal is to identify the function that generates the labels, even though the result of a label query on a sample is corrupted by independent noise. More precisely, the objective is to declare one of the functions in H as the true label-generating function with high confidence, using as few label queries as possible, by selecting the queries adaptively and in a strategic manner. Previous work in Bayesian active learning considers generalized binary search and its variants for the noisy case, and analyzes the number of queries required by these sampling strategies. In this paper, we show that these schemes are, in general, suboptimal. Instead, we propose and analyze an alternative strategy for sample collection. Our sampling strategy is motivated by a connection between Bayesian active learning and active hypothesis testing, and is based on querying the label of the sample that maximizes the extrinsic Jensen–Shannon divergence at each step. We provide upper and lower bounds on the performance of this sampling strategy, and show that these bounds are better than the previous bounds in the literature.
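As a concrete illustration of the query rule described above, the following Python sketch selects, at each step, the sample whose label query maximizes the extrinsic Jensen–Shannon (EJS) divergence of the current posterior, and then performs a Bayes update after observing the noisy label. This is a minimal sketch, not the paper's implementation: it assumes binary labels observed through a binary symmetric channel with known crossover probability eps, and the names (labels, select_query, update) are illustrative.

```python
# Minimal sketch (not the authors' code) of EJS-based query selection for
# noisy Bayesian active learning. Assumptions: binary labels, a binary
# symmetric noise channel with crossover probability eps, and a finite
# hypothesis set given as a |H| x |X| 0/1 matrix `labels`, where
# labels[i, x] is the label hypothesis i assigns to sample x.
import numpy as np

def kl(p, q):
    """KL divergence between two Bernoulli distributions with parameters p, q."""
    p, q = np.clip(p, 1e-12, 1 - 1e-12), np.clip(q, 1e-12, 1 - 1e-12)
    return p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q))

def ejs(posterior, labels, x, eps):
    """EJS divergence of querying sample x: the posterior-weighted KL between
    each hypothesis's predicted label distribution and the mixture of the
    label distributions predicted by the remaining hypotheses."""
    # P(observed label = 1 | hypothesis i, query x) under symmetric noise eps
    p1 = np.where(labels[:, x] == 1, 1 - eps, eps)
    total = float(np.dot(posterior, p1))
    val = 0.0
    for i, rho_i in enumerate(posterior):
        if rho_i in (0.0, 1.0):
            continue  # zero-mass terms contribute nothing; rho_i = 1 is degenerate
        mix = (total - rho_i * p1[i]) / (1.0 - rho_i)  # mixture of the others
        val += rho_i * kl(p1[i], mix)
    return val

def select_query(posterior, labels, eps):
    """Pick the sample whose label query maximizes the EJS divergence."""
    return max(range(labels.shape[1]),
               key=lambda x: ejs(posterior, labels, x, eps))

def update(posterior, labels, x, y, eps):
    """Bayes update of the posterior after observing noisy label y at sample x."""
    like = np.where(labels[:, x] == y, 1 - eps, eps)
    post = posterior * like
    return post / post.sum()
```

A natural stopping rule, consistent with the "high confidence" objective above, is to keep querying until some hypothesis's posterior mass exceeds 1 − δ and then declare that hypothesis as the label-generating function.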
Keywords
bayesian active learning, extrinsic jensen–shannon divergence, generalized binary search, hypothesis testing, upper bound, sample space, jensen–shannon divergence, testing, noise measurement, learning (artificial intelligence), noise, finite set