Sample Complexity of Model-Based Search.

COLT '98: Proceedings of the Eleventh Annual Conference on Computational Learning Theory (1998)

Abstract
We consider the problem of searching a domain for points that have a desired property, in the special case where the objective function that determines the properties of points is unknown and must be learned during search. We give a parallel to PAC learning theory that is appropriate for reasoning about the sample complexity of this problem. The learner queries the true objective function at selected points, and uses this information to choose models of the objective function from a given hypothesis class that is known to contain a correct model. These models are used to focus the search on more promising areas of the domain. The goal is to find a point with the desired property in a small number of queries. We define an analog to VC dimension, needle dimension, to be the size of the largest sample in which any single point could have the desired property without the other points' values revealing this information. We give an upper bound on sample complexity that is linear in needle dimension for a natural type of search protocol and a linear lower bound for a class of constrained problems. We also describe the relationship between needle dimension and VC dimension, explore connections between model-based search and active concept learning (including several novel positive results in active learning), and consider a scale-sensitive version of needle dimension. Several simple examples illustrate the dependence of needle dimension on features of search problems.
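The query-and-eliminate protocol the abstract describes can be illustrated with a toy version-space loop. Everything below is a hypothetical sketch, not the paper's construction: it assumes a finite domain, a finite hypothesis class containing the true objective, and a "desired property" defined as the objective value meeting a threshold.

```python
def model_based_search(domain, hypotheses, true_f, threshold):
    """Query points until one with true_f(x) >= threshold is found.

    Each query eliminates models inconsistent with the observed value,
    and the next query is aimed at a point that some surviving model
    still predicts has the desired property.
    """
    version_space = list(hypotheses)  # models not yet ruled out
    remaining = list(domain)
    queries = 0
    while remaining:
        # Candidate "needles": points some surviving model predicts qualify.
        candidates = [x for x in remaining
                      if any(h(x) >= threshold for h in version_space)]
        if not candidates:
            return None, queries  # no model predicts a qualifying point
        x = candidates[0]
        y = true_f(x)             # query the true objective function
        queries += 1
        if y >= threshold:
            return x, queries     # found a point with the desired property
        # Discard models that disagree with the observation.
        version_space = [h for h in version_space if h(x) == y]
        remaining.remove(x)
    return None, queries
```

With a hypothesis class of indicator-style objectives (one per candidate needle position), the loop rules out one model per unsuccessful query, which loosely mirrors how the needle dimension governs worst-case query counts.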
Keywords
model-based search,sample complexity