Complexity of learning subspace juntas and ICA

Pacific Grove, CA (2013)

Abstract
Inspired by feature selection problems in machine learning and statistics, we study classification problems where the label function depends only on an unknown low-dimensional relevant subspace of the data (we call this a k-subspace junta). Assuming that the relevant subspace is truly independent of the irrelevant subspace, and that the distribution over the irrelevant subspace is Gaussian, we give a polynomial-time algorithm that recovers the relevant subspace and learns the label function, using only a polynomial number of samples. Our main tool is the solution of a tensor optimization problem. In general, finding the global optimum of such a problem is NP-hard, but we avoid this difficulty by using only local optima.
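For intuition, the following is a minimal NumPy sketch of the "local optima suffice" idea, not a reproduction of the paper's actual algorithm: it whitens the data and runs a FastICA-style fixed-point iteration that locally optimizes the empirical excess kurtosis (a degree-4 tensor contrast) over the unit sphere. Since every direction in the independent Gaussian irrelevant subspace has excess kurtosis near zero, a local optimum found this way should lie in (or near) the relevant subspace. The helper names (`whiten`, `kurtosis_direction`) and the choice of the degree-4 contrast are illustrative assumptions.

```python
import numpy as np

def whiten(X):
    """Center the data and whiten it so the empirical covariance is identity."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)
    W = vecs @ np.diag(vals ** -0.5) @ vecs.T   # symmetric whitening matrix
    return Xc @ W, W

def kurtosis_direction(X, iters=50, seed=0):
    """Find one direction at a local optimum of |excess kurtosis| on the sphere.

    Rationale: directions inside the independent Gaussian irrelevant subspace
    have excess kurtosis ~ 0, so local optima of this degree-4 contrast
    concentrate in the relevant subspace.
    """
    rng = np.random.default_rng(seed)
    Z, W = whiten(X)
    n, d = Z.shape
    u = rng.standard_normal(d)
    u /= np.linalg.norm(u)
    for _ in range(iters):
        p = Z @ u
        u_new = (Z.T @ p ** 3) / n - 3.0 * u    # fixed-point step for the kurtosis contrast
        u_new /= np.linalg.norm(u_new)
        if abs(u @ u_new) > 1 - 1e-9:           # converged up to sign
            u = u_new
            break
        u = u_new
    return u, W

# Toy check with a (hypothetical) 1-dimensional relevant subspace:
rng = np.random.default_rng(1)
n, d = 20000, 6
S = rng.choice([-1.0, 1.0], size=(n, 1))          # non-Gaussian relevant coordinate
G = rng.standard_normal((n, d - 1))               # independent Gaussian irrelevant part
Q, _ = np.linalg.qr(rng.standard_normal((d, d)))  # unknown rotation hiding the subspace
X = np.hstack([S, G]) @ Q.T
u, W = kurtosis_direction(X)
v = W @ u                                          # direction back in the original coordinates
v /= np.linalg.norm(v)
print(abs(v @ Q[:, 0]))                            # ~ 1.0: relevant direction recovered
```

In the same spirit, repeating from fresh random starts and deflating already-found directions would recover a k-dimensional relevant subspace, though the paper's guarantees concern its own tensor optimization procedure rather than this sketch.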
Keywords
Gaussian distribution, computational complexity, feature selection, independent component analysis, learning (artificial intelligence), optimisation, polynomials, statistics, tensors, ICA, NP-hard problems, classification problems, feature selection problems, irrelevant subspace, k-subspace junta, label function, learning subspace juntas, low dimensional relevant subspace, machine learning, polynomial-time algorithm, tensor optimization problem