Classification With the Sparse Group Lasso

IEEE Transactions on Signal Processing (2016)

Abstract
Classification with a sparsity constraint on the solution plays a central role in many high-dimensional signal processing applications. In some cases, the features can be grouped together, so that entire subsets of features can be selected or discarded. In many applications, however, this can be too restrictive. In this paper, we are interested in a less restrictive form of structured sparse feature selection: we assume that while features can be grouped according to some notion of similarity, not all features in a group need be selected for the task at hand. The Sparse Group Lasso (SGL) was proposed to solve problems of this form. The main contributions of this paper are a new procedure called the Sparse Overlapping Group (SOG) lasso, an extension of the SGL to overlapping groups, and theoretical sample complexity bounds for the same. We establish model selection error bounds that specialize to many other cases. We experimentally validate our proposed method on both real and toy datasets.
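The SGL penalty described in the abstract combines an elementwise ℓ1 term (within-group sparsity) with groupwise ℓ2 terms (group selection). As a minimal sketch of the idea, the following shows a proximal operator for the non-overlapping SGL penalty λ₁‖x‖₁ + λ₂Σ_g‖x_g‖₂; the function name and parameters are illustrative, and this does not implement the paper's SOGlasso extension to overlapping groups.

```python
import numpy as np

def sgl_prox(v, groups, lam1, lam2, step=1.0):
    """Proximal operator of lam1*||x||_1 + lam2*sum_g ||x_g||_2
    applied to vector v, for non-overlapping groups.
    `groups` is a list of index arrays partitioning v."""
    # Lasso part: elementwise soft-thresholding.
    x = np.sign(v) * np.maximum(np.abs(v) - step * lam1, 0.0)
    # Group lasso part: shrink each group's norm; small groups vanish.
    for g in groups:
        norm = np.linalg.norm(x[g])
        if norm <= step * lam2:
            x[g] = 0.0  # entire group discarded
        else:
            x[g] *= 1.0 - step * lam2 / norm
    return x

v = np.array([3.0, -0.2, 0.1, 0.05])
groups = [np.array([0, 1]), np.array([2, 3])]
print(sgl_prox(v, groups, lam1=0.5, lam2=0.5))
```

Note how the second group is zeroed out entirely (group selection), while within the surviving first group the small coefficient is also zeroed (within-group sparsity), which is the behavior the abstract describes.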
Keywords
Algorithms, compressed sensing, statistical learning, structured sparsity