Gaining Outlier Resistance With Progressive Quantiles: Fast Algorithms and Theoretical Studies

JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION (2022)

Abstract
Outliers occur widely in big-data applications and may severely affect statistical estimation and inference. In this article, a framework of outlier-resistant estimation is introduced to robustify an arbitrarily given loss function. It has a close connection to the method of trimming and includes explicit outlyingness parameters for all samples, which in turn facilitates computation, theory, and parameter tuning. To tackle the issues of nonconvexity and nonsmoothness, we develop scalable algorithms with ease of implementation and guaranteed fast convergence. In particular, a new technique is proposed to alleviate the requirement on the starting point, so that on regular datasets the number of data resamplings can be substantially reduced. Based on combined statistical and computational treatments, we are able to perform nonasymptotic analysis beyond M-estimation. The obtained resistant estimators, though not necessarily globally or even locally optimal, enjoy minimax rate optimality in both low dimensions and high dimensions. Experiments in regression, classification, and neural networks show excellent performance of the proposed methodology in the presence of gross outliers. Supplementary materials for this article are available online.
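The abstract links the framework to trimming: each sample carries an explicit outlyingness parameter, and samples flagged as outlying are effectively excluded from the fit. As a rough illustration only (not the paper's actual algorithm), the following Python sketch implements a least-trimmed-squares style alternating scheme: fit on the currently trusted samples, then re-flag the samples with the largest residuals as outliers. The function name `trimmed_ls` and the fixed outlier budget `n_outliers` are assumptions for this sketch.

```python
import numpy as np

def trimmed_ls(X, y, n_outliers, n_iter=50):
    """Alternate between (1) least squares on trusted samples and
    (2) flagging the n_outliers largest absolute residuals as outlying.

    Returns the fitted coefficients and a boolean mask of flagged samples.
    This is an illustrative trimming scheme, not the paper's method.
    """
    n = len(y)
    keep = np.ones(n, dtype=bool)           # start by trusting every sample
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        # Step 1: ordinary least squares on the trusted subset
        beta, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
        # Step 2: trust the n - n_outliers samples with smallest residuals
        resid = np.abs(y - X @ beta)
        cutoff = np.partition(resid, n - n_outliers - 1)[n - n_outliers - 1]
        keep = resid <= cutoff
    return beta, ~keep
```

In the spirit of the article, the indicator `~keep` plays the role of the per-sample outlyingness parameters: it is optimized jointly with the coefficients, which is also what makes the overall problem nonconvex and nonsmooth and sensitive to the starting point.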
Keywords
Algorithms, Mathematical statistics, Model selection, Variable selection, Numerical optimization, Robust procedures