SVD-free Convex-Concave Approaches for Nuclear Norm Regularization

IJCAI 2017

Abstract
Minimizing a convex function of matrices regularized by the nuclear norm arises in many applications, such as collaborative filtering and multi-task learning. In this paper, we study the general setting where the convex function may be non-smooth. When the data matrix, of size m × n, is very large, existing optimization methods are inefficient because each iteration performs a singular value decomposition (SVD), which takes O(m^2 n) time. To reduce the computational cost, we exploit the dual characterization of the nuclear norm to introduce a convex-concave optimization problem and design a subgradient-based algorithm that avoids SVD entirely. In each iteration, the proposed algorithm computes only the largest singular vector, reducing the per-iteration time complexity from O(m^2 n) to O(mn). To the best of our knowledge, this is the first SVD-free convex optimization approach for nuclear-norm regularized problems that does not rely on a smoothness assumption. Theoretical analysis shows that the proposed algorithm converges at the optimal O(1/√T) rate, where T is the number of iterations. We also extend our algorithm to the stochastic case, where only stochastic subgradients of the convex function are available, and to a special case with an additional non-smooth regularizer (e.g., an ℓ1-norm regularizer). We conduct experiments on robust low-rank matrix approximation and link prediction to demonstrate the efficiency of our algorithms.
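The computational primitive behind the abstract's cost reduction is replacing a full SVD with a single largest-singular-vector computation. The sketch below is an illustration of that primitive only, not the authors' implementation: it uses the standard alternating power method, and the function name `top_singular_triple`, the iteration count, and the tolerance are assumptions made here for demonstration.

```python
import numpy as np

def top_singular_triple(A, n_iter=100, tol=1e-8, rng=None):
    # Alternating power iteration: approximates the leading singular
    # triple (u, sigma, v) of A. Each sweep costs two matrix-vector
    # products, i.e. O(mn), versus O(m^2 n) for a full SVD.
    # (Illustrative routine; not from the paper.)
    rng = np.random.default_rng(0) if rng is None else rng
    m, n = A.shape
    v = rng.standard_normal(n)
    v /= np.linalg.norm(v)
    sigma = 0.0
    for _ in range(n_iter):
        u = A @ v                      # O(mn)
        sigma_new = np.linalg.norm(u)
        if sigma_new == 0.0:           # v landed in the null space; restart
            v = rng.standard_normal(n)
            v /= np.linalg.norm(v)
            continue
        u /= sigma_new
        v = A.T @ u                    # O(mn)
        v /= np.linalg.norm(v)
        if abs(sigma_new - sigma) <= tol * max(sigma_new, 1.0):
            sigma = sigma_new
            break
        sigma = sigma_new
    return u, sigma, v

# Sanity check against a full SVD on a small random matrix.
if __name__ == "__main__":
    A = np.random.default_rng(1).standard_normal((200, 50))
    u, sigma, v = top_singular_triple(A)
    print(sigma, np.linalg.svd(A, compute_uv=False)[0])  # should agree
```

Why the top pair suffices: the dual norm of the nuclear norm is the spectral norm, and for the leading singular pair (u, v) of A, the rank-one matrix u vᵀ is a subgradient of ‖A‖₂. A subgradient step on the convex-concave reformulation therefore only needs this O(mn) primitive, which is presumably the role such a routine plays in the paper's algorithm.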