DS-MLR: Exploiting Double Separability for Scaling up Distributed Multinomial Logistic Regression.

arXiv: Learning (2016)

Cited by 24 | Viewed 125
Abstract
Scaling multinomial logistic regression to datasets with a very large number of data points and classes is not trivial, primarily because the log-partition function must be computed for every data point, which makes distributing the computation hard. In this paper, we present a distributed stochastic gradient descent based optimization method (DS-MLR) for scaling up multinomial logistic regression to massive-scale datasets without hitting any storage constraints on the data and model parameters. Our algorithm exploits double separability, an attractive property we observe in the objective functions of several machine learning models, which allows us to achieve both data and model parallelism simultaneously. In addition to being parallelizable, our algorithm can also easily be made non-blocking and asynchronous. We demonstrate the effectiveness of DS-MLR empirically on several real-world datasets, the largest being a Reddit dataset created from 1.7 billion user comments, where the data and parameter sizes are 228 GB and 358 GB respectively.
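As context for the bottleneck the abstract describes, below is a minimal sketch (Python/NumPy; not the paper's DS-MLR implementation) of the standard multinomial logistic regression objective. It makes concrete why the log-partition term log Σ_k exp(w_kᵀx_i) couples every data point with the entire K × D parameter matrix, which is what the doubly-separable reformulation is designed to avoid. Function names and the regularization constant are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch (not the paper's DS-MLR code): the standard multinomial
# logistic regression objective, showing the per-data-point log-partition
# term that couples each example with all K class weight vectors.
import numpy as np

def mlr_negative_log_likelihood(W, X, y, lam=1e-4):
    """W: (K, D) class weights, X: (N, D) data, y: (N,) labels in {0..K-1}."""
    scores = X @ W.T                        # (N, K) inner products w_k^T x_i
    # log-partition function log sum_k exp(w_k^T x_i), one per data point;
    # computed stably by subtracting the row-wise max before exponentiating.
    m = scores.max(axis=1, keepdims=True)
    log_Z = m.squeeze(1) + np.log(np.exp(scores - m).sum(axis=1))
    # negative log-likelihood: -w_{y_i}^T x_i + log Z_i, plus L2 regularization
    nll = (log_Z - scores[np.arange(len(y)), y]).mean()
    return nll + lam * np.square(W).sum()

# Tiny random example: at W = 0 the loss equals log K, as expected.
rng = np.random.default_rng(0)
N, D, K = 8, 5, 3
X, y = rng.normal(size=(N, D)), rng.integers(0, K, size=N)
print(mlr_negative_log_likelihood(np.zeros((K, D)), X, y))  # ~= log 3
```

Note how `log_Z` requires a full pass over all K classes for every data point, so naive data-parallel SGD must replicate (or repeatedly communicate) the whole parameter matrix W; a doubly-separable objective instead decomposes into terms indexed by (data point, class) pairs, allowing both dimensions to be partitioned.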
Keywords
scaling, double separability, ds-mlr