Nonnegative Tensor Completion: step-sizes for an accelerated variation of the stochastic gradient descent

European Signal Processing Conference (EUSIPCO), 2022

Abstract
We consider the problem of nonnegative tensor completion. We adopt the alternating optimization framework and solve each nonnegative matrix least-squares subproblem via an accelerated variation of the stochastic gradient descent. The step-sizes used by the algorithm largely determine its behavior. We propose two new strategies for computing the step-sizes and experimentally test their effectiveness on both synthetic and real-world data.
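The framework described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' algorithm: it imputes missing entries from the current model, then updates each CP factor with one momentum-extrapolated projected-gradient step, using a deterministic 1/L step-size from the subproblem's Gram matrix in place of the paper's stochastic scheme and proposed step-size strategies. All function names and parameters are assumptions.

```python
import numpy as np

def khatri_rao(A, B):
    # Column-wise Khatri-Rao product: (I x R), (J x R) -> (I*J x R);
    # row i*J + j holds A[i, :] * B[j, :].
    I, R = A.shape
    J, _ = B.shape
    return (A[:, None, :] * B[None, :, :]).reshape(I * J, R)

def ntc_accelerated(T, mask, rank, n_iter=150, beta=0.5, seed=0):
    # Alternating optimization for nonnegative tensor completion of a
    # 3-way tensor T with boolean observation mask. Each factor update
    # is an extrapolated projected-gradient step with step-size 1/L,
    # L being the largest eigenvalue of the subproblem's Gram matrix
    # (an illustrative deterministic rule, not the paper's).
    rng = np.random.default_rng(seed)
    dims = T.shape
    facs = [rng.random((d, rank)) for d in dims]
    prev = [F.copy() for F in facs]
    Tobs = np.where(mask, T, 0.0)
    for _ in range(n_iter):
        # Impute unobserved entries with the current model (fill-in).
        X = np.einsum('ir,jr,kr->ijk', *facs)
        X = np.where(mask, Tobs, X)
        for n in range(3):
            others = [facs[m] for m in range(3) if m != n]
            M = khatri_rao(others[0], others[1])        # design matrix
            Xn = np.moveaxis(X, n, 0).reshape(dims[n], -1)  # mode-n unfolding
            G = M.T @ M
            L = np.linalg.eigvalsh(G)[-1]               # Lipschitz constant
            V = facs[n] + beta * (facs[n] - prev[n])    # momentum extrapolation
            grad = V @ G - Xn @ M                       # gradient at V
            prev[n] = facs[n]
            facs[n] = np.maximum(V - grad / L, 0.0)     # nonnegative projection
    return facs
```

Each least-squares subproblem is convex, so the 1/L step guarantees a stable projected-gradient step; the extrapolation parameter `beta` plays the role of the acceleration whose step-size tuning the paper studies.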
Keywords
accelerated variation, step-sizes