A Gyrovector Space Approach for Symmetric Positive Semi-definite Matrix Learning

European Conference on Computer Vision (2022)

Abstract
Representation learning with Symmetric Positive Semi-definite (SPSD) matrices has proven effective in many machine learning problems. Recently, some SPSD neural networks have been proposed and have shown promising performance. While these works share a common idea of generalizing some basic operations in deep neural networks (DNNs) to the SPSD manifold setting, their proposed generalizations are usually designed in an ad hoc manner. In this work, we propose a principled framework for building such generalizations. Our method is motivated by the success of hyperbolic neural networks (HNNs), which have demonstrated impressive performance in a variety of applications. At the heart of HNNs is the theory of gyrovector spaces, which provides a powerful tool for studying hyperbolic geometry. Here we consider connecting the theory of gyrovector spaces and the Riemannian geometry of SPSD manifolds. We first propose a method to define the basic operations, i.e., the binary operation and scalar multiplication, in gyrovector spaces of (full-rank) Symmetric Positive Definite (SPD) matrices. We then extend these operations to the low-rank SPSD manifold setting. Finally, we present an approach for building SPSD neural networks. Experimental evaluations on three benchmarks for human activity recognition demonstrate the efficacy of our proposed framework.
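To make the abstract's notion of gyrovector operations on SPD matrices concrete, the following is a minimal sketch of the kind of binary operation and scalar multiplication it refers to, using the standard affine-invariant forms from the SPD gyrovector literature (A ⊕ B = A^{1/2} B A^{1/2} and t ⊗ A = A^t). These are an illustrative assumption, not necessarily the exact definitions adopted in the paper, and the helper names are hypothetical.

```python
import numpy as np

def _spd_apply(A, f):
    """Apply a scalar function to an SPD matrix through its eigendecomposition."""
    w, V = np.linalg.eigh(A)          # SPD => symmetric, so eigh is appropriate
    return (V * f(w)) @ V.T           # V diag(f(w)) V^T

def gyro_add(A, B):
    """Candidate gyro-addition on SPD matrices: A (+) B = A^{1/2} B A^{1/2}.

    The identity matrix acts as the identity element: I (+) B = B.
    """
    s = _spd_apply(A, np.sqrt)        # principal matrix square root of A
    return s @ B @ s

def gyro_scale(t, A):
    """Candidate gyro scalar multiplication: t (x) A = A^t = exp(t log A)."""
    return _spd_apply(A, lambda w: w ** t)

# Example: both operations keep us on the SPD manifold.
rng = np.random.default_rng(0)
M = rng.standard_normal((3, 3))
A = M @ M.T + 3.0 * np.eye(3)         # a random SPD matrix
C = gyro_add(A, gyro_scale(0.5, A))   # stays symmetric with positive eigenvalues
```

Both operations return SPD matrices by construction, which is the closure property a gyrovector-space structure needs; the paper's contribution is deriving such operations in a principled way and extending them to the low-rank SPSD case, where eigendecompositions must respect the fixed rank.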