TY - JOUR
T1 - A Riemannian gossip approach to subspace learning on Grassmann manifold
AU - Mishra, Bamdev
AU - Kasai, Hiroyuki
AU - Jawanpuria, Pratik
AU - Saroop, Atul
N1 - Publisher Copyright:
© 2019, The Author(s), under exclusive licence to Springer Science+Business Media LLC, part of Springer Nature.
PY - 2019/10/14
AB - In this paper, we focus on subspace learning problems on the Grassmann manifold. Interesting applications in this setting include low-rank matrix completion and low-dimensional multivariate regression, among others. Motivated by privacy concerns, we aim to solve such problems in a decentralized setting where multiple agents have access to (and solve) only a part of the whole optimization problem. The agents communicate with each other to arrive at a consensus, i.e., agree on a common quantity, via the gossip protocol. We propose a novel cost function for subspace learning on the Grassmann manifold, which is a weighted sum of several sub-problems (each solved by an agent) and the communication cost among the agents. The cost function has a finite-sum structure. In the proposed modeling approach, different agents learn individual local subspaces but they achieve asymptotic consensus on the global learned subspace. The approach is scalable and parallelizable. Numerical experiments show the efficacy of the proposed decentralized algorithms on various matrix completion and multivariate regression benchmarks.
KW - Manifold optimization
KW - Matrix completion
KW - Multivariate regression
KW - Non-linear gossip
KW - Stochastic gradients
UR - http://www.scopus.com/inward/record.url?scp=85060588594&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85060588594&partnerID=8YFLogxK
DO - 10.1007/s10994-018-05775-x
M3 - Article
AN - SCOPUS:85060588594
SN - 0885-6125
VL - 108
SP - 1783
EP - 1803
JO - Machine Learning
JF - Machine Learning
IS - 10
ER -