TY - GEN
T1 - Riemannian adaptive stochastic gradient algorithms on matrix manifolds
AU - Kasai, Hiroyuki
AU - Jawanpuria, Pratik
AU - Mishra, Bamdev
N1 - Funding Information:
H. Kasai was partially supported by JSPS KAKENHI Grant Numbers JP16K00031 and JP17H01732.
Publisher Copyright:
Copyright © 2019 by the author(s)
PY - 2019
Y1 - 2019
AB - Adaptive stochastic gradient algorithms in the Euclidean space have attracted much attention lately. Such explorations on Riemannian manifolds, on the other hand, are relatively new, limited, and challenging. This is because of the intrinsic nonlinear structure of the underlying manifold and the absence of a canonical coordinate system. In machine learning applications, however, most manifolds of interest are represented as matrices with notions of row and column subspaces. In addition, the implicit manifold-related constraints may also lie on such subspaces. For example, the Grassmann manifold is the set of column subspaces. Accordingly, such a rich structure should not be lost by transforming matrices into just a stack of vectors while developing optimization algorithms on manifolds. We propose novel stochastic gradient algorithms for problems on Riemannian matrix manifolds by adapting the row and column subspaces of gradients. Our algorithms are provably convergent and achieve a convergence rate of order O(log(T)/√T), where T is the number of iterations. Our experiments illustrate the efficacy of the proposed algorithms on several applications.
UR - http://www.scopus.com/inward/record.url?scp=85078230738&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85078230738&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85078230738
T3 - 36th International Conference on Machine Learning, ICML 2019
SP - 5699
EP - 5708
BT - 36th International Conference on Machine Learning, ICML 2019
PB - International Machine Learning Society (IMLS)
T2 - 36th International Conference on Machine Learning, ICML 2019
Y2 - 9 June 2019 through 15 June 2019
ER -
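
The abstract above outlines the key algorithmic idea: precondition stochastic Riemannian gradients by adapting their row and column subspaces rather than flattening the matrix into a vector. As a reading aid, here is a minimal, hypothetical NumPy sketch of that idea on the Stiefel manifold, using diagonal row/column second-moment accumulators (a simplification of the paper's full adaptive matrices) and a QR-based retraction. The names stiefel_project, qr_retract, and adaptive_riemannian_sgd are invented for this illustration; this is not the authors' implementation.

import numpy as np

def stiefel_project(X, G):
    # Project a Euclidean gradient G onto the tangent space of the
    # Stiefel manifold at X: P_X(G) = G - X sym(X^T G).
    sym = 0.5 * (X.T @ G + G.T @ X)
    return G - X @ sym

def qr_retract(X, xi):
    # Map the tangent step xi back onto the manifold via the QR
    # decomposition of X + xi, with column signs fixed so the
    # retraction is well defined.
    Q, R = np.linalg.qr(X + xi)
    d = np.sign(np.diag(R))
    d[d == 0] = 1.0
    return Q * d

def adaptive_riemannian_sgd(grad_fn, X, steps=1000, lr=1e-2,
                            beta=0.99, eps=1e-8):
    # Diagonal sketch of row/column-adaptive Riemannian SGD: running
    # second moments are kept per row and per column of the Riemannian
    # gradient, and each side is preconditioned by a fourth root so the
    # combined scaling is of square-root order. Bias correction is
    # omitted for brevity.
    n, p = X.shape
    row_acc = np.zeros(n)
    col_acc = np.zeros(p)
    for _ in range(steps):
        G = stiefel_project(X, grad_fn(X))   # Riemannian gradient
        row_acc = beta * row_acc + (1 - beta) * np.sum(G**2, axis=1)
        col_acc = beta * col_acc + (1 - beta) * np.sum(G**2, axis=0)
        d_row = 1.0 / np.sqrt(np.sqrt(row_acc) + eps)
        d_col = 1.0 / np.sqrt(np.sqrt(col_acc) + eps)
        xi = -lr * (d_row[:, None] * G * d_col[None, :])
        xi = stiefel_project(X, xi)          # keep the step tangent
        X = qr_retract(X, xi)
    return X

For example, with a symmetric 50x50 matrix A, calling adaptive_riemannian_sgd(lambda X: -A @ X, np.linalg.qr(np.random.randn(50, 5))[0]) sketches dominant-subspace estimation, i.e. maximizing trace(X^T A X) over the Stiefel manifold. Re-projecting after preconditioning is one simple way to keep the scaled gradient in the tangent space; the paper treats this point more carefully.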