Abstract
This chapter discusses a Riemannian approach to the low-rank tensor learning problem. The low-rank constraint is modeled as a fixed-rank Tucker decomposition of tensors. We endow the manifold arising from the Tucker decomposition with a Riemannian structure based on a specific metric, or inner product, which allows us to use the versatile framework of Riemannian optimization on quotient manifolds to develop optimization algorithms. The Riemannian framework conceptually translates a structured constrained problem into an unconstrained problem over a Riemannian manifold. We employ a nonlinear conjugate gradient algorithm for optimization. To this end, concrete matrix expressions of various Riemannian optimization-related ingredients are discussed. Numerical comparisons on problems of low-rank tensor completion, tensor regression, and multilinear multitask learning suggest that the proposed Riemannian approach performs well across different synthetic and real-world datasets.
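To make the fixed-rank Tucker constraint mentioned above concrete, the following is a minimal NumPy sketch of the Tucker format: a core tensor multiplied by a factor matrix along each mode, with a truncated higher-order SVD (HOSVD) used here as one simple way to obtain a rank-(r1, ..., rd) Tucker approximation. This is not the chapter's Riemannian conjugate gradient algorithm; the function names (`unfold`, `hosvd`, `tucker_to_tensor`) are illustrative choices, not from the source.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: bring `mode` to the front and flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_multiply(T, M, mode):
    """Multiply tensor T by matrix M along the given mode."""
    X = np.tensordot(M, np.moveaxis(T, mode, 0), axes=1)
    return np.moveaxis(X, 0, mode)

def hosvd(T, ranks):
    """Truncated HOSVD: a rank-(r1, ..., rd) Tucker approximation of T.

    Returns a core tensor G and a list of factor matrices with
    orthonormal columns, one per mode.
    """
    factors = []
    for mode, r in enumerate(ranks):
        # Leading r left singular vectors of the mode-n unfolding
        U, _, _ = np.linalg.svd(unfold(T, mode), full_matrices=False)
        factors.append(U[:, :r])
    # Core: contract T with each factor's transpose along its mode
    G = T
    for mode, U in enumerate(factors):
        G = mode_multiply(G, U.T, mode)
    return G, factors

def tucker_to_tensor(G, factors):
    """Reassemble the full tensor from its Tucker representation."""
    T = G
    for mode, U in enumerate(factors):
        T = mode_multiply(T, U, mode)
    return T
```

For a tensor whose multilinear rank already matches `ranks`, the truncated HOSVD recovers it exactly; for general tensors it gives a quasi-optimal low-rank approximation, which is the kind of fixed-rank point the chapter's Riemannian algorithms then optimize over.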
Original language | English
---|---
Title of host publication | Tensors for Data Processing
Subtitle of host publication | Theory, Methods, and Applications
Publisher | Elsevier
Pages | 91-119
Number of pages | 29
ISBN (Electronic) | 9780128244470
ISBN (Print) | 9780323859653
DOIs | 
Publication status | Published - 2021 Jan 1
Keywords
- Multilinear multitask learning
- Riemannian optimization
- Tensor completion
- Tensor regression
- Tucker decomposition
ASJC Scopus subject areas
- Engineering (all)
- Computer Science (all)