A Riemannian approach to low-rank tensor learning

Hiroyuki Kasai, Pratik Jawanpuria, Bamdev Mishra

Research output: Chapter in Book/Report/Conference proceeding › Chapter

Abstract

This chapter discusses a Riemannian approach to the low-rank tensor learning problem. The low-rank constraint is modeled as a fixed-rank Tucker decomposition of tensors. We endow the manifold arising from the Tucker decomposition with a Riemannian structure based on a specific metric, or inner product. This allows us to use the versatile framework of Riemannian optimization on quotient manifolds to develop optimization algorithms. The Riemannian framework conceptually translates a structured constrained problem into an unconstrained problem over a Riemannian manifold. We employ a nonlinear conjugate gradient algorithm for optimization. To this end, concrete matrix expressions of various Riemannian optimization-related ingredients are discussed. Numerical comparisons on problems of low-rank tensor completion, tensor regression, and multilinear multitask learning suggest that the proposed Riemannian approach performs well across different synthetic and real-world datasets.
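The fixed-rank Tucker decomposition underlying the chapter's approach writes a tensor as a small core tensor multiplied by a factor matrix along each mode. As a rough illustration of that object (not of the chapter's Riemannian algorithm itself), the following sketch computes a truncated higher-order SVD (HOSVD), a standard way to obtain a Tucker factorization with prescribed multilinear ranks; all function names here are our own, using only numpy:

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: bring `mode` to the front and flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_multiply(T, M, mode):
    """Multiply tensor T by matrix M along the given mode."""
    return np.moveaxis(np.tensordot(M, np.moveaxis(T, mode, 0), axes=1), 0, mode)

def tucker_hosvd(T, ranks):
    """Truncated HOSVD: factors from the leading left singular vectors of
    each unfolding; core obtained by projecting T onto those factors."""
    factors = []
    for mode, r in enumerate(ranks):
        U, _, _ = np.linalg.svd(unfold(T, mode), full_matrices=False)
        factors.append(U[:, :r])
    core = T
    for mode, U in enumerate(factors):
        core = mode_multiply(core, U.T, mode)
    return core, factors

# Example: a 5 x 6 x 7 tensor of exact multilinear rank (2, 3, 2)
rng = np.random.default_rng(0)
G = rng.standard_normal((2, 3, 2))
Us = [rng.standard_normal((n, r)) for n, r in zip((5, 6, 7), (2, 3, 2))]
T = G
for mode, U in enumerate(Us):
    T = mode_multiply(T, U, mode)

core, factors = tucker_hosvd(T, (2, 3, 2))

# Reconstruct from the Tucker factors; recovery is exact here because
# the tensor has exactly the prescribed multilinear rank.
R = core
for mode, U in enumerate(factors):
    R = mode_multiply(R, U, mode)
print(np.allclose(R, T))
```

In the chapter's setting, the set of tensors of fixed multilinear rank (core plus factor matrices, up to the inherent rotational ambiguity) is treated as a quotient manifold, and optimization proceeds over that manifold rather than by a one-shot factorization as above.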

Original language: English
Title of host publication: Tensors for Data Processing
Subtitle of host publication: Theory, Methods, and Applications
Publisher: Elsevier
Pages: 91-119
Number of pages: 29
ISBN (Electronic): 9780128244470
ISBN (Print): 9780323859653
Publication status: Published - 2021 Jan 1

Keywords

  • Multilinear multitask learning
  • Riemannian optimization
  • Tensor completion
  • Tensor regression
  • Tucker decomposition

ASJC Scopus subject areas

  • Engineering(all)
  • Computer Science(all)
