Scalable Symmetric Tucker Tensor Decomposition

Abstract

We study the best low-rank Tucker decomposition of symmetric tensors. The motivating application is decomposing higher-order multivariate moments. Moment tensors have special structure and are important to various data science problems. We advocate for a projected gradient descent (PGD) method and a higher-order eigenvalue decomposition (HOEVD) approximation as computation schemes. Most importantly, we develop scalable adaptations of the basic PGD and HOEVD methods to decompose sample moment tensors. With the help of implicit and streaming techniques, we avoid the overhead of building and storing the moment tensor. Such reductions make computing the Tucker decomposition realizable for large data instances in high dimensions. Numerical experiments demonstrate the efficiency of the algorithms and the applicability of moment tensor decompositions to real-world datasets. Finally, we study convergence on the Grassmannian manifold and prove that the update sequence derived by the PGD solver achieves first- and second-order criticality.
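To make the two computational schemes concrete, here is a minimal NumPy sketch on a small synthetic symmetric tensor: an HOEVD-style initialization (top eigenvectors of the Gram matrix of a mode unfolding, one common variant; the paper's exact scheme may differ) followed by projected gradient ascent on the core norm with a QR retraction onto orthonormal columns. All names, step sizes, and the synthetic tensor are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 8, 2

# Synthetic symmetric third-order tensor built from random factors
# (an illustrative stand-in for a moment tensor).
A = rng.standard_normal((d, r))
T = np.einsum('ia,ja,ka->ijk', A, A, A)  # symmetric by construction

def core_cost(U):
    """Objective ||T x_1 U^T x_2 U^T x_3 U^T||_F^2, to be maximized."""
    G = np.einsum('ijk,ia,jb,kc->abc', T, U, U, U)
    return float(np.sum(G ** 2))

def hoevd_init(rank):
    """HOEVD-style start: top eigenvectors of the mode-1 unfolding's Gram matrix."""
    M = T.reshape(d, d * d)
    _, V = np.linalg.eigh(M @ M.T)  # eigenvalues in ascending order
    return V[:, -rank:]

def pgd(U, iters=200, step=1e-3):
    """Projected gradient ascent with QR retraction and crude backtracking."""
    c = core_cost(U)
    for _ in range(iters):
        G = np.einsum('ijk,ia,jb,kc->abc', T, U, U, U)
        # Symmetry of T makes the three mode-wise terms equal: factor 3,
        # times 2 from differentiating the squared norm.
        grad = 6.0 * np.einsum('ijk,jb,kc,abc->ia', T, U, U, G)
        Q, _ = np.linalg.qr(U + step * grad)  # retract to orthonormal columns
        c_new = core_cost(Q)
        if c_new >= c:
            U, c = Q, c_new
        else:
            step *= 0.5  # shrink the step to keep the sequence monotone
    return U, c

U0 = hoevd_init(r)
c0 = core_cost(U0)
U_pgd, c_pgd = pgd(U0)
```

The monotone acceptance rule above is a simplification; the paper analyzes the PGD iterates on the Grassmannian, where only the column span of `U` matters, which is why the QR retraction suffices here.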

Publication
SIAM Journal on Matrix Analysis and Applications
Citation
R. Jin, J. Kileel, T. G. Kolda, R. Ward. Scalable Symmetric Tucker Tensor Decomposition. SIAM Journal on Matrix Analysis and Applications, 2024. http://arxiv.org/abs/2204.10824

Keywords

symmetric tensor, Tucker decomposition, scalable algorithm

Comments

In press; accepted April 9, 2024

BibTeX

@article{JiKiKoWa24,
  author  = {Ruhui Jin and Joe Kileel and Tamara G. Kolda and Rachel Ward},
  title   = {Scalable Symmetric {Tucker} Tensor Decomposition},
  journal = {SIAM Journal on Matrix Analysis and Applications},
  month   = {April},
  year    = {2024},
  eprint  = {2204.10824},
}