Modeling and Learning on High-Dimensional Matrix-Variate Sequences

Xu Zhang, Catherine C. Liu*, Jianhua Guo*, K. C. Yuen, A. H. Welsh

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

We propose a new matrix factor model, named RaDFaM, which is strictly derived from the general rank decomposition and assumes a high-dimensional vector factor model structure for each basis vector. RaDFaM contributes a novel class of low-rank latent structures that trade off signal intensity against dimension reduction from a tensor subspace perspective. Based on the intrinsic separable covariance structure of RaDFaM, for a collection of matrix-valued observations, we derive a new class of PCA variants for estimating the loading matrices and, in turn, the latent factor matrices. The peak signal-to-noise ratio of RaDFaM is proved to be superior within the class of PCA-type estimators. We also establish an asymptotic theory, including the consistency, convergence rates, and asymptotic distributions of the components in the signal part. Numerically, we demonstrate the performance of RaDFaM in applications such as matrix reconstruction, supervised learning, and clustering, on uncorrelated and correlated data, respectively. Supplementary materials for this article are available online, including a standardized description of the materials available for reproducing the work.
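The abstract describes estimating loading matrices via PCA variants on matrix-valued observations with a separable covariance structure. As a hedged illustration only (not the authors' RaDFaM algorithm), the sketch below simulates a generic matrix factor model X_t = R F_t Cᵀ + E_t and recovers the row and column loading spaces from the leading eigenvectors of the averaged row and column sample covariances; all dimensions, symbols, and the noise level are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
p, q, k, r, T = 20, 15, 3, 2, 200  # assumed dimensions for illustration

R = rng.standard_normal((p, k))    # row loading matrix (hypothetical)
C = rng.standard_normal((q, r))    # column loading matrix (hypothetical)
X = np.empty((T, p, q))
for t in range(T):
    F = rng.standard_normal((k, r))           # latent factor matrix
    E = 0.1 * rng.standard_normal((p, q))     # additive noise
    X[t] = R @ F @ C.T + E                    # matrix-variate observation

# PCA-type estimation: top eigenvectors of averaged row/column covariances
M_row = np.mean([x @ x.T for x in X], axis=0)
M_col = np.mean([x.T @ x for x in X], axis=0)
R_hat = np.linalg.eigh(M_row)[1][:, -k:]      # eigh sorts ascending; take top-k
C_hat = np.linalg.eigh(M_col)[1][:, -r:]

# latent factor matrices recovered sequentially by projection
F_hat = np.stack([R_hat.T @ x @ C_hat for x in X])
```

A natural check of such an estimator is subspace recovery: the span of `R_hat` should be close to the span of `R` up to rotation, which is the identifiability that PCA-type loadings provide.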

Original language: English
Number of pages: 16
Journal: Journal of the American Statistical Association
DOIs
Publication status: Published - 23 May 2024
