Introduction to the special issue on tensor decomposition for signal processing and machine learning

Hongyang Chen, Sergiy A. Vorobyov, Hing Cheung So, Fauzia Ahmad, Fatih Porikli

    Research output: Contribution to journal › Editorial › peer-review

    2 Citations (Scopus)

    Abstract

    The papers in this special section focus on tensor decomposition for signal processing and machine learning. Tensor decomposition, also called tensor factorization, is useful for representing and analyzing multi-dimensional data. Tensor decompositions have been applied in signal processing (speech, acoustics, communications, radar, biomedicine), in machine learning (clustering, dimensionality reduction, latent factor models, subspace learning), and well beyond. These tools aid in learning a variety of models, including community models, probabilistic context-free grammars, Gaussian mixture models, and two-layer neural networks. Although considerable research has been carried out in this area, many challenges remain to be explored and addressed, such as tensor deflation, massive-scale tensor decomposition, and the high computational cost of algorithms. The multi-dimensional nature of signals and ever-larger data, particularly in next-generation advanced information and communication technology systems, provides good opportunities to exploit tensor-based models and tensor networks, with the aim of meeting stringent requirements on system flexibility, convergence, and efficiency.
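
    To make the idea of tensor decomposition concrete, the following is a minimal illustrative sketch (not drawn from the editorial or the special issue papers) of a rank-R CP/PARAFAC decomposition of a 3-way tensor computed with alternating least squares in NumPy. The helper names (unfold, khatri_rao, cp_als) and the synthetic rank-3 test tensor are assumptions made for illustration only.

import numpy as np

def unfold(X, mode):
    # Mode-n unfolding (Kolda-Bader convention: earlier remaining modes vary fastest).
    return np.reshape(np.moveaxis(X, mode, 0), (X.shape[mode], -1), order="F")

def khatri_rao(A, B):
    # Column-wise Kronecker product: (I, R) and (J, R) -> (I*J, R).
    I, R = A.shape
    J, _ = B.shape
    return (A[:, None, :] * B[None, :, :]).reshape(I * J, R)

def cp_als(X, rank, n_iters=200, seed=0):
    # Rank-R CP (PARAFAC) decomposition of a 3-way tensor via alternating least squares.
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((dim, rank)) for dim in X.shape)
    for _ in range(n_iters):
        A = unfold(X, 0) @ khatri_rao(C, B) @ np.linalg.pinv((C.T @ C) * (B.T @ B))
        B = unfold(X, 1) @ khatri_rao(C, A) @ np.linalg.pinv((C.T @ C) * (A.T @ A))
        C = unfold(X, 2) @ khatri_rao(B, A) @ np.linalg.pinv((B.T @ B) * (A.T @ A))
    return A, B, C

# Build a synthetic rank-3 tensor from random factors and recover them.
rng = np.random.default_rng(1)
A0, B0, C0 = rng.standard_normal((8, 3)), rng.standard_normal((9, 3)), rng.standard_normal((10, 3))
X = np.einsum("ir,jr,kr->ijk", A0, B0, C0)
A, B, C = cp_als(X, rank=3)
X_hat = np.einsum("ir,jr,kr->ijk", A, B, C)
print("relative reconstruction error:", np.linalg.norm(X - X_hat) / np.linalg.norm(X))

    On an exactly low-rank tensor such as this synthetic example, the reconstruction error is expected to approach zero; on noisy real data, the chosen rank and the cost of the repeated unfoldings and least-squares solves become the practical concerns highlighted in the abstract.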
    Original language: English
    Article number: 9393485
    Pages (from-to): 433-437
    Number of pages: 5
    Journal: IEEE Journal on Selected Topics in Signal Processing
    Volume: 15
    Issue number: 3
    DOIs
    Publication status: Published - Apr 2021
