Abstract
The papers in this special section focus on tensor decomposition for signal processing and machine learning. Tensor decomposition, also called tensor factorization, is useful for representing and analyzing multi-dimensional data. Tensor decompositions have been applied in signal processing applications (speech, acoustics, communications, radar, biomedicine), in machine learning (clustering, dimensionality reduction, latent factor models, subspace learning), and well beyond. These tools aid in learning a variety of models, including community models, probabilistic context-free grammars, Gaussian mixture models, and two-layer neural networks. Although considerable research has been carried out in this area, many outstanding challenges remain to be explored and addressed; examples include tensor deflation, massive tensor decompositions, and the high computational cost of algorithms. The multi-dimensional nature of signals and of ever-larger data sets, particularly in next-generation advanced information and communication technology systems, provides good opportunities to exploit tensor-based models and tensor networks, with the aim of meeting strong requirements on system flexibility, convergence, and efficiency.
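As a minimal illustration of the tensor-decomposition idea the abstract refers to (a sketch for orientation, not taken from any paper in the issue), the simplest case of a CP (CANDECOMP/PARAFAC) model is a rank-1 three-way tensor, X[i][j][k] = a[i]·b[j]·c[k]. For an exactly rank-1 tensor the factors can be read off from three of its fibers, up to the usual CP scaling ambiguity. A pure-Python sketch, assuming X[0][0][0] ≠ 0:

```python
def outer3(a, b, c):
    # Rank-1 three-way tensor: X[i][j][k] = a[i] * b[j] * c[k]
    return [[[ai * bj * ck for ck in c] for bj in b] for ai in a]

def rank1_factors(X):
    # Recover factors of an exactly rank-1 tensor from three of its
    # fibers, assuming X[0][0][0] != 0. The CP scaling ambiguity is
    # fixed by folding the overall scale into 'a_hat' and 'c_hat'.
    s = X[0][0][0]
    a_hat = [X[i][0][0] / s for i in range(len(X))]
    b_hat = [X[0][j][0] / s for j in range(len(X[0]))]
    c_hat = [X[0][0][k] for k in range(len(X[0][0]))]
    return a_hat, b_hat, c_hat

# Usage: build a rank-1 tensor, recover factors, and reconstruct it.
X = outer3([1.0, 2.0, 3.0], [4.0, 5.0], [6.0, 7.0, 8.0])
a_hat, b_hat, c_hat = rank1_factors(X)
X_rec = outer3(a_hat, b_hat, c_hat)
```

Real data is rarely exactly low-rank, so practical CP fitting uses iterative least-squares methods (e.g. alternating least squares) rather than this fiber trick; the sketch only shows why the factors are identifiable in the ideal case.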
| Original language | English |
|---|---|
| Article number | 9393485 |
| Pages (from-to) | 433-437 |
| Number of pages | 5 |
| Journal | IEEE Journal on Selected Topics in Signal Processing |
| Volume | 15 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - Apr 2021 |
Fingerprint
Dive into the research topics of 'Introduction to the special issue on tensor decomposition for signal processing and machine learning'. Together they form a unique fingerprint.