Power Normalizations in Fine-Grained Image, Few-Shot Image and Graph Classification

Piotr Koniusz*, Hongguang Zhang

*Corresponding author for this work

    Research output: Contribution to journal › Article › peer-review

    25 Citations (Scopus)


    Power Normalizations (PN) are useful non-linear operators that tackle feature imbalances in classification problems. We study PNs in the deep learning setup via a novel PN layer that pools feature maps. Our layer combines the feature vectors and their respective spatial locations in the feature maps produced by the last convolutional layer of a CNN into a positive definite matrix of second-order statistics to which PN operators are applied, forming so-called Second-order Pooling (SOP). As the main goal of this paper is to study Power Normalizations, we investigate the role and meaning of MaxExp and Gamma, two popular PN functions. To this end, we provide probabilistic interpretations of these element-wise operators and derive surrogates with well-behaved derivatives for end-to-end training. Furthermore, we examine the spectral applicability of MaxExp and Gamma by studying Spectral Power Normalizations (SPN). We show that SPN on the autocorrelation/covariance matrix and the Heat Diffusion Process (HDP) on a graph Laplacian matrix are closely related and thus share their properties. This finding leads to the culmination of our work: a fast spectral MaxExp, a variant of HDP for covariance/autocorrelation matrices. We evaluate our ideas on fine-grained recognition, scene recognition, and material classification, as well as on few-shot learning and graph classification.
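    The pipeline described above — pooling feature vectors into a second-order matrix and then applying element-wise or spectral Power Normalizations — can be sketched in NumPy. This is a minimal illustration, not the authors' implementation: the function names, the trace normalization used to keep eigenvalues in [0, 1], and the choice of η as a power of two (so the matrix power can be computed by repeated squaring, in the spirit of the fast spectral MaxExp) are all illustrative assumptions.

    ```python
    import numpy as np

    def second_order_pool(F):
        """Second-order Pooling (SOP): aggregate N feature vectors (rows of F,
        one per spatial location) into a d x d positive (semi-)definite
        autocorrelation matrix."""
        return F.T @ F / F.shape[0]

    def maxexp(p, eta=10.0):
        """Element-wise MaxExp PN, g(p) = 1 - (1 - p)^eta, for p in [0, 1]."""
        return 1.0 - (1.0 - p) ** eta

    def gamma_pn(p, gamma=0.5):
        """Element-wise Gamma PN, g(p) = p^gamma, for p >= 0 and 0 < gamma <= 1."""
        return p ** gamma

    def spectral_maxexp(M, eta=16):
        """Spectral MaxExp via eigendecomposition: trace-normalize M so its
        eigenvalues lie in [0, 1], then apply MaxExp to the spectrum."""
        lam, U = np.linalg.eigh(M / max(np.trace(M), 1e-12))
        lam_pn = 1.0 - (1.0 - lam) ** eta
        return (U * lam_pn) @ U.T

    def fast_spectral_maxexp(M, eta=16):
        """Eigendecomposition-free spectral MaxExp, I - (I - M/tr(M))^eta,
        computed by repeated squaring (assumes eta is a power of two)."""
        d = M.shape[0]
        A = np.eye(d) - M / max(np.trace(M), 1e-12)
        for _ in range(int(np.log2(eta))):
            A = A @ A
        return np.eye(d) - A
    ```

    Both spectral routines evaluate the same matrix function; the second trades the eigendecomposition for a handful of matrix multiplications, which is the kind of speed-up the fast spectral MaxExp is after.
    
    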

    Original language: English
    Pages (from-to): 591-609
    Number of pages: 19
    Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
    Issue number: 2
    Publication status: Published - 1 Feb 2022


