Riemannian structure of some new gradient descent learning algorithms

R. E. Mahony, R. C. Williamson

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    2 Citations (Scopus)

    Abstract

    We consider some generalizations of the classical LMS learning algorithm including the exponentiated gradient (EG) algorithm. We show how one can develop these algorithms in terms of a prior distribution over the weight space. Our framework subsumes the notion of "link-functions". Differential geometric methods are used to develop the algorithms as gradient descent with respect to the natural gradient in the Riemannian structure induced by the prior distribution. This provides a Bayesian Riemannian interpretation of the EG and related algorithms. We relate our work to that of Amari (1985, 1997, 1998) and others who used similar tools in a different manner. Simulation experiments illustrating the behaviour of the new algorithms are presented.
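The abstract contrasts the classical (Euclidean) LMS update with the exponentiated gradient (EG) update, which the paper derives as natural gradient descent under a metric induced by a prior over the weights. As a minimal illustrative sketch only (not the paper's derivation), the two update rules can be written side by side; the learning rate `eta` and the simplex normalization in `eg_step` are standard textbook choices, not details taken from this paper:

```python
import numpy as np

def lms_step(w, x, y, eta=0.1):
    """Classical LMS: additive (Euclidean) gradient descent on squared error."""
    g = (w @ x - y) * x          # gradient of 0.5 * (w.x - y)^2
    return w - eta * g

def eg_step(w, x, y, eta=0.1):
    """Exponentiated gradient (EG): multiplicative update that keeps the
    weights positive and normalized on the probability simplex."""
    g = (w @ x - y) * x
    w_new = w * np.exp(-eta * g)
    return w_new / w_new.sum()   # renormalize to the simplex
```

The multiplicative form of `eg_step` is what the paper reinterprets geometrically: the exponential arises from following the natural gradient in the Riemannian structure induced by the prior, rather than the flat Euclidean structure implicit in LMS.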

    Original language: English
    Title of host publication: IEEE 2000 Adaptive Systems for Signal Processing, Communications, and Control Symposium, AS-SPCC 2000
    Publisher: Institute of Electrical and Electronics Engineers Inc.
    Pages: 197-202
    Number of pages: 6
    ISBN (Electronic): 0780358007, 9780780358003
    DOIs
    Publication status: Published - 2000
    Event: IEEE Adaptive Systems for Signal Processing, Communications, and Control Symposium, AS-SPCC 2000 - Lake Louise, Canada
    Duration: 1 Oct 2000 - 4 Oct 2000

    Publication series

    Name: IEEE 2000 Adaptive Systems for Signal Processing, Communications, and Control Symposium, AS-SPCC 2000

    Conference

    Conference: IEEE Adaptive Systems for Signal Processing, Communications, and Control Symposium, AS-SPCC 2000
    Country/Territory: Canada
    City: Lake Louise
    Period: 1/10/00 - 4/10/00
