Relative loss bounds for multidimensional regression problems

J. Kivinen*, M. K. Warmuth

*Corresponding author for this work

    Research output: Contribution to journal › Article › peer-review

    78 Citations (Scopus)

    Abstract

    We study on-line generalized linear regression with multidimensional outputs, i.e., neural networks with multiple output nodes but no hidden nodes. We allow at the final layer transfer functions, such as the softmax function, that depend on the linear activations of all the output neurons. The weight vectors used to produce the linear activations are represented indirectly by maintaining separate parameter vectors; the weight vector is obtained by applying a particular parameterization function to the parameter vector. The parameter vectors are updated additively upon seeing new examples, as in the usual gradient descent update. However, by using a nonlinear parameterization function between the parameter vectors and the weight vectors, we can make the resulting update of the weight vector quite different from a true gradient descent update. To analyze such updates, we define a notion of a matching loss function and apply it both to the transfer function and to the parameterization function. The loss function that matches the transfer function is used to measure the goodness of the algorithm's predictions. The loss function that matches the parameterization function can be used both as a measure of divergence between models in motivating the update rule of the algorithm and as a measure of progress in analyzing its relative performance compared to an arbitrary fixed model. As a result, we obtain a unified treatment that generalizes earlier results for the gradient descent and exponentiated gradient algorithms to multidimensional outputs, including multiclass logistic regression.
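
    To make the setting concrete, the following is a minimal sketch (not taken from the paper) of the indirect parameterization and additive parameter update described above, instantiated for a softmax transfer function with its matching cross-entropy loss. The function names (predict, update_step, gd_param, eg_param), the learning rate eta, and the toy data are illustrative assumptions; only the update scheme follows the abstract: gradient descent corresponds to the identity parameterization, while the exponentiated gradient algorithm corresponds to a softmax parameterization of each weight vector.

```python
import numpy as np

def softmax(a):
    """Numerically stable softmax transfer function."""
    e = np.exp(a - a.max())
    return e / e.sum()

def predict(theta, x, param_fn):
    """Weights are represented indirectly: W = param_fn(theta)."""
    W = param_fn(theta)
    return softmax(W @ x)

def update_step(theta, x, y, eta, param_fn):
    """One on-line step: additive update of the parameter matrix by the
    gradient of the matching (cross-entropy) loss taken with respect to
    the weights; for the softmax transfer function this gradient is
    (y_hat - y) x^T."""
    y_hat = predict(theta, x, param_fn)
    return theta - eta * np.outer(y_hat - y, x)

# Gradient descent: identity parameterization, so the additive
# parameter update is an ordinary gradient-descent weight update.
gd_param = lambda theta: theta

# Exponentiated gradient: softmax applied to each parameter row, so
# every weight vector stays on the probability simplex and the additive
# parameter update becomes a multiplicative weight update.
eg_param = lambda theta: np.apply_along_axis(softmax, 1, theta)

# Illustrative usage on synthetic data (dimensions and labels made up).
rng = np.random.default_rng(0)
k, n, eta = 3, 5, 0.1
theta = np.zeros((k, n))
for _ in range(100):
    x = rng.normal(size=n)
    y = np.eye(k)[rng.integers(k)]   # one-hot multiclass label
    theta = update_step(theta, x, y, eta, eg_param)
```

    With eg_param the sketch corresponds to multiclass logistic regression trained by an exponentiated-gradient-style update; swapping in gd_param recovers plain gradient descent in the same framework.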

    Original language: English
    Pages (from-to): 301-329
    Number of pages: 29
    Journal: Machine Learning
    Volume: 45
    Issue number: 3
    DOIs
    Publication status: Published - Dec 2001

