A globally convergent conjugate gradient method for minimizing self-concordant functions with application to constrained optimisation problems

Huibo Ji*, Minyi Huang, John B. Moore, Jonathan H. Manton

*Corresponding author for this work

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    Abstract

    Self-concordant functions are a special class of convex functions introduced by Nesterov and Nemirovskii and used in interior point methods. This paper proposes a damped conjugate gradient method for the optimization of self-concordant functions. The method is an ordinary conjugate gradient method equipped with a novel step-size selection rule that is proven to ensure convergence to the global minimum. As an example, the algorithm is applied to a quadratically constrained quadratic optimization problem.
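
    The step-size rule itself is not reproduced in this record. As a rough illustration of the idea, the sketch below runs a Fletcher-Reeves conjugate gradient iteration whose step length is damped via the local Hessian norm of the search direction, in the spirit of Nesterov and Nemirovskii's damped Newton step. The particular rule used here (minimizing the standard self-concordance upper bound) is an assumption for illustration and need not coincide with the authors' rule; the function damped_cg and the toy barrier objective are hypothetical.

```python
# Minimal sketch (NOT the authors' published algorithm) of a damped
# nonlinear conjugate gradient method for a self-concordant f.
# Assumption: the step length minimizes the standard self-concordance
# upper bound  f(x + t d) <= f(x) + t<g,d> - t*delta - log(1 - t*delta),
# where delta = ||d||_x is the local Hessian norm of the direction d.
# For the Newton direction this recovers the classical damped Newton
# step t = 1 / (1 + delta).
import numpy as np

def damped_cg(grad, hess, x0, tol=1e-8, max_iter=500):
    """Fletcher-Reeves CG with a self-concordance-damped step size."""
    x = np.asarray(x0, dtype=float).copy()
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        H = hess(x)
        if g @ d >= 0:               # restart if d is not a descent direction
            d = -g
        delta = np.sqrt(d @ H @ d)   # local norm ||d||_x
        a = g @ d                    # directional derivative (< 0)
        # Damped step: t * delta = -a / (delta - a) < 1, so the new
        # iterate stays inside the Dikin ellipsoid, hence inside dom f.
        t = -a / (delta * (delta - a))
        x = x + t * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        g = g_new
    return x

# Toy usage: a self-concordant log-barrier for the box 0 < x < 1 plus a
# linear term (hypothetical data, for illustration only).
c = np.array([1.0, -2.0])
grad = lambda x: c - 1.0 / x + 1.0 / (1.0 - x)
hess = lambda x: np.diag(1.0 / x**2 + 1.0 / (1.0 - x)**2)
print(damped_cg(grad, hess, [0.5, 0.5]))
```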

    Original language: English
    Title of host publication: Proceedings of the 2007 American Control Conference, ACC
    Pages: 540-545
    Number of pages: 6
    Publication status: Published - 2007
    Event: 2007 American Control Conference, ACC - New York, NY, United States
    Duration: 9 Jul 2007 - 13 Jul 2007

    Publication series

    Name: Proceedings of the American Control Conference
    ISSN (Print): 0743-1619

    Conference

    Conference: 2007 American Control Conference, ACC
    Country/Territory: United States
    City: New York, NY
    Period: 9/07/07 - 13/07/07
