On Convergence Analysis of Gradient Based Primal-Dual Method of Multipliers

Guoqiang Zhang, Matthew O'Connor, Le Li

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    1 Citation (Scopus)

    Abstract

    Recently, the primal-dual method of multipliers (PDMM) has been proposed and successfully applied to solve a number of decomposable convex optimization problems in a distributed, iterative fashion. In this work, we study the gradient-based PDMM (GPDMM), in which the objective functions are approximated using their gradient information at each iteration. It is shown that for a certain class of decomposable convex problems, synchronous GPDMM has a sublinear convergence rate of O(1/K), where K denotes the iteration index. Experiments on a problem of distributed ridge-regularized logistic regression demonstrate the efficiency of synchronous GPDMM.
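
    The setting in the abstract can be illustrated with a small numerical sketch. The code below is an assumption-laden illustration, not the paper's exact GPDMM update: it implements a gradient-based (linearized) consensus variant of the method of multipliers for distributed ridge-regularized logistic regression, where each node replaces its local loss with a first-order approximation per iteration so the primal step becomes closed form. All data sizes, variable names, and parameter choices are illustrative.

```python
# Minimal sketch (NOT the paper's exact GPDMM message-passing scheme):
# a gradient-based consensus primal-dual method in which each node
# linearizes its local loss per iteration. All setup values are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Distributed ridge-regularized logistic regression: node i holds (A_i, b_i)
# and all nodes must agree on a shared model w.
n_nodes, n_local, dim, mu = 4, 50, 10, 0.1
w_true = rng.normal(size=dim)
A = [rng.normal(size=(n_local, dim)) for _ in range(n_nodes)]
b = [np.sign(Ai @ w_true + 0.1 * rng.normal(size=n_local)) for Ai in A]

def local_grad(w, Ai, bi):
    # Gradient of node i's ridge-regularized logistic loss.
    m = np.clip(bi * (Ai @ w), -30.0, 30.0)  # clip margins for stability
    return -Ai.T @ (bi / (1.0 + np.exp(m))) + mu * w

# The penalty must dominate each local gradient's Lipschitz constant
# (||A_i||^2 / 4 + mu for logistic loss) for the linearized step to be stable.
rho = max(np.linalg.norm(Ai, 2) ** 2 / 4.0 + mu for Ai in A)

x = [np.zeros(dim) for _ in range(n_nodes)]  # local primal variables
y = [np.zeros(dim) for _ in range(n_nodes)]  # dual variables
z = np.zeros(dim)                            # consensus variable

for _ in range(500):
    # Gradient-based primal step: with the loss linearized at x_i^k, the
    # minimization over x_i has the closed form below.
    x = [z - (local_grad(x[i], A[i], b[i]) + y[i]) / rho
         for i in range(n_nodes)]
    # Consensus (averaging) step, then dual ascent.
    z = np.mean([x[i] + y[i] / rho for i in range(n_nodes)], axis=0)
    y = [y[i] + rho * (x[i] - z) for i in range(n_nodes)]

print("max consensus residual:", max(np.linalg.norm(xi - z) for xi in x))
```

    With the penalty chosen to dominate each node's gradient Lipschitz constant, the local iterates reach consensus and the objective error decays sublinearly, in line with the O(1/K) rate the abstract states for synchronous GPDMM.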

    Original language: English
    Title of host publication: 2018 IEEE Statistical Signal Processing Workshop, SSP 2018
    Publisher: Institute of Electrical and Electronics Engineers Inc.
    Pages: 353-357
    Number of pages: 5
    ISBN (Print): 9781538615706
    DOIs
    Publication status: Published - 29 Aug 2018
    Event: 20th IEEE Statistical Signal Processing Workshop, SSP 2018 - Freiburg im Breisgau, Germany
    Duration: 10 Jun 2018 - 13 Jun 2018

    Publication series

    Name: 2018 IEEE Statistical Signal Processing Workshop, SSP 2018

    Conference

    Conference: 20th IEEE Statistical Signal Processing Workshop, SSP 2018
    Country/Territory: Germany
    City: Freiburg im Breisgau
    Period: 10/06/18 - 13/06/18
