Asynchronous Distributed Optimization via Dual Decomposition and Block Coordinate Subgradient Methods

Yankai Lin*, Iman Shames, Dragan Nesic

*Corresponding author for this work

    Research output: Contribution to journal › Article › peer-review

    4 Citations (Scopus)

    Abstract

    In this article, we study the problem of minimizing the sum of potentially nondifferentiable convex cost functions with partially overlapping dependencies in an asynchronous manner, where communication in the network is not coordinated. We study the behavior of an asynchronous algorithm based on dual decomposition and block coordinate subgradient methods under assumptions weaker than those used in the literature. At the same time, we allow different agents to use local stepsizes with no global coordination. Sufficient conditions are provided for almost sure convergence to the solution of the optimization problem. Under additional assumptions, we establish a sublinear convergence rate that, in turn, can be strengthened to a linear convergence rate if the problem is strongly convex and has Lipschitz gradients. We also extend available results in the literature by allowing multiple and potentially overlapping blocks to be updated at the same time, with nonuniform and potentially time-varying probabilities assigned to different blocks. A numerical example is provided to illustrate the effectiveness of the algorithm.
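To make the abstract's ingredients concrete, below is a minimal sketch of dual decomposition with randomized block-coordinate dual subgradient updates. It is not the paper's algorithm: the problem instance (a toy quadratic consensus problem on three agents), the graph, the stepsizes, and the activation probabilities are all illustrative assumptions. It does, however, exhibit the features the abstract highlights: one dual block per coupling constraint, local uncoordinated stepsizes, and multiple blocks updated simultaneously with nonuniform probabilities.

```python
import random

# Hypothetical toy instance (values are illustrative, not from the paper):
# minimize sum_i 0.5*(x_i - a_i)^2  subject to x_i = x_j on each edge.
a = [1.0, 4.0, 7.0]          # local data held by three agents on a path graph
edges = [(0, 1), (1, 2)]     # coupling constraints x_i - x_j = 0
lam = [0.0 for _ in edges]   # one dual variable (block) per edge
steps = [0.2, 0.1]           # local stepsizes, no global coordination
probs = [0.7, 0.6]           # nonuniform block-activation probabilities

def primal(lam):
    """Minimize the separable Lagrangian: x_i = a_i - (net dual signal at i)."""
    c = [0.0] * len(a)
    for e, (i, j) in enumerate(edges):
        c[i] += lam[e]
        c[j] -= lam[e]
    return [ai - ci for ai, ci in zip(a, c)]

rng = random.Random(0)
for _ in range(5000):
    x = primal(lam)
    # Each iteration, a random (possibly simultaneous) set of blocks updates;
    # the constraint residual x_i - x_j is a subgradient of the dual function.
    for e, (i, j) in enumerate(edges):
        if rng.random() < probs[e]:
            lam[e] += steps[e] * (x[i] - x[j])

x = primal(lam)
print(x)  # all agents should approach the consensus value mean(a)
```

The quadratic costs are chosen so the local Lagrangian minimizers have a closed form; the paper itself covers the harder nondifferentiable case, where the inner minimization returns a subgradient-consistent point instead.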

    Original language: English
    Pages (from-to): 1348-1359
    Number of pages: 12
    Journal: IEEE Transactions on Control of Network Systems
    Volume: 8
    Issue number: 3
    DOIs
    Publication status: Published - Sept 2021

