Multi-step gradient methods for networked optimization

Euhanna Ghadimi, Iman Shames, Mikael Johansson

Research output: Contribution to journal › Article › peer-review

78 Citations (Scopus)

Abstract

We develop multi-step gradient methods for network-constrained optimization of strongly convex functions with Lipschitz-continuous gradients. Given the topology of the underlying network and bounds on the Hessian of the objective function, we determine the algorithm parameters that guarantee the fastest convergence and characterize situations when significant speed-ups over the standard gradient method are obtained. Furthermore, we quantify how uncertainty in problem data at design-time affects the run-time performance of the gradient method and its multi-step counterpart, and conclude that in most cases the multi-step method outperforms gradient descent. Finally, we apply the proposed technique to three engineering problems: resource allocation under a network-wide budget constraint, distributed averaging, and Internet congestion control. In all cases, our proposed algorithms converge significantly faster than the state of the art.
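As context for the abstract, the multi-step scheme referred to takes the heavy-ball form x(k+1) = x(k) - alpha*grad f(x(k)) + beta*(x(k) - x(k-1)). Below is a minimal, centralized Python sketch using the classical parameter choices derived from the Hessian bounds mu (strong convexity) and L (gradient Lipschitz constant) for a quadratic objective; the function heavy_ball and the test problem are illustrative assumptions, while the distributed, network-constrained variants and their optimal tuning are what the paper itself develops.

import numpy as np

def heavy_ball(grad, x0, mu, L, iters=500):
    # Multi-step (heavy-ball) iteration:
    #   x(k+1) = x(k) - alpha * grad(x(k)) + beta * (x(k) - x(k-1))
    # Classical parameters for mu-strongly convex f with L-Lipschitz gradient.
    alpha = 4.0 / (np.sqrt(L) + np.sqrt(mu)) ** 2
    beta = ((np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))) ** 2
    x_prev, x = x0.copy(), x0.copy()
    for _ in range(iters):
        x, x_prev = x - alpha * grad(x) + beta * (x - x_prev), x
    return x

# Illustrative quadratic f(x) = 0.5 x'Ax - b'x with Hessian eigenvalues in [mu, L].
mu, L = 1.0, 100.0
A = np.diag(np.linspace(mu, L, 20))
b = np.ones(20)
x_hb = heavy_ball(lambda x: A @ x - b, np.zeros(20), mu, L)
print(np.linalg.norm(A @ x_hb - b))  # residual near zero

On ill-conditioned problems (large L/mu), the momentum term beta*(x(k) - x(k-1)) is what yields the speed-up over plain gradient descent that the abstract quantifies: the convergence rate scales with sqrt(L/mu) rather than L/mu.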

Original language: English
Article number: 6578176
Pages (from-to): 5417-5429
Number of pages: 13
Journal: IEEE Transactions on Signal Processing
Volume: 61
Issue number: 21
Publication status: Published - 2013
Externally published: Yes
