Predictive online convex optimization

Antoine Lesage-Landry*, Iman Shames, Joshua A. Taylor

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

We incorporate future information in the form of the estimated value of future gradients in online convex optimization. This is motivated by demand response in power systems, where forecasts about the current round, e.g., the weather or the loads’ behavior, can be used to improve on predictions made with only past observations. Specifically, we introduce an additional predictive step that follows the standard online convex optimization step when certain conditions on the estimated gradient and descent direction are met. We show that under these conditions, and without any assumptions on the predictability of the environment, the predictive update strictly improves on the performance of the standard update. We give two types of predictive updates for various families of loss functions. We provide a regret bound for each of our predictive online convex optimization algorithms. Finally, we apply our framework to an example based on demand response, which demonstrates its superior performance relative to a standard online convex optimization algorithm.
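To make the two-step structure described above concrete, here is a minimal Python sketch, assuming projected online gradient descent as the standard update. The function names (`predictive_ogd_round`, `project`), the step sizes `eta` and `eta_pred`, the ball-shaped feasible set, and the alignment test between the gradient estimate and the observed gradient are illustrative assumptions for this sketch, not the paper's exact conditions or regret-optimal choices.

```python
import numpy as np


def project(x, radius=1.0):
    """Euclidean projection onto a ball of the given radius
    (a hypothetical feasible set used only for this sketch)."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else radius * x / norm


def predictive_ogd_round(x, grad, grad_estimate, eta, eta_pred):
    """One round of projected online gradient descent followed by an
    optional predictive step driven by an estimated future gradient.

    The trigger used here (positive inner product between the gradient
    estimate and the observed gradient) is an illustrative stand-in for
    the paper's conditions, not the exact criterion.
    """
    # Standard online convex optimization update with the observed gradient.
    x_std = project(x - eta * grad)

    # Predictive update: applied only when the estimated gradient is roughly
    # aligned with the observed descent direction (assumed condition).
    if np.dot(grad_estimate, grad) > 0.0:
        return project(x_std - eta_pred * grad_estimate)
    return x_std
```

A toy usage loop, with quadratic losses and a noisy gradient forecast standing in for the kind of exogenous prediction (e.g., a weather or load forecast) the abstract mentions:

```python
# Toy usage: losses f_t(x) = ||x - theta_t||^2 with a noisy gradient forecast.
rng = np.random.default_rng(0)
x = np.zeros(2)
for t in range(50):
    theta_t = np.array([np.sin(0.1 * t), np.cos(0.1 * t)])
    grad = 2.0 * (x - theta_t)                        # observed gradient
    grad_estimate = grad + 0.1 * rng.normal(size=2)   # crude forecast proxy
    x = predictive_ogd_round(x, grad, grad_estimate, eta=0.1, eta_pred=0.05)
```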

Original language: English
Article number: 108771
Journal: Automatica
Volume: 113
DOIs
Publication status: Published - Mar 2020
Externally published: Yes
