Exploiting Problem Structure in Deep Declarative Networks: Two Case Studies

Stephen Gould, Dylan Campbell, Itzik Ben-Shabat, Chamin Hewa Koneputugodage, Zhiwei Xu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Deep declarative networks and other recent related works have shown how to differentiate the solution map of a (continuous) parametrized optimization problem, opening up the possibility of embedding mathematical optimization problems into end-to-end learnable models. These differentiability results can lead to significant memory savings by providing an expression for computing the derivative without needing to unroll the steps of the forward-pass optimization procedure during the backward pass. However, the results typically require inverting a large Hessian matrix, which is computationally expensive when implemented naively. In this work we study two applications of deep declarative networks, robust vector pooling and optimal transport, and show how problem structure can be exploited to obtain very efficient backward-pass computations in terms of both time and memory. Our ideas can be used as a guide for improving the computational performance of other novel deep declarative nodes.
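To make the idea in the abstract concrete, the following PyTorch sketch shows the generic deep declarative recipe for an unconstrained node: the forward pass solves an inner optimization problem with a black-box solver, and the backward pass differentiates the solution map via the implicit function theorem, so none of the forward iterations are stored or unrolled. The objective f (a pseudo-Huber robust pooling penalty), the solver solve_inner_problem, and the RobustPool class are all hypothetical stand-ins chosen for illustration, not the authors' implementation; note also that the Hessian is never inverted explicitly, only used in a linear solve.

```python
import torch

def f(x, y, alpha=1.0):
    # Illustrative robust pooling objective: a pseudo-Huber penalty on the
    # distance from the pooled vector y (d,) to each input point in x (n, d).
    # A stand-in for the paper's objective, not its exact formulation.
    z = torch.linalg.norm(y.unsqueeze(0) - x, dim=1) / alpha
    return (alpha ** 2 * (torch.sqrt(1.0 + z ** 2) - 1.0)).sum()

def solve_inner_problem(x, iters=200, lr=0.1):
    # Hypothetical black-box inner solver: gradient descent from the mean.
    # Any solver works; the backward pass below never sees these steps.
    x = x.detach()
    y = x.mean(dim=0)
    for _ in range(iters):
        with torch.enable_grad():
            yv = y.detach().requires_grad_()
            g, = torch.autograd.grad(f(x, yv), yv)
        y = y - lr * g
    return y.detach()

class RobustPool(torch.autograd.Function):
    # A deep declarative node: forward solves argmin_y f(x, y); backward
    # differentiates the solution map implicitly via Dy(x) = -H^{-1} B,
    # where H = D^2_{yy} f and B = D^2_{xy} f at the solution.
    @staticmethod
    def forward(ctx, x):
        y = solve_inner_problem(x)
        ctx.save_for_backward(x, y)
        return y

    @staticmethod
    def backward(ctx, v):
        x, y = ctx.saved_tensors
        x = x.detach().requires_grad_()
        y = y.detach().requires_grad_()
        with torch.enable_grad():
            fy, = torch.autograd.grad(f(x, y), y, create_graph=True)
            # Hessian H = D^2_{yy} f at the solution, built row by row.
            # Exploiting problem structure would replace this dense
            # construction and the solve below with cheaper closed forms.
            H = torch.stack([
                torch.autograd.grad(fy[i], y, retain_graph=True)[0]
                for i in range(y.numel())
            ])
            # Solve H w = v rather than forming H^{-1} explicitly.
            w = torch.linalg.solve(H, v)
            # Then v^T Dy(x) = -w^T B, obtained with one more autograd call.
            gx, = torch.autograd.grad(fy, x, grad_outputs=w)
        return -gx

x = torch.randn(16, 3, requires_grad=True)
y = RobustPool.apply(x)     # (3,) pooled vector
y.sum().backward()          # gradients flow through the argmin
```

The dense Hessian construction and linear solve are the naive baseline the abstract refers to; the paper's contribution is replacing exactly those two steps with computations tailored to the structure of the robust pooling and optimal transport problems.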
Original language: English
Title of host publication: The AAAI Conference on Artificial Intelligence Workshop on Optimal Transport and Structured Data Modeling (OT-SDM)
Publisher: Association for the Advancement of Artificial Intelligence (AAAI)
Pages: 1-7
Number of pages: 7
Publication status: Published - 2022
