M2SGD: Learning to learn important weights

Nicholas I-Hsien Kuo, Mehrtash Harandi, Nicolas Fourrier, Christian Walder, Gabriela Ferraro, Hanna Suominen

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

4 Citations (Scopus)

Abstract

Meta-learning concerns rapid knowledge acquisition. One popular approach casts optimisation as a learning problem, and learnt neural optimisers have been shown to update base learners more quickly than their handcrafted counterparts. In this paper, we learn an optimisation rule that sparsely updates the learner parameters and removes redundant weights. We present Masked Meta-SGD (M2SGD), a neural optimiser which is not only capable of updating learners quickly, but is also capable of removing 83.71% of the weights of a ResNet20. We release our code at https://github.com/Nic5472K/CLVISION2020-CVPR-M2SGD.
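As a rough illustration of the idea the abstract describes, the sketch below shows one way a learnt, masked update rule could look in PyTorch: a small coordinate-wise recurrent optimiser that outputs both a weight update and a soft keep-mask, so that masked weights are zeroed out (removed) rather than updated. All class and function names here are hypothetical assumptions about the general technique, not the authors' M2SGD implementation; the official code is at https://github.com/Nic5472K/CLVISION2020-CVPR-M2SGD.

# Minimal sketch of a learnt "masked" update rule in the spirit of the
# abstract. This is an illustrative assumption, NOT the released M2SGD code.
import torch
import torch.nn as nn


class MaskedNeuralOptimiser(nn.Module):
    """Coordinate-wise learnt optimiser that emits an update and a keep-mask."""

    def __init__(self, hidden_size: int = 20):
        super().__init__()
        # One small LSTM applied per parameter coordinate (standard
        # learning-to-learn setup); the input is the scalar gradient.
        self.rnn = nn.LSTMCell(1, hidden_size)
        self.update_head = nn.Linear(hidden_size, 1)  # proposed weight update
        self.mask_head = nn.Linear(hidden_size, 1)    # soft keep/remove gate

    def forward(self, grad, state=None):
        # grad: (num_params, 1) flattened per-coordinate gradients.
        h, c = self.rnn(grad, state)
        update = self.update_head(h)             # learnt step, replaces -lr * grad
        mask = torch.sigmoid(self.mask_head(h))  # values near 0 prune the weight
        return update, mask, (h, c)


# Hypothetical usage: apply the learnt rule to a learner's flattened weights.
def apply_step(weights, grads, optimiser, state=None):
    update, mask, state = optimiser(grads.view(-1, 1), state)
    # Sparse update: masked coordinates are simultaneously zeroed (removed)
    # and shielded from further updates.
    new_weights = mask.view_as(weights) * (weights + update.view_as(weights))
    return new_weights, state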

Original language: English
Title of host publication: Proceedings - 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2020
Publisher: IEEE Computer Society
Pages: 957-964
Number of pages: 8
ISBN (Electronic): 9781728193601
DOIs
Publication status: Published - 28 Jul 2020
Event: 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2020 - Virtual, Online, United States
Duration: 14 Jun 2020 - 19 Jun 2020

Publication series

Name: IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops
Volume: 2020-June
ISSN (Print): 2160-7508
ISSN (Electronic): 2160-7516

Conference

Conference: 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2020
Country/Territory: United States
City: Virtual, Online
Period: 14/06/20 - 19/06/20
