Depth Dropout: Efficient Training of Residual Convolutional Neural Networks

Jian Guo, Stephen Gould

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    8 Citations (Scopus)

    Abstract

    Training state-of-the-art deep neural networks is computationally expensive and time consuming. In this paper we present a method that reduces training time while maintaining nearly the same accuracy as traditional training approaches, allowing for faster experimentation and better use of computational resources. Our method extends the well-known dropout technique by randomly removing entire network layers instead of individual neurons during training, thereby reducing the number of expensive convolution operations needed per training iteration. We conduct experiments on object recognition using the CIFAR10 and ImageNet datasets to demonstrate the effectiveness of our approach. Our results show that we can train residual convolutional neural networks (ResNets) 17.5% faster with only a 0.4% decrease in error rate, or 34.1% faster with a 1.3% increase in error rate, compared to a baseline model. We also analyse the trade-off between test accuracy and training speedup as a function of the dropout ratio.
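
    The abstract describes the mechanism only at a high level. As a rough illustration, here is a minimal PyTorch-style sketch of a residual block with depth dropout; it is one plausible reading of the abstract, not the authors' implementation. The class name DepthDropoutBlock and the drop_ratio hyperparameter are hypothetical, and the test-time behaviour (always running the full block) is an assumption the abstract does not spell out.

    ```python
    import torch
    import torch.nn as nn

    class DepthDropoutBlock(nn.Module):
        """Residual block whose convolutional branch is randomly skipped
        during training, leaving only the identity shortcut (hypothetical
        sketch of the depth-dropout idea)."""

        def __init__(self, channels: int, drop_ratio: float = 0.2):
            super().__init__()
            self.drop_ratio = drop_ratio  # assumed hyperparameter name
            self.branch = nn.Sequential(
                nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False),
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False),
                nn.BatchNorm2d(channels),
            )
            self.relu = nn.ReLU(inplace=True)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # During training, skip the whole branch with probability
            # drop_ratio, so its convolutions are never executed on this
            # iteration; at test time the full block always runs.
            if self.training and torch.rand(()).item() < self.drop_ratio:
                return x
            return self.relu(x + self.branch(x))
    ```

    Stacking such blocks gives the claimed saving: with drop_ratio = 0.2, roughly 20% of the residual branches, and hence their convolutions, are skipped on each training iteration, which is the source of the speed/accuracy trade-off the abstract reports.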

    Original language: English
    Title of host publication: 2016 International Conference on Digital Image Computing
    Subtitle of host publication: Techniques and Applications, DICTA 2016
    Editors: Alan Wee-Chung Liew, Jun Zhou, Yongsheng Gao, Zhiyong Wang, Clinton Fookes, Brian Lovell, Michael Blumenstein
    Publisher: Institute of Electrical and Electronics Engineers Inc.
    ISBN (Electronic): 9781509028962
    Publication status: Published - 22 Dec 2016
    Event: 2016 International Conference on Digital Image Computing: Techniques and Applications, DICTA 2016 - Gold Coast, Australia
    Duration: 30 Nov 2016 - 2 Dec 2016

    Publication series

    Name: 2016 International Conference on Digital Image Computing: Techniques and Applications, DICTA 2016

    Conference

    Conference: 2016 International Conference on Digital Image Computing: Techniques and Applications, DICTA 2016
    Country/Territory: Australia
    City: Gold Coast
    Period: 30/11/16 - 2/12/16
