Efficient Hyperparameter Tuning with Dynamic Accuracy Derivative-Free Optimization

Lindon Roberts, Matthias Ehrhardt

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    Abstract

    Many machine learning solutions are framed as optimization problems which rely on good hyperparameters. Algorithms for tuning these hyperparameters usually assume access to exact solutions to the underlying learning problem, which is typically not practical. Here, we apply a recent dynamic accuracy derivative-free optimization method to hyperparameter tuning, which allows inexact evaluations of the learning problem while retaining convergence guarantees. We test the method on the problem of learning elastic net weights for a logistic classifier, and demonstrate its robustness and efficiency compared to a fixed accuracy approach. This demonstrates a promising approach for hyperparameter tuning, with both convergence guarantees and practical performance.
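    As a rough illustration of the idea described in the abstract (not the paper's exact algorithm), the sketch below tunes elastic net hyperparameters for a logistic classifier with a derivative-free compass search, solving the training problem only inexactly and tightening the inner tolerance as the outer step size shrinks. The data, the proximal-gradient inner solver, and all parameter choices are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-in for a real train/validation split (illustrative only).
    n, d = 200, 10
    X = rng.standard_normal((n, d))
    w_true = rng.standard_normal(d)
    y = (X @ w_true + 0.5 * rng.standard_normal(n) > 0).astype(float)
    X_tr, y_tr = X[:150], y[:150]
    X_val, y_val = X[150:], y[150:]

    def logistic_loss(w, X, y):
        s = 2 * y - 1                     # labels mapped to {-1, +1}
        return np.mean(np.log1p(np.exp(-s * (X @ w))))

    def inner_solve(lam1, lam2, tol, max_iter=2000):
        """Inexact lower-level solve: proximal gradient on the elastic net
        logistic problem, stopped once the step norm drops below `tol`."""
        w = np.zeros(d)
        step = 0.1
        s = 2 * y_tr - 1
        for _ in range(max_iter):
            g = -(X_tr.T @ (s / (1.0 + np.exp(s * (X_tr @ w))))) / len(y_tr) + 2 * lam2 * w
            v = w - step * g
            w_new = np.sign(v) * np.maximum(np.abs(v) - step * lam1, 0.0)  # soft threshold
            if np.linalg.norm(w_new - w) <= tol:
                return w_new
            w = w_new
        return w

    def val_loss(theta, tol):
        lam1, lam2 = np.exp(theta)        # hyperparameters live on a log scale
        return logistic_loss(inner_solve(lam1, lam2, tol), X_val, y_val)

    # Compass (pattern) search over the log-hyperparameters. The inner solver's
    # tolerance is tied to the outer step size h, so early iterations use cheap,
    # coarse solves and accuracy is tightened only as the search converges.
    theta, h = np.array([-4.0, -4.0]), 1.0
    for _ in range(20):
        f = val_loss(theta, tol=0.01 * h)
        improved = False
        for delta in ([h, 0.0], [-h, 0.0], [0.0, h], [0.0, -h]):
            if val_loss(theta + np.array(delta), tol=0.01 * h) < f:
                theta = theta + np.array(delta)
                improved = True
                break
        if not improved:
            h *= 0.5                      # shrink step; inner tolerance shrinks too

    f = val_loss(theta, tol=1e-6)         # one accurate evaluation at the end
    print("tuned (lam1, lam2):", np.exp(theta), "validation loss:", round(f, 3))
    ```

    The coupling `tol = 0.01 * h` is the key design choice: far from a solution, cheap low-accuracy training runs suffice to pick a search direction, and the expensive high-accuracy solves are reserved for the final refinement steps.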
    Original language: English
    Title of host publication: Advances in Neural Information Processing Systems
    Editors: H. Larochelle, M. Ranzato, R. Hadsell, M.F. Balcan & H. Lin
    Place of publication: United States
    Publisher: Neural Information Processing Systems Foundation
    ISBN (Print): 9781713829546
    Publication status: Published - 2020
    Event: 34th Conference on Neural Information Processing Systems, NeurIPS 2020 - Vancouver, Canada, Virtual
    Duration: 1 Jan 2020 → …
    https://proceedings.neurips.cc/paper/2020

    Conference

    Conference: 34th Conference on Neural Information Processing Systems, NeurIPS 2020
    Period: 1/01/20 → …
    Other: December 6-12 2020
