An analysis of the anti-learning phenomenon for the class symmetric polyhedron

Adam Kowalczyk*, Olivier Chapelle

*Corresponding author for this work

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    9 Citations (Scopus)

    Abstract

    This paper deals with an unusual phenomenon where most machine learning algorithms yield good performance on the training set but systematically worse than random performance on the test set. This has so far been observed for some natural data sets and demonstrated for some synthetic data sets when the classification rule is learned from a small set of training samples drawn from a high-dimensional space. The initial analysis presented in this paper shows that anti-learning is a property of data sets and is quite distinct from over-fitting of the training data. Moreover, the analysis leads to a specification of some machine learning procedures which can overcome anti-learning and generate machines able to classify training and test data consistently.
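
    The effect described in the abstract can be reproduced on synthetic class-symmetric data. The following is a minimal sketch, not taken from the paper: it builds an assumed Gram matrix in which between-class inner products (c) exceed within-class ones (b), and shows a nearest-centroid classifier scoring perfectly on the training split while dropping to zero accuracy, i.e. below chance, on the held-out points. The parameter values and the choice of nearest-centroid learner are illustrative assumptions, not the paper's construction.

```python
# Minimal sketch (illustrative, not from the paper): anti-learning on a
# class-symmetric configuration where between-class similarity exceeds
# within-class similarity.
import numpy as np

m = 10                    # points per class (assumed value)
b, c = 0.10, 0.15         # within-class vs. between-class inner products, c > b
y = np.array([+1] * m + [-1] * m)

# Gram matrix: <x_i, x_i> = 1, <x_i, x_j> = b if same class, c otherwise.
K = np.where(np.equal.outer(y, y), b, c)
np.fill_diagonal(K, 1.0)

# Recover explicit feature vectors X with X @ X.T == K (K is positive
# definite for these parameter values).
w, V = np.linalg.eigh(K)
X = V * np.sqrt(np.clip(w, 0.0, None))

# Train on the first 5 points of each class, test on the rest.
train = np.r_[0:5, m:m + 5]
test = np.setdiff1d(np.arange(2 * m), train)

mu_pos = X[train][y[train] == +1].mean(axis=0)
mu_neg = X[train][y[train] == -1].mean(axis=0)

def predict(Z):
    # Nearest centroid by Euclidean distance.
    d_pos = np.linalg.norm(Z - mu_pos, axis=1)
    d_neg = np.linalg.norm(Z - mu_neg, axis=1)
    return np.where(d_pos < d_neg, +1, -1)

print("train accuracy:", np.mean(predict(X[train]) == y[train]))  # 1.0
print("test accuracy: ", np.mean(predict(X[test]) == y[test]))    # 0.0 (below chance)
```

    Held-out points are, by construction, more similar to the opposite class's training points than to their own, so any similarity-based rule that fits the training data reverses its decisions on the test data, which is the anti-learning pattern the paper analyses.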

    Original language: English
    Title of host publication: Algorithmic Learning Theory - 16th International Conference, ALT 2005, Proceedings
    Pages: 78-91
    Number of pages: 14
    Publication status: Published - 2005
    Event: 16th International Conference on Algorithmic Learning Theory, ALT 2005 - Singapore, Singapore
    Duration: 8 Oct 2005 - 11 Oct 2005

    Publication series

    Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
    Volume: 3734 LNAI
    ISSN (Print): 0302-9743
    ISSN (Electronic): 1611-3349

    Conference

    Conference: 16th International Conference on Algorithmic Learning Theory, ALT 2005
    Country/Territory: Singapore
    City: Singapore
    Period: 8/10/05 - 11/10/05
