On the convergence speed of MDL predictions for Bernoulli sequences

Jan Poland*, Marcus Hutter

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

10 Citations (Scopus)

Abstract

We consider the Minimum Description Length (MDL) principle for online sequence prediction. If the underlying model class is discrete, then the total expected square loss is a particularly interesting performance measure: (a) this quantity is bounded, implying convergence with probability one, and (b) it additionally specifies a rate of convergence. Generally, for MDL, only exponential loss bounds hold, as opposed to the linear bounds for a Bayes mixture. We show that this is the case even if the model class contains only Bernoulli distributions. We derive a new upper bound on the prediction error for countable Bernoulli classes. This implies a small bound (comparable to the one for Bayes mixtures) for certain important model classes. The results apply to many machine learning tasks, including classification and hypothesis testing. We provide arguments that our theorems generalize to countable classes of i.i.d. models.
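To make the setting concrete, here is a minimal sketch (not the authors' code) of the two predictors the abstract contrasts, over a small discrete class of Bernoulli models. The class `thetas`, the uniform prior weights `w`, the generating parameter `true_theta`, and the horizon `T` are all illustrative assumptions. The MDL predictor picks the single model minimizing the two-part code length -log w(nu) - log nu(x_{<t}); the Bayes mixture averages predictions under the posterior.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical discrete model class: Bernoulli(theta) for these parameters,
# with uniform prior weights w (any summable positive weights would do).
thetas = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
w = np.full(len(thetas), 1.0 / len(thetas))

true_theta = 0.7                      # data-generating parameter (in the class)
T = 200
x = rng.random(T) < true_theta        # Bernoulli(true_theta) sequence

log_lik = np.zeros(len(thetas))       # log nu(x_1..t) for each model nu
sq_loss_mdl = sq_loss_bayes = 0.0

for t in range(T):
    # MDL: choose the model with the shortest two-part code, i.e. maximize
    # log w(nu) + log nu(x_<t), and predict with that single model.
    best = np.argmax(np.log(w) + log_lik)
    p_mdl = thetas[best]

    # Bayes mixture: posterior-weighted average prediction (log-space for
    # numerical stability).
    logp = np.log(w) + log_lik
    post = np.exp(logp - logp.max())
    post /= post.sum()
    p_bayes = post @ thetas

    # Accumulate the instantaneous square prediction error against the
    # true parameter, the quantity whose total the bounds control.
    sq_loss_mdl += (p_mdl - true_theta) ** 2
    sq_loss_bayes += (p_bayes - true_theta) ** 2

    # Update each model's log-likelihood with the observed bit.
    log_lik += np.log(np.where(x[t], thetas, 1.0 - thetas))

print(f"cumulative square loss: MDL={sq_loss_mdl:.3f}, Bayes={sq_loss_bayes:.3f}")
```

On a typical run both cumulative losses stay bounded, as the abstract states for discrete classes; the mixture's loss reflects the linear bounds mentioned above, while the MDL curve depends on which model currently has the shortest two-part code.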

Original language: English
Pages (from-to): 294-308
Number of pages: 15
Journal: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 3244
Publication status: Published - 2004
Externally published: Yes
Event: 15th International Conference ALT 2004: Algorithmic Learning Theory, Padova, Italy
Duration: 2 Oct 2004 – 5 Oct 2004
