Concentration and confidence for discrete Bayesian sequence predictors

Tor Lattimore, Marcus Hutter, Peter Sunehag

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    4 Citations (Scopus)

    Abstract

    Bayesian sequence prediction is a simple technique for predicting future symbols sampled from an unknown measure on infinite sequences over a countable alphabet. While strong bounds on the expected cumulative error are known, there are only limited results on the distribution of this error. We prove tight high-probability bounds on the cumulative error, which is measured in terms of the Kullback-Leibler (KL) divergence. We also consider the problem of constructing upper confidence bounds on the KL and Hellinger errors similar to those constructed from Hoeffding-like bounds in the i.i.d. case. The new results are applied to show that Bayesian sequence prediction can be used in the Knows What It Knows (KWIK) framework with bounds that match the state-of-the-art.
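    To make the setting concrete, here is a minimal sketch (not taken from the paper) of the textbook Bayesian mixture predictor, using a finite class of i.i.d. Bernoulli measures over a binary alphabet; the model class, uniform prior, and true parameter are all illustrative assumptions. It tracks the cumulative KL error between the true one-step predictive distribution and the mixture's predictive distribution, the quantity the abstract's bounds concern.

```python
import numpy as np

# Sketch of Bayesian sequence prediction over a binary alphabet.
# Model class M: i.i.d. Bernoulli measures with P(x_t = 1) = theta.
# Prior: uniform over M. The true measure mu lies in the class.

rng = np.random.default_rng(0)

thetas = np.array([0.1, 0.3, 0.5, 0.7, 0.9])      # illustrative model class
posterior = np.full(len(thetas), 1.0 / len(thetas))  # uniform prior weights
true_theta = 0.7                                   # unknown measure mu

cumulative_kl = 0.0
for t in range(1000):
    # Bayesian predictive probability that the next symbol is 1.
    p_mix = float(posterior @ thetas)

    # One-step KL divergence KL(mu(.|x_<t) || xi(.|x_<t)) for binary symbols.
    p = true_theta
    cumulative_kl += p * np.log(p / p_mix) + (1 - p) * np.log((1 - p) / (1 - p_mix))

    # Sample the next symbol from mu and update the posterior by Bayes' rule.
    x = rng.random() < true_theta
    likelihood = thetas if x else (1.0 - thetas)
    posterior = posterior * likelihood
    posterior /= posterior.sum()

print(f"cumulative KL error after 1000 steps: {cumulative_kl:.3f}")
print(f"ln|M| = ln 5 = {np.log(5):.3f}")  # classical bound on the expected error
```

    With a uniform prior over |M| models, the classical result bounds the expected cumulative KL error by ln|M| (more generally, by ln(1/w(mu)) for prior weight w(mu) on the true measure); the paper's contribution is high-probability, rather than expected-value, versions of such bounds.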

    Original language: English
    Title of host publication: Algorithmic Learning Theory - 24th International Conference, ALT 2013, Proceedings
    Pages: 324-338
    Number of pages: 15
    DOIs
    Publication status: Published - 2013
    Event: 24th International Conference on Algorithmic Learning Theory, ALT 2013 - Singapore, Singapore
    Duration: 6 Oct 2013 – 9 Oct 2013

    Publication series

    Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
    Volume: 8139 LNAI
    ISSN (Print): 0302-9743
    ISSN (Electronic): 1611-3349

    Conference

    Conference: 24th International Conference on Algorithmic Learning Theory, ALT 2013
    Country/Territory: Singapore
    City: Singapore
    Period: 6/10/13 – 9/10/13
