TY - JOUR
T1 - On universal prediction and Bayesian confirmation
AU - Hutter, Marcus
PY - 2007/9/24
Y1 - 2007/9/24
N2 - The Bayesian framework is a well-studied and successful framework for inductive reasoning, which includes hypothesis testing and confirmation, parameter estimation, sequence prediction, classification, and regression. But standard statistical guidelines for choosing the model class and prior are not always available, or can fail, particularly in complex situations. Solomonoff completed the Bayesian framework by providing a rigorous, unique, formal, and universal choice for the model class and the prior. I discuss in breadth how, and in which sense, universal (non-i.i.d.) sequence prediction solves various (philosophical) problems of traditional Bayesian sequence prediction. I show that Solomonoff's model possesses many desirable properties: it satisfies strong total and future bounds and weak instantaneous bounds; in contrast to most classical continuous prior densities, it has no zero p(oste)rior problem, i.e. it can confirm universal hypotheses; it is reparametrization and regrouping invariant; and it avoids the old-evidence and updating problem. It even performs well (actually better) in non-computable environments.
AB - The Bayesian framework is a well-studied and successful framework for inductive reasoning, which includes hypothesis testing and confirmation, parameter estimation, sequence prediction, classification, and regression. But standard statistical guidelines for choosing the model class and prior are not always available, or can fail, particularly in complex situations. Solomonoff completed the Bayesian framework by providing a rigorous, unique, formal, and universal choice for the model class and the prior. I discuss in breadth how, and in which sense, universal (non-i.i.d.) sequence prediction solves various (philosophical) problems of traditional Bayesian sequence prediction. I show that Solomonoff's model possesses many desirable properties: it satisfies strong total and future bounds and weak instantaneous bounds; in contrast to most classical continuous prior densities, it has no zero p(oste)rior problem, i.e. it can confirm universal hypotheses; it is reparametrization and regrouping invariant; and it avoids the old-evidence and updating problem. It even performs well (actually better) in non-computable environments.
KW - (non)Computable environments
KW - Bayes
KW - Black raven paradox
KW - Confirmation theory
KW - Kolmogorov complexity
KW - Model classes
KW - Occam's razor
KW - Old-evidence/updating problem
KW - Philosophical issues
KW - Prediction bounds
KW - Reparametrization invariance
KW - Sequence prediction
KW - Solomonoff prior
KW - Symmetry principle
UR - http://www.scopus.com/inward/record.url?scp=34548243292&partnerID=8YFLogxK
U2 - 10.1016/j.tcs.2007.05.016
DO - 10.1016/j.tcs.2007.05.016
M3 - Article
SN - 0304-3975
VL - 384
SP - 33
EP - 48
JO - Theoretical Computer Science
JF - Theoretical Computer Science
IS - 1
ER -