Scoring probability forecasts for point processes: The entropy score and information gain

Daryl J. Daley, David Vere-Jones

    Research output: Contribution to journal › Article › peer-review


    Abstract

    The entropy score of an observed outcome that has been given a probability forecast p is defined to be −log p. If p is derived from a probability model and there is a background model for which the same outcome has probability π, then the log ratio log(p/π) is the probability gain, and its expected value the information gain, for that outcome. Such concepts are closely related to the likelihood of the model and its entropy rate. The relationships between these concepts are explored in the case that the outcomes in question are the occurrence or nonoccurrence of events in a stochastic point process. It is shown that, in such a context, the mean information gain per unit time, based on forecasts made at arbitrary discrete time intervals, is bounded above by the entropy rate of the point process. Two examples illustrate how the information gain may be related to realizations with a range of values of ‘predictability’.
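
    For orientation, the abstract's central quantities can be restated in standard notation; the symbols S, G, I and H below are labels introduced here for illustration and are not taken from the paper:

    \[
    S = -\log p, \qquad
    G = \log\frac{p}{\pi}, \qquad
    I = \mathbb{E}\!\left[\log\frac{p}{\pi}\right],
    \]

    so that S is the entropy score of an outcome forecast with probability p, G is its probability gain relative to a background model assigning the same outcome probability π, and I is the information gain. The paper's main bound then says that the mean information gain per unit time, for forecasts issued at discrete time intervals, is at most H, the entropy rate of the point process.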

    Original language: English
    Pages (from-to): 297-312
    Number of pages: 16
    Journal: Journal of Applied Probability
    Volume: 41A
    DOIs
    Publication status: Published - 2004
