Variants of Solomonoff induction

From Machinelearning
Revision as of 02:56, 31 March 2019 by IssaRice

This page lists some variants of Solomonoff induction.

For determinism, I think "deterministic" corresponds to the Solomonoff prior (a sum over programs for a deterministic machine), while "stochastic" corresponds to the universal mixture (a weighted sum over semimeasures).
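To make the stochastic/universal-mixture side concrete, here is a minimal Python sketch. It replaces the full class of lower-semicomputable semimeasures with a tiny hand-picked class (three Bernoulli coins with made-up biases and uniform weights); everything in it is an illustrative assumption, not taken from the sources cited below.

```python
# A minimal sketch of a "universal mixture", assuming a toy environment
# class (three Bernoulli coins) in place of the full class of
# lower-semicomputable semimeasures.

def bernoulli(theta):
    """Return nu(x): the probability that a coin with bias theta emits
    the bit string x."""
    def nu(x):
        p = 1.0
        for b in x:
            p *= theta if b == 1 else 1 - theta
        return p
    return nu

# Illustrative environment class and prior weights (any positive weights
# summing to at most 1 would do).
envs = [bernoulli(0.2), bernoulli(0.5), bernoulli(0.8)]
weights = [1 / 3, 1 / 3, 1 / 3]

def mixture(x):
    """xi(x) = sum_i w_i * nu_i(x), the mixture probability of x."""
    return sum(w * nu(x) for w, nu in zip(weights, envs))

def predict_next(x):
    """Predictive probability that the next bit is 1: xi(x + [1]) / xi(x)."""
    return mixture(list(x) + [1]) / mixture(x)

# After a run of 1s, the mixture's prediction shifts toward the 0.8 coin.
print(predict_next([1, 1, 1, 1, 1, 1]))
```

The point of the sketch is the Bayesian mechanics: after observing a run of 1s, the mixture's predictive probability for the next bit drifts toward the environment that best explains the data.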

For discrete vs. continuous, I think this just refers to whether the prior is defined over finite strings or over infinite sequences (where we ask for the probability that an infinite sequence starts with a given finite string).
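The discrete case can be illustrated with a toy Python sketch. The "machine" below is a made-up prefix-free lookup table, not a universal machine; a real universal machine has infinitely many programs, so the sum defining the prior is only approximable from below.

```python
# A minimal sketch, assuming a made-up prefix-free machine given as an
# explicit program -> output table (the codewords form a prefix-free set).
TOY_MACHINE = {
    "0":   "1",    # program "0" outputs "1"
    "10":  "11",
    "110": "1",    # two distinct programs may output the same string
    "111": "0",
}

def discrete_prior(x):
    """m(x) = sum of 2^(-len(p)) over programs p with U(p) = x."""
    return sum(2.0 ** -len(p) for p, out in TOY_MACHINE.items() if out == x)

# "1" is output by programs "0" and "110": 2^-1 + 2^-3 = 0.625.
print(discrete_prior("1"))
```

Because the programs form a prefix-free set, the Kraft inequality guarantees the total mass assigned to all outputs is at most 1 (here it is exactly 1).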

| Source | Formula | Determinism | Type of machine used | Discrete vs. continuous |
|---|---|---|---|---|
| LessWrong Wiki[1] | $m(y_0) = \sum_{p \in P : U(p) = y_0} 2^{-\ell(p)}$, where $P$ is the set of self-delimiting programs | Deterministic | The page doesn't say, but it uses self-delimiting programs and is discrete, so presumably a prefix Turing machine | Discrete, because the output string $y_0$ is finite |
| Scholarpedia discrete universal a priori probability[2] | $m(x) = \sum_{p : U(p) = x} 2^{-\ell(p)}$, where the sum is over halting programs | Deterministic? | Prefix Turing machine | Discrete |
| Scholarpedia continuous universal a priori probability[2] | $M(x) = \sum_{p : U(p) = x*} 2^{-\ell(p)}$, where the sum is over minimal programs | Deterministic? | Monotone Turing machine | Continuous |
| Sterkenburg (p. 22)[3] | $P_I(\sigma) = \lim_{n \to \infty} \lvert T_{\sigma,n} \rvert / \lvert T_n \rvert$, where $T_n$ is the set of all halting (valid) inputs of length $n$ to the reference machine $U$, and $T_{\sigma,n}$ is the set of all halting (valid) inputs of length $n$ that output something starting with $\sigma$ | | | |
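Sterkenburg's limit-of-ratios formula can be simulated by brute force for a toy self-delimiting machine. The machine below is a made-up example (not from the paper): a valid program has the form $1^k 0\,\text{payload}$ with the payload of length $k$, and the output is the payload.

```python
# Brute-force sketch of the limit-of-ratios definition, using a made-up
# self-delimiting machine: count halting length-n inputs whose output
# starts with sigma, and divide by the number of halting length-n inputs.
from itertools import product

def run(program):
    """Return the toy machine's output, or None if the program is invalid
    (i.e., non-halting). Valid programs look like 1^k 0 payload with
    len(payload) == k."""
    k = 0
    while k < len(program) and program[k] == "1":
        k += 1
    if k < len(program) and program[k] == "0" and len(program) == 2 * k + 1:
        return program[k + 1:]
    return None

def ratio(sigma, n):
    """|T_{sigma,n}| / |T_n|: the fraction of halting length-n inputs whose
    output starts with sigma (None if no length-n input halts)."""
    halting = ["".join(bits) for bits in product("01", repeat=n)
               if run("".join(bits)) is not None]
    if not halting:
        return None
    return sum(run(p).startswith(sigma) for p in halting) / len(halting)

# For this machine the ratio is exactly 1/2 at every odd n >= 3,
# so the limit P_I("1") = 1/2.
print(ratio("1", 7))
```

For this toy machine the payloads of length $k$ are uniformly distributed, so the ratio is constant in $n$ and the limit is immediate; for a genuine universal machine the ratio only settles down in the limit.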

References

  1. https://wiki.lesswrong.com/wiki/Solomonoff_induction
  2. Marcus Hutter; Shane Legg; Paul M. B. Vitányi. "Algorithmic probability". Scholarpedia. 2007.
  3. Tom Florian Sterkenburg. "The Foundations of Solomonoff Prediction". February 2013.