Solomonoff induction

Solomonoff induction is an idealized method of sequence prediction.
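
In one common formulation, fix a universal monotone machine $U$ and define the Solomonoff prior of a finite binary string $x$ as

$$M(x) \;=\; \sum_{p \,:\, U(p) = x*} 2^{-|p|},$$

the sum ranging over (minimal) programs $p$ whose output begins with $x$. Prediction is then just conditioning: the probability that $x$ continues with the bit $a$ is $M(xa)/M(x)$. Other formulations differ in the details, which is the subject of the next section.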

Variants

There are many variants of Solomonoff induction, which all seem to turn out essentially equivalent, but most resources only discuss one or two of them, which makes things confusing for someone trying to learn about the subject.

Comparison dimensions:

  • Determinism: are the programs probabilistic or deterministic? If they are deterministic, randomness has to enter through the input instead: the machine is fed random coin flips (see the formulas after this list).
  • Prediction length: finite vs infinite sequences
  • Solomonoff prior vs universal mixture (also contrasted after this list)
  • Model selection vs sequence prediction?
  • Discrete vs continuous semimeasures? (is this the same as the finite vs infinite distinction?)
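
To make the first and third of these dimensions concrete: in the deterministic-program formulation, $M(x)$ is the probability that a universal monotone machine $U$, fed an infinite stream of fair coin flips as input, produces output beginning with $x$. In the mixture formulation, one instead defines the universal mixture

$$\xi(x) \;=\; \sum_{\nu} w_\nu \, \nu(x),$$

a weighted sum over an enumeration of all lower-semicomputable (continuous) semimeasures $\nu$, with weights typically of the form $w_\nu = 2^{-K(\nu)}$. The two constructions dominate each other up to multiplicative constants, which is one reason the variants all turn out essentially the same.

The following Python sketch is a toy stand-in for the mixture formulation, not any standard implementation: it mixes a handful of hand-picked hypotheses with made-up prior weights, just to show how prediction by conditioning a prior-weighted mixture works.

# Toy, finite analogue of the universal mixture xi: instead of all lower
# semicomputable semimeasures, mix a few hand-picked hypotheses, each of
# which assigns a probability to every finite binary string (given as a
# list of 0s and 1s). All names and weights are illustrative.

def all_zeros(x):
    # Deterministic hypothesis "the sequence is 000...": probability 1 on its prefixes, else 0.
    return 1.0 if all(b == 0 for b in x) else 0.0

def all_ones(x):
    return 1.0 if all(b == 1 for b in x) else 0.0

def alternating(start):
    # Deterministic hypothesis "the sequence alternates, beginning with `start`".
    def h(x):
        return 1.0 if all(b == (start + i) % 2 for i, b in enumerate(x)) else 0.0
    return h

def fair_coin(x):
    # I.i.d. fair-coin hypothesis: every length-n string gets probability 2^-n.
    return 0.5 ** len(x)

# Prior weights shaped like 2^-(description length); the "lengths" are invented for the toy.
HYPOTHESES = [
    (all_zeros,      2 ** -2),
    (all_ones,       2 ** -2),
    (alternating(0), 2 ** -3),
    (alternating(1), 2 ** -3),
    (fair_coin,      2 ** -1),
]

def mixture(x):
    # Toy analogue of xi(x): prior-weighted sum of the hypotheses' probabilities for x.
    return sum(w * h(x) for h, w in HYPOTHESES)

def predict_one(x):
    # Predictive probability that the next bit is 1, i.e. xi(x1) / xi(x).
    return mixture(list(x) + [1]) / mixture(x)

if __name__ == "__main__":
    history = [0, 1, 0, 1, 0]
    print(predict_one(history))  # ~0.94: the alternating-from-0 hypothesis dominates the posterior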

List of variants considered in various resources:

  • Solomonoff's original paper:
  • Eliezer's Solomonoff induction dialogue:
  • Li and Vitanyi:
  • LessWrong Wiki:
  • Scholarpedia article:
  • Shane Legg's introduction:
  • Sterkenburg's "The Foundations of Solomonoff Prediction":
  • Marcus Hutter's papers...

Significance of random coin flips

Measures vs semimeasures
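
Briefly, stated here for orientation in the standard terminology: a continuous semimeasure on binary strings is a function $\nu$ with $\nu(\epsilon) \le 1$ and $\nu(x) \ge \nu(x0) + \nu(x1)$ for all $x$; it is a measure when both hold with equality. The Solomonoff prior $M$ is only a semimeasure, because some programs produce output beginning with $x$ and then never extend it, so probability mass "leaks".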

Turing machines, prefix machines, monotone machines

Applications

  • There are some standard applications; see the standard references.
  • Carl's analogy to psychophysical laws
  • The "malign universal prior" arguments