Markov chain

A sequence of discrete random variables such that each member of the sequence is probabilistically dependent only on its immediate predecessor; conditional on the present state, the future is independent of the past. An ergodic Markov chain converges to a unique stationary distribution regardless of its starting state, so in the long run its value at any time has the same statistical properties as its value at any other time.

Subjects: Probability and Statistics.
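The defining property above (the next state depends only on the current one) can be sketched with a small simulation. This is a minimal illustration, not part of the entry: the two-state chain and its transition probabilities are hypothetical, chosen so the stationary distribution is easy to verify by hand.

```python
import random

# Hypothetical two-state chain for illustration: states 0 and 1.
# transition[i][j] = P(next state = j | current state = i)
transition = [
    [0.9, 0.1],
    [0.5, 0.5],
]

def step(state, rng):
    """Draw the next state; it depends only on the current state (Markov property)."""
    return 0 if rng.random() < transition[state][0] else 1

def simulate(n_steps, start=0, seed=42):
    """Run the chain and return the empirical frequency of each state."""
    rng = random.Random(seed)
    state = start
    counts = [0, 0]
    for _ in range(n_steps):
        state = step(state, rng)
        counts[state] += 1
    return [c / n_steps for c in counts]

# This chain is ergodic, so the empirical frequencies approach its
# stationary distribution pi, which solves pi = pi P:
#   pi_0 = 0.9*pi_0 + 0.5*pi_1  =>  pi_0 = 5/6, pi_1 = 1/6
freqs = simulate(100_000)
```

Because the chain is ergodic, `freqs` comes out close to (5/6, 1/6) whichever start state is used; a long time average over one realisation matches the stationary probabilities.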
