Markov source

Quick Reference

A Markov chain whose random variables are regarded as internal states, together with a mapping from these internal states to the symbols of some external alphabet. The mapping need not be a bijection. A Markov source is ergodic if and only if its underlying Markov chain is ergodic. See also discrete source.
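
The following is a minimal sketch, not part of the original entry, of how such a source can be modelled in Python: internal states evolve according to a Markov chain, and each state is mapped to a symbol of an external alphabet, with the mapping allowed to be many-to-one. The class name MarkovSource and its fields are illustrative assumptions.

    import random

    class MarkovSource:
        def __init__(self, transitions, emission, initial):
            # transitions[s] maps internal state s to a dict of
            # next-state probabilities (a row of the chain's matrix)
            self.transitions = transitions
            # emission[s] is the external-alphabet symbol produced in
            # state s; several states may map to the same symbol,
            # since the mapping need not be a bijection
            self.emission = emission
            self.state = initial

        def emit(self, n):
            # Generate n symbols by stepping the underlying Markov chain.
            out = []
            for _ in range(n):
                out.append(self.emission[self.state])
                next_states, probs = zip(*self.transitions[self.state].items())
                self.state = random.choices(next_states, weights=probs)[0]
            return out

    # Example: states 0 and 1 both emit 'a', state 2 emits 'b'.
    source = MarkovSource(
        transitions={0: {1: 0.5, 2: 0.5}, 1: {0: 1.0}, 2: {0: 0.3, 2: 0.7}},
        emission={0: 'a', 1: 'a', 2: 'b'},
        initial=0,
    )
    print(''.join(source.emit(20)))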

Subjects: Computing.
