Markov chain, n.
a Markov process whose parameter takes discrete time values
<noun.process>
synonyms: Markoff chain, Markov chain
semantic pointers
hypernym: Markov process, Markoff process
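The definition above can be made concrete with a small sketch: the "parameter" that takes discrete time values is the step index n = 0, 1, 2, ..., and the chain evolves by applying a fixed transition matrix once per step. The two-state weather chain below, with its states and probabilities, is an illustrative assumption and not part of the entry.

```python
STATES = ["sunny", "rainy"]  # hypothetical example states

# P[i][j] = probability of moving from state i to state j in one step.
P = [
    [0.9, 0.1],  # sunny -> sunny, sunny -> rainy
    [0.5, 0.5],  # rainy -> sunny, rainy -> rainy
]

def step(dist, P):
    """One discrete time step: new_dist[j] = sum_i dist[i] * P[i][j]."""
    return [sum(dist[i] * P[i][j] for i in range(len(dist)))
            for j in range(len(P[0]))]

def evolve(dist, P, n):
    """Distribution over states after n discrete time steps."""
    for _ in range(n):
        dist = step(dist, P)
    return dist

if __name__ == "__main__":
    start = [1.0, 0.0]          # start surely sunny
    print(evolve(start, P, 1))  # [0.9, 0.1]
    print(evolve(start, P, 50)) # approaches the stationary distribution
```

Because time advances in unit steps rather than continuously, this is a Markov chain in the entry's sense; its hypernym, the Markov process, also covers the continuous-time case.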