Markov chain, n.
a Markov process in which the parameter takes discrete time values
<noun.process>

Markoff chain, Markov chain
hypernym: Markov process
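The defining property above (discrete time steps, next state depending only on the current state) can be illustrated with a short sketch; the state names and transition probabilities below are invented for the example.

```python
import random

# Illustrative two-state chain (states and probabilities are assumptions,
# not part of the dictionary entry). Each row gives the probabilities of
# moving to each next state, conditioned only on the current state.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def simulate(start, steps):
    """Walk the chain for `steps` discrete time steps from `start`."""
    state, path = start, [start]
    for _ in range(steps):
        # The next state is drawn from the current state's row alone:
        # the Markov property for a discrete-time parameter.
        state = random.choices(
            list(TRANSITIONS[state]),
            weights=list(TRANSITIONS[state].values()),
        )[0]
        path.append(state)
    return path

print(simulate("sunny", 5))
```

Because the parameter is discrete, the trajectory is simply a sequence of states indexed by integer time steps, as the printed list shows.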