Markov chain
noun
: a usually discrete stochastic process (such as a random walk) in which the probabilities of occurrence of various future states depend only on the present state of the system or on the immediately preceding state and not on the path by which the present state was achieved
called also Markoff chain
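
The defining property (the next state depends only on the present state, not on the path taken to reach it) can be seen in a minimal Python sketch of the random walk mentioned in the definition; the function name random_walk and the step probability of 0.5 are illustrative assumptions, not part of the entry:

    import random

    def random_walk(steps, start=0, p_up=0.5):
        """Simulate a simple random walk: each step depends only on the current state."""
        state = start
        path = [state]
        for _ in range(steps):
            # The transition looks only at the present state, never at the history.
            state += 1 if random.random() < p_up else -1
            path.append(state)
        return path

    print(random_walk(10))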