Markov chain

noun

: a usually discrete stochastic process (such as a random walk) in which the probabilities of occurrence of various future states depend only on the present state of the system or on the immediately preceding state and not on the path by which the present state was achieved
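The defining property above (the next state depends only on the present state, not on the path taken to reach it) can be sketched in code. The following is a minimal illustrative example, not part of the dictionary entry: a hypothetical two-state "weather" chain with made-up transition probabilities.

```python
import random

# Hypothetical transition probabilities (illustrative only):
# from each current state, the chance of each next state.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state, rng):
    """Sample the next state using only the current state (the Markov property)."""
    states, weights = zip(*TRANSITIONS[state])
    return rng.choices(states, weights=weights, k=1)[0]

def walk(start, n, seed=0):
    """Generate an n-step realization of the chain from a starting state."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path

print(walk("sunny", 5, seed=1))
```

Note that `step` never looks at earlier states in `path`; that restriction to the present state is exactly what makes the process a Markov chain rather than a general stochastic process.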

called also Markoff chain

Examples of Markov chain in a Sentence

Recent Examples on the Web
These examples are automatically compiled from online sources to illustrate current usage.
Similar to the way that calculators for high school algebra work, Markov chain Monte Carlo takes in a mathematical equation from the user and figures out the values of each variable that maximize the predictive accuracy of the model. (G. Elliott Morris, ABC News, 23 Oct. 2024)

That model uses Markov chain Monte Carlo to simulate tens of thousands of different ways the election could go, each time varying the hundreds of parameters in our model. (G. Elliott Morris, ABC News, 11 June 2024)

Review farms used to use Markov chain generators, an algorithm that can create rudimentary sentences by using common phrases and probability to predict sentence structures. (Simon Hill, WIRED, 2 Nov. 2022)

There are also quantum walks equivalent to the Markov chain and Markov walks. (Bhagvan Kommadi, Forbes, 3 Aug. 2022)

The game ostensibly hinges on chance due to its reliance on dice, but in actuality, Chutes and Ladders is a stochastic system called an absorbing Markov chain, meaning just a portion of the system contains randomness. (Leila Sloman, Popular Mechanics, 28 Feb. 2022)

The results were mixed: Their model predicted the movement of cabs better than a usual Markov chain, but neither model was very reliable. (Quanta Magazine, 19 Aug. 2021)

Nowadays, the Markov chain is a fundamental tool for exploring spaces of conceptual entities much more general than poems. (Jordan Ellenberg, Scientific American, 11 June 2021)

Application of Approximate Bayesian Computation Markov chain Monte Carlo to these sequence data using a simple forward simulator revealed broad posterior distributions of the selective parameters for all four genes, providing no support for positive selection. (Razib Khan, Discover Magazine, 9 Mar. 2012)

Word History

Etymology

A. A. Markov †1922 Russian mathematician

First Known Use

1938, in the meaning defined above

Cite this Entry

“Markov chain.” Merriam-Webster.com Dictionary, Merriam-Webster, https://www.merriam-webster.com/dictionary/Markov%20chain. Accessed 5 Nov. 2024.
