Markov chain
noun. Etymology: A. A. Markov, died 1922, Russian mathematician. Date: 1938. A usually discrete stochastic process (as a random walk) in which the probabilities of occurrence of various future states depend only on the present state of the system, or on the immediately preceding state, and not on the path by which the present state was reached — called also Markoff chain

New Collegiate Dictionary. 2001.
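The definition above can be illustrated with a short simulation. The sketch below uses a hypothetical two-state chain (the states "sunny" and "rainy" and their transition probabilities are invented for illustration); each next state is drawn using only the current state, which is exactly the Markov property the entry describes.

```python
import random

# Hypothetical transition table: each row lists (next_state, probability)
# pairs for one current state. The numbers are illustrative only.
TRANSITIONS = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def step(state, rng):
    """Draw the next state using only the current state (Markov property)."""
    states, weights = zip(*TRANSITIONS[state])
    return rng.choices(states, weights=weights)[0]

def walk(start, n_steps, seed=0):
    """Simulate n_steps transitions of the chain starting from `start`."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path

print(walk("sunny", 5, seed=42))
```

Note that the walk's future depends on its history only through `path[-1]`: the whole path is never consulted, matching "not on the path by which the present state was achieved."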

Look at other dictionaries:

  • Markov chain — Markov chain, n. [after A. A. Markov, Russian mathematician, b. 1856, d. 1922.] (Statistics) A random process (Markov process) in which the probabilities of discrete states in a series depend only on the properties of the immediately preceding… …   The Collaborative International Dictionary of English

  • Markov chain — A simple two state Markov chain. A Markov chain, named for Andrey Markov, is a mathematical system that undergoes transitions from one state to another, between a finite or countable number of possible states. It is a random process characterized …   Wikipedia

  • Markov-Chain — A Markow-Kette (English: Markov chain; also Markow-Prozess, after Andrei Andreyevich Markov; other spellings: Markov-Kette, Markoff-Kette) is a special class of stochastic processes. One distinguishes a Markov chain in… …   Deutsch Wikipedia

  • Markov chain — Markovo grandis; status: T; field: automatic control; equivalents: Engl. Markov chain; Ger. Markovkette, f; Rus. марковская цепь, f; цепь Маркова, f; Fr. suite markovienne, f …   Automatikos terminų žodynas

  • Markov chain — /mahr kawf/, Statistics. a Markov process restricted to discrete random events or to discontinuous time sequences. Also, Markoff chain. [1940 45; see MARKOV PROCESS] * * * …   Universalium

  • Markov chain — noun a Markov process for which the parameter is discrete time values • Syn: ↑Markoff chain • Hypernyms: ↑Markov process, ↑Markoff process …   Useful english dictionary

  • Markov chain — noun A discrete-time stochastic process with the Markov property …   Wiktionary

  • Markov chain geostatistics — refers to the Markov chain models, simulation algorithms, and associated spatial correlation measures (e.g., the transiogram) based on Markov chain random field theory, which extends a single Markov chain into a multi-dimensional field for… …   Wikipedia

  • Markov Chain Monte Carlo — methods (MCMC methods for short; less commonly also Markov-chain Monte Carlo methods) are a class of algorithms that draw samples from probability distributions. This is done on the basis of constructing a Markov chain,… …   Deutsch Wikipedia
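The MCMC entry above describes drawing samples from a probability distribution by constructing a Markov chain. A minimal sketch of one such construction is the Metropolis algorithm; in the example below the target distribution (a standard normal), the step size, and the function name are my own illustrative choices, not taken from any of the cited entries.

```python
import math
import random

def metropolis_normal(n_samples, step_size=1.0, seed=0):
    """Metropolis sampler whose Markov chain has the standard normal
    N(0, 1) as its stationary distribution (a minimal sketch)."""
    rng = random.Random(seed)
    log_p = lambda v: -0.5 * v * v  # unnormalized log-density of N(0, 1)
    x = 0.0
    samples = []
    for _ in range(n_samples):
        # Propose a symmetric random step around the current state.
        proposal = x + rng.uniform(-step_size, step_size)
        # Accept with probability min(1, p(proposal) / p(x)); else stay put.
        if math.log(max(rng.random(), 1e-300)) < log_p(proposal) - log_p(x):
            x = proposal
        samples.append(x)
    return samples

draws = metropolis_normal(5000, seed=1)
print(sum(draws) / len(draws))  # sample mean; should be near 0
```

The successive values form a Markov chain: each proposal and acceptance decision uses only the current state `x`, and after many steps the visited states are distributed approximately according to the target density.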
