dict.org

The DICT Development Group


3 definitions found for Markov chain
From The Collaborative International Dictionary of English v.0.48 :

  Markov chain \Mark"ov chain\, n. [after A. A. Markov, Russian
     mathematician, b. 1856, d. 1922.] (Statistics)
     A random process (Markov process) in which the probabilities
     of discrete states in a series depend only on the properties
     of the immediately preceding state or the next preceding
     state, independent of the path by which the preceding state
     was reached. It differs from the more general Markov process
     in that the states of a Markov chain are discrete rather than
     continuous. Certain physical processes, such as diffusion of
     a molecule in a fluid, are modelled as a Markov chain. See
     also random walk. [Also spelled Markoff chain.]
     [PJC]
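
As an illustrative aside (not part of the dictionary entry above), the short
Python sketch below simulates a symmetric random walk, one of the simplest
discrete-state Markov chains: each position depends only on the immediately
preceding one. The step probability of 0.5 and the walk length are assumptions
chosen purely for demonstration.

  # Illustrative sketch only: a symmetric random walk, a simple
  # discrete-state Markov chain (the next position depends only on
  # the current one).  Step probability and length are assumptions.
  import random

  def random_walk(steps, p_up=0.5):
      position = 0
      path = [position]
      for _ in range(steps):
          # The transition depends only on the current position.
          position += 1 if random.random() < p_up else -1
          path.append(position)
      return path

  print(random_walk(10))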

From WordNet (r) 3.0 (2006) :

  Markov chain
      n 1: a Markov process for which the parameter is discrete time
           values [syn: Markov chain, Markoff chain]

From The Free On-line Dictionary of Computing (30 December 2018) :

  Markov chain
  
      (Named after Andrei Markov) A model of
     sequences of events in which the probability of each event
     depends only on the event that immediately preceded it.
  
     A Markov chain is a Markov process with a discrete set of
     states.
  
     In simulation, the principle of the Markov chain is applied
     to the selection of samples from a probability density
     function to be applied to the model.  Simscript II.5 uses
     this approach for some modelling functions.
  
     [Better explanation?]
  
     (1995-02-23)
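
The simulation use described above (selecting samples from a probability
density function by running a Markov chain) is what is now commonly called
Markov chain Monte Carlo. The sketch below uses the standard Metropolis rule
on an assumed target density (an unnormalised standard normal); it is a
generic illustration and says nothing about how Simscript II.5 implements
its modelling functions.

  # Illustrative sketch only: Metropolis sampling, i.e. a Markov chain
  # whose states are draws from an (assumed) target density.  The
  # target, step size, and sample count are demonstration choices.
  import math
  import random

  def target(x):
      return math.exp(-0.5 * x * x)      # unnormalised standard normal

  def metropolis(n_samples, step=1.0):
      x = 0.0
      samples = []
      for _ in range(n_samples):
          proposal = x + random.uniform(-step, step)
          # Accept with probability min(1, target(proposal) / target(x));
          # the decision depends only on the current state x.
          if random.random() < min(1.0, target(proposal) / target(x)):
              x = proposal
          samples.append(x)
      return samples

  draws = metropolis(5000)
  print(sum(draws) / len(draws))         # sample mean, roughly 0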
  
