dict.org

The DICT Development Group


3 definitions found for Markov process
From The Collaborative International Dictionary of English v.0.48 :

  Markov process \Mark"ov pro`cess\, n. [after A. A. Markov,
     Russian mathematician, b. 1856, d. 1922.] (Statistics)
     a random process in which the probabilities of states in a
     series depend only on the properties of the immediately
     preceding state or the next preceding state, independent of
     the path by which the preceding state was reached. It is
     distinguished from a Markov chain in that the states of a
     Markov process may be continuous as well as discrete. [Also
     spelled Markoff process.]
     [PJC]
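
  The entry's distinction is that a Markov process may have continuous
  states. As a concrete illustration, here is a minimal Python sketch of
  a Gaussian random walk, one of the simplest continuous-state Markov
  processes; the function name and parameters are illustrative, not
  drawn from the entry or any library.

  import random

  def gaussian_random_walk(steps, x0=0.0, sigma=1.0):
      # Illustrative continuous-state Markov process: each new state is
      # drawn from a Gaussian centered on the current state, so the
      # distribution of the next state depends only on the present one,
      # not on the path taken to reach it.
      x = x0
      path = [x]
      for _ in range(steps):
          x += random.gauss(0.0, sigma)
          path.append(x)
      return path

  print(gaussian_random_walk(5))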

From WordNet (r) 3.0 (2006) :

  Markov process
      n 1: a simple stochastic process in which the distribution of
           future states depends only on the present state and not on
           how it arrived in the present state [syn: Markov process,
           Markoff process]
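
  In symbols, the property both entries describe is usually written as
  follows (a standard textbook formulation, not part of either
  dictionary entry): for a process X_0, X_1, X_2, ...,

      P(X_{n+1} = x | X_n, X_{n-1}, ..., X_0) = P(X_{n+1} = x | X_n)

  i.e. conditioning on the whole history gives the same distribution as
  conditioning on the present state alone.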

From The Free On-line Dictionary of Computing (30 December 2018) :

  Markov process
  
     A process in which the sequence of events can be described by a
     Markov chain.
  
     (1995-02-23)
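
  Since FOLDOC defines the term by reference to a Markov chain, a
  minimal Python sketch of a discrete two-state chain may help; the
  states and transition probabilities below are invented for
  illustration.

  import random

  # Hypothetical two-state chain; the probabilities are made up.
  TRANSITIONS = {
      "sunny": [("sunny", 0.8), ("rainy", 0.2)],
      "rainy": [("sunny", 0.4), ("rainy", 0.6)],
  }

  def step(state):
      # Pick the next state using only the current state's row of
      # transition probabilities (the Markov property).
      r = random.random()
      cumulative = 0.0
      for nxt, p in TRANSITIONS[state]:
          cumulative += p
          if r < cumulative:
              return nxt
      return nxt  # guard against floating-point rounding at r near 1.0

  def simulate(start, steps):
      chain = [start]
      for _ in range(steps):
          chain.append(step(chain[-1]))
      return chain

  print(simulate("sunny", 10))

  Each run prints one sample path; the transition table fully describes
  the sequence of events, as the entry says.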
  

Contact: webmaster@dict.org  Specification: RFC 2229