1. WordNet® 3.0 (2006)
Markov process
    n 1: a simple stochastic process in which the distribution of
         future states depends only on the present state and not on
         how it arrived in the present state [syn: Markov process,
         Markoff process]
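
   In standard notation (not part of the WordNet entry), this
   property can be written for a discrete-time process as

       P(X_{n+1} = x | X_n = x_n, ..., X_0 = x_0) = P(X_{n+1} = x | X_n = x_n),

   i.e. conditioning on the full history adds nothing beyond the
   present state.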

2. The Collaborative International Dictionary of English v.0.48
Markov process \Mark"ov pro`cess\, n. [after A. A. Markov,
   Russian mathematician, b. 1856, d. 1922.] (Statistics)
   a random process in which the probabilities of states in a
   series depend only on the properties of the immediately
   preceding state or the next preceding state, independent of
   the path by which the preceding state was reached. It is
   distinguished from a Markov chain in that the states of a
   Markov process may be continuous as well as discrete. [Also
   spelled Markoff process.]
   [PJC]
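
   To make the discrete (Markov chain) case concrete, here is a
   minimal Python sketch; the two weather states and their
   transition probabilities are invented purely for illustration:

       import random

       # Hypothetical transition probabilities: each row gives the
       # distribution over the next state given the current one.
       TRANSITIONS = {
           "sunny": {"sunny": 0.8, "rainy": 0.2},
           "rainy": {"sunny": 0.4, "rainy": 0.6},
       }

       def step(state):
           # The next state is sampled from the current state's row
           # alone -- no earlier history is consulted, which is
           # exactly the Markov property described above.
           row = TRANSITIONS[state]
           return random.choices(list(row), weights=list(row.values()))[0]

       state = "sunny"
       path = [state]
       for _ in range(10):
           state = step(state)
           path.append(state)
       print(" -> ".join(path))

   A continuous-state Markov process replaces the finite table with
   a transition kernel, but the defining "present state only"
   dependence is the same.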

3. The Free On-line Dictionary of Computing (30 December 2018)
Markov process

   A process in which the sequence of events can be described by a
   Markov chain.

   (1995-02-23)

