- Analysis, Markov
- Chains, Markov
- Markoff processes
- Markov analysis
- Markov chains
- Markov models
- Models, Markov
- Processes, Markov
Exact Matching Concepts from Other Schemes
Closely Matching Concepts from Other Schemes
- found: UMI business vocab. (Markov processes, use Markov analysis)
- found: Wikipedia, Jan. 3, 2007:
  - Markov chain: in mathematics, a Markov chain, named after Andrey Markov, is a discrete-time stochastic process with the Markov property; Markov models.
  - Markov process: in probability theory, a Markov process is a stochastic process that has the Markov property; often, the term Markov chain is used to mean a discrete-time Markov process.
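The definition above can be sketched in code: in a discrete-time Markov chain, the next state depends only on the current state (the Markov property). The states and transition probabilities below are purely illustrative, a minimal sketch rather than any standard model.

```python
import random

# Illustrative two-state weather chain; each row lists
# (next_state, probability) pairs summing to 1.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state: str) -> str:
    """Draw the next state using only the current state's row
    (the Markov property: no dependence on earlier history)."""
    states, probs = zip(*TRANSITIONS[state])
    return random.choices(states, weights=probs, k=1)[0]

def walk(start: str, n: int) -> list[str]:
    """Simulate n transitions of the chain from `start`."""
    path = [start]
    for _ in range(n):
        path.append(step(path[-1]))
    return path
```

For example, `walk("sunny", 10)` returns a list of 11 states, each drawn from the previous one's transition row.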
- 1986-02-11: new
- 2007-03-22: revised