Markov process

From Wikipedia, the free encyclopedia

A Markov process, named after the Russian mathematician Andrey Markov, is a mathematical model for the random evolution of a memoryless system, that is, one for which the likelihood of a given future state, at any given moment, depends only on its present state and not on any past states.

Equivalently, a stochastic process with the Markov property, or memorylessness, is one for which, conditional on the present state of the system, its future and past are independent.

Often, the term Markov chain is used to mean a Markov process which has a discrete (finite or countable) state space. Usually a Markov chain is defined for a discrete set of times (i.e., a discrete-time Markov chain),[1] although some authors use the same terminology where "time" can take continuous values.[2] See also continuous-time Markov process.

Formal definition

A stochastic process whose state at time t is X(t), for t ≥ 0, and whose history of states is given by x(s) for times s ≤ t, is a Markov process if

\mathrm{Pr}\big[X(t+h) = y \mid X(s) = x(s), \forall s \leq t\big] = \mathrm{Pr}\big[X(t+h) = y \mid X(t) = x(t)\big], \quad \forall h > 0.

That is, the probability that the process is in state y at time t + h, conditioned on its entire history of states x(s) for all times s ≤ t, equals the probability conditioned only on its current state x(t). This captures the idea that the future state is independent of past states, given the present.
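The memoryless evolution described above can be sketched as a small simulation. The two-state "weather" chain and its transition probabilities below are invented for illustration, not taken from the article:

```python
import random

# Hypothetical two-state chain; the transition probabilities are illustrative.
TRANSITIONS = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def step(state):
    """Draw the next state given only the current one (the Markov property)."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state]:
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n, seed=0):
    """Simulate n steps; each step looks only at the most recent state."""
    random.seed(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1]))  # next state depends only on path[-1]
    return path

print(simulate("sunny", 5, seed=42))
```

Note that `step` never inspects the earlier part of `path`: conditioning on the full history would give exactly the same distribution over next states.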

Markov processes are typically termed (time-) homogeneous if

\mathrm{Pr}\big[X(t+h) = y \mid X(t) = x\big] = \mathrm{Pr}\big[X(h) = y \mid X(0) = x\big], \quad \forall t, h > 0,

and otherwise are termed (time-) inhomogeneous (or (time-) nonhomogeneous). Homogeneous Markov processes, which are usually simpler than inhomogeneous ones, form the most important class of Markov processes.
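For a time-homogeneous chain with a discrete state space, one-step transition probabilities can be collected into a single matrix P that applies at every time t, and the h-step probabilities are then the h-th matrix power. A minimal sketch, with an illustrative matrix:

```python
import numpy as np

# Time-homogeneity: the same one-step matrix P applies at every time t,
# so Pr[X(t+h)=y | X(t)=x] = (P^h)[x, y] regardless of t.
# P below is illustrative, not from the article.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

def h_step(P, h):
    """h-step transition probabilities of a homogeneous chain."""
    return np.linalg.matrix_power(P, h)

# Only the lag h matters: the 2-step probabilities are the same
# whether the chain starts at t = 0 or at t = 7.
print(h_step(P, 2))
```

For an inhomogeneous chain this collapses: a different matrix P(t) would be needed at each time, and the h-step probabilities would be an ordered product P(t)P(t+1)...P(t+h-1) depending on t.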

Markovian representations

In some cases, apparently non-Markovian processes may still have Markovian representations, constructed by expanding the notion of the 'current' and 'future' states. For example, let X be a non-Markovian process. Then define a process Y such that each state of Y represents a time interval of states of X; mathematically,

Y(t) = \big\{ X(s): s \in [a(t), b(t)] \, \big\}.

If Y has the Markov property, then it is a Markovian representation of X. When each state of Y comprises the two most recent states of X, X is called a second-order Markov process; higher-order Markov processes are defined analogously.

An example of a non-Markovian process with a Markovian representation is a moving average time series.

References

  1. ^ Everitt, B.S. (2002). The Cambridge Dictionary of Statistics. CUP. ISBN 0-521-81099-X.
  2. ^ Dodge, Y. The Oxford Dictionary of Statistical Terms. OUP. ISBN 0-19-920613-9.

