from Wikipedia
Formally, a Markov chain is a random process with the Markov property.
Often, the term "Markov chain" is used to mean a Markov process which has a discrete (finite or countable) state space.
Usually a Markov chain is defined for a discrete set of times (i.e., a discrete-time Markov chain), although some authors use the same terminology where "time" can take continuous values.
The use of the term in Markov chain Monte Carlo methodology covers cases where the process is in discrete time (discrete algorithm steps) with a continuous state space.
The following concentrates on the discrete-time, discrete-state-space case.
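A discrete-time, discrete-state-space chain of this kind can be sketched in a few lines of Python. The two-state "weather" example and its transition probabilities below are illustrative assumptions, not taken from the text; the key point is that each step depends only on the current state (the Markov property).

```python
import random

# Illustrative (assumed) transition probabilities for a two-state chain.
# Each inner dict gives P(next state | current state); rows sum to 1.
TRANSITION = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Sample the next state. By the Markov property, only `state` matters,
    not the earlier history of the chain."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITION[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n_steps, seed=0):
    """Return a trajectory of length n_steps + 1, starting from `start`."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n_steps):
        chain.append(step(chain[-1], rng))
    return chain
```

For example, `simulate("sunny", 10)` produces an 11-element trajectory whose states are drawn from the (assumed) state space `{"sunny", "rainy"}`.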
