Page "Markov chain", Paragraph 7 — from Wikipedia
A famous Markov chain is the so-called "drunkard's walk", a random walk on the number line where, at each step, the position may change by +1 or −1 with equal probability.
The transition probabilities depend only on the current position, not on the manner in which the position was reached.
For example, the transition probabilities from 5 to 4 and 5 to 6 are both 0.5, and all other transition probabilities from 5 are 0.
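This walk can be sketched in a few lines of Python. The function name and parameters below are illustrative, not from the original text; the simulation simply applies the rule described above: from any position, move to the neighbor on either side with probability 0.5 each.

```python
import random

def drunkards_walk(steps, start=0, seed=None):
    """Simulate a symmetric random walk on the integers.

    At each step the position changes by +1 or -1 with equal
    probability; the next state depends only on the current one,
    which is the Markov property described in the text.
    """
    rng = random.Random(seed)
    position = start
    path = [position]
    for _ in range(steps):
        position += rng.choice((+1, -1))  # from 5, reach 4 or 6 with prob 0.5 each
        path.append(position)
    return path

path = drunkards_walk(10, start=5, seed=42)
```

Every consecutive pair of positions in `path` differs by exactly 1 in absolute value, reflecting that all transition probabilities to non-adjacent states are 0.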