1.II.19C

Markov Chains | Part IB, 2006

Explain what is meant by a stopping time of a Markov chain $\left(X_{n}\right)_{n \geq 0}$. State the strong Markov property.

Show that, for any state $i$, the probability, starting from $i$, that $\left(X_{n}\right)_{n \geq 0}$ makes infinitely many visits to $i$ can take only the values 0 or 1.
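A possible line of attack, sketched here under the assumed notation $T_i=\inf\{n \geq 1: X_n=i\}$ and $f=\mathbb{P}_i(T_i<\infty)$ (neither symbol appears in the question itself), uses the strong Markov property at successive return times:

```latex
% Let V_i = #\{n \geq 0 : X_n = i\} be the total number of visits to i,
% counting the visit at time 0 when X_0 = i.
% By the strong Markov property applied at each finite return time T_i,
% the chain restarts afresh from i, so for every k \geq 1
\[
  \mathbb{P}_i(V_i \geq k) = f^{\,k-1}.
\]
% Letting k \to \infty,
\[
  \mathbb{P}_i(V_i = \infty) = \lim_{k \to \infty} f^{\,k-1}
  = \begin{cases} 1 & \text{if } f = 1, \\ 0 & \text{if } f < 1, \end{cases}
\]
% so the probability of infinitely many visits is 0 or 1, never strictly between.
```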

Show moreover that, if

$$\sum_{n=0}^{\infty} \mathbb{P}_{i}\left(X_{n}=i\right)=\infty$$

then $\left(X_{n}\right)_{n \geq 0}$ makes infinitely many visits to $i$ with probability 1.
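One standard way to connect the series to the return probability, again writing $V_i$ for the number of visits to $i$ and $f=\mathbb{P}_i(T_i<\infty)$ for the return probability (notation assumed, not given in the question), is via the expected number of visits:

```latex
% The series in the question is exactly the expected number of visits to i:
\[
  \mathbb{E}_i[V_i]
  = \mathbb{E}_i\!\left[\sum_{n=0}^{\infty} \mathbf{1}_{\{X_n = i\}}\right]
  = \sum_{n=0}^{\infty} \mathbb{P}_i(X_n = i).
\]
% On the other hand, summing the tail probabilities from the previous part,
\[
  \mathbb{E}_i[V_i] = \sum_{k=1}^{\infty} \mathbb{P}_i(V_i \geq k)
  = \sum_{k=1}^{\infty} f^{\,k-1}
  = \frac{1}{1-f} \quad (f < 1).
\]
% If the series diverges, the geometric sum must be infinite, forcing f = 1,
% and then the dichotomy gives infinitely many visits with probability 1.
```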
