A3.1 B3.1

Markov Chains | Part II, 2001

(i) Explain what is meant by the transition semigroup $\{P_t\}$ of a Markov chain $X$ in continuous time. If the state space is finite, show, under assumptions to be stated clearly, that $P_t^{\prime} = G P_t$ for some matrix $G$. Show that a distribution $\pi$ satisfies $\pi G = 0$ if and only if $\pi P_t = \pi$ for all $t \geqslant 0$, and explain the importance of such $\pi$.
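
A minimal numerical sketch, not part of the original question: the equivalence $\pi G = 0 \iff \pi P_t = \pi$ can be checked for a concrete generator. The 3-state $G$ below is an assumed, arbitrary example.

```python
# Numerical illustration (assumed 3-state generator, not from the question):
# a distribution pi satisfies pi G = 0 iff pi P_t = pi for all t >= 0.
import numpy as np
from scipy.linalg import expm, null_space

# Arbitrary generator: non-negative off-diagonal rates, rows summing to zero.
G = np.array([[-3.0,  2.0,  1.0],
              [ 1.0, -4.0,  3.0],
              [ 2.0,  2.0, -4.0]])

# Solve pi G = 0: pi spans the left null space of G; normalise to sum to 1.
pi = null_space(G.T)[:, 0]
pi = pi / pi.sum()

# Invariance under the semigroup P_t = exp(tG) at a few sample times.
for t in (0.5, 1.0, 2.0):
    assert np.allclose(pi @ expm(t * G), pi)
```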

(ii) Let $X$ be a continuous-time Markov chain on the state space $S=\{1,2\}$ with generator

$$G=\left(\begin{array}{cc} -\beta & \beta \\ \gamma & -\gamma \end{array}\right), \quad \text{where } \beta, \gamma>0.$$

Show that the transition semigroup $P_t=\exp(tG)$ is given by

$$(\beta+\gamma) P_t=\left(\begin{array}{cc} \gamma+\beta h(t) & \beta(1-h(t)) \\ \gamma(1-h(t)) & \beta+\gamma h(t) \end{array}\right),$$

where $h(t)=e^{-t(\beta+\gamma)}$.
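
A short numerical check of this closed form against $\exp(tG)$, for assumed rates $\beta, \gamma$ (the values below are arbitrary):

```python
# Sanity check (assumed values of beta, gamma): exp(tG) agrees with the
# stated closed form for the 2-state chain at several times t.
import numpy as np
from scipy.linalg import expm

beta, gamma = 1.5, 0.7          # arbitrary positive rates (assumption)
G = np.array([[-beta,  beta],
              [gamma, -gamma]])

for t in (0.1, 1.0, 5.0):
    h = np.exp(-t * (beta + gamma))
    closed_form = np.array([[gamma + beta * h, beta * (1 - h)],
                            [gamma * (1 - h), beta + gamma * h]]) / (beta + gamma)
    assert np.allclose(expm(t * G), closed_form)
```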

For $0<\alpha<1$, let

$$H(\alpha)=\left(\begin{array}{cc} \alpha & 1-\alpha \\ 1-\alpha & \alpha \end{array}\right).$$

For a continuous-time chain $X$, let $M$ be a matrix with $(i, j)$ entry $P(X(1)=j \mid X(0)=i)$, for $i, j \in S$. Show that there is a chain $X$ with $M=H(\alpha)$ if and only if $\alpha>\frac{1}{2}$.
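
A numerical sketch of the "if" direction, under the assumption $\beta = \gamma$ (equal diagonal entries of $H(\alpha)$ force symmetric rates): taking $\beta = \gamma = -\tfrac{1}{2}\log(2\alpha-1)$, which is positive precisely when $\alpha > \tfrac{1}{2}$, gives $P_1 = H(\alpha)$.

```python
# Sketch of the "if" direction (assumption: symmetric rates beta = gamma).
# With beta = gamma, P_1 has diagonal entries (1 + e^{-2*beta}) / 2, so any
# alpha > 1/2 is attained by beta = -log(2*alpha - 1) / 2.
import numpy as np
from scipy.linalg import expm

alpha = 0.8                              # any value in (1/2, 1) (assumption)
beta = gamma = -np.log(2 * alpha - 1) / 2
G = np.array([[-beta,  beta],
              [gamma, -gamma]])

H_alpha = np.array([[alpha, 1 - alpha],
                    [1 - alpha, alpha]])
assert np.allclose(expm(G), H_alpha)     # M = P_1 = H(alpha)
```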
