• # Paper 1, Section II, 28K

The particles of an Ideal Gas form a spatial Poisson process on $\mathbb{R}^{3}$ with constant intensity $z>0$, called the activity of the gas.

(a) Prove that the independent mixture of two Ideal Gases with activities $z_{1}$ and $z_{2}$ is again an Ideal Gas. What is its activity? [You must prove any results about Poisson processes that you use. The independent mixture of two gases with particles $\Pi_{1} \subset \mathbb{R}^{3}$ and $\Pi_{2} \subset \mathbb{R}^{3}$ is given by $\Pi_{1} \cup \Pi_{2}$.]

(b) For an Ideal Gas of activity $z>0$, find the limiting distribution of

$\frac{N\left(V_{i}\right)-\mathbb{E} N\left(V_{i}\right)}{\sqrt{\left|V_{i}\right|}}$

as $i \rightarrow \infty$ for a given sequence of subsets $V_{i} \subset \mathbb{R}^{3}$ with $\left|V_{i}\right| \rightarrow \infty$.

(c) Let $g: \mathbb{R}^{3} \rightarrow \mathbb{R}$ be a smooth non-negative function vanishing outside a bounded subset of $\mathbb{R}^{3}$. Find the mean and variance of $\sum_{x} g(x)$, where the sum runs over the particles $x \in \mathbb{R}^{3}$ of an ideal gas of activity $z>0$. [You may use the properties of spatial Poisson processes established in the lectures.]

[Hint: recall that the characteristic function of a Poisson random variable with mean $\lambda$ is $e^{\left(e^{i t}-1\right) \lambda}$.]
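The superposition fact in part (a) can be sanity-checked numerically. The sketch below (illustrative parameters, not part of the question) uses the observation that the count of a union of two independent Poisson scatters in a region of volume $|V|$ is the sum of two independent Poisson counts, so it should have mean and variance $(z_1+z_2)|V|$.

```python
import random
import math

def poisson_sample(rng, mean):
    # Knuth's method: multiply uniforms until the product drops below e^{-mean}.
    l, k, p = math.exp(-mean), 0, 1.0
    while True:
        p *= rng.random()
        if p <= l:
            return k
        k += 1

def mixture_count(rng, z1, z2, volume):
    # Count of the union of two independent ideal gases in a fixed region:
    # the two counts are independent Poissons, so the union's count is their sum.
    return poisson_sample(rng, z1 * volume) + poisson_sample(rng, z2 * volume)

rng = random.Random(0)
z1, z2, vol, trials = 1.5, 2.5, 1.0, 20000
counts = [mixture_count(rng, z1, z2, vol) for _ in range(trials)]
mean = sum(counts) / trials
var = sum((c - mean) ** 2 for c in counts) / trials
# For a Poisson((z1 + z2)|V|) count, mean and variance should both be near 4.0.
print(mean, var)
```

A Poisson fit of both moments is consistent with the mixture being an Ideal Gas of activity $z_1+z_2$; the proof itself needs the full independence structure over disjoint regions.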

• # Paper 2, Section II, 28K

Let $X$ be an irreducible, non-explosive, continuous-time Markov process on the state space $\mathbb{Z}$ with generator $Q=\left(q_{x, y}\right)_{x, y \in \mathbb{Z}}$.

(a) Define its jump chain $Y$ and prove that it is a discrete-time Markov chain.

(b) Define what it means for $X$ to be recurrent and prove that $X$ is recurrent if and only if its jump chain $Y$ is recurrent. Prove also that this is the case if the transition semigroup $\left(p_{x, y}(t)\right)$ satisfies

$\int_{0}^{\infty} p_{0,0}(t) d t=\infty$

(c) Show that $X$ is recurrent for at least one of the following generators:

$\begin{array}{cc} q_{x, y}=(1+|x|)^{-2} e^{-|x-y|^{2}} & (x \neq y), \\ q_{x, y}=(1+|x-y|)^{-2} e^{-|x|^{2}} & (x \neq y) . \end{array}$

[Hint: You may use that the semigroup associated with a $Q$-matrix on $\mathbb{Z}$ such that $q_{x, y}$ depends only on $x-y$ (and has sufficient decay) can be written as

$p_{x, y}(t)=\frac{1}{2 \pi} \int_{-\pi}^{\pi} e^{-t \lambda(k)} e^{i k(x-y)} d k$

where $\lambda(k)=\sum_{y} q_{0, y}\left(1-e^{i k y}\right)$. You may also find the bound $1-\cos x \leqslant x^{2} / 2$ useful. $]$

• # Paper 3, Section II, 27K

(a) Customers arrive at a queue at the event times of a Poisson process of rate $\lambda$. The queue is served by two independent servers with exponential service times with parameter $\mu$ each. If the queue has length $n$, an arriving customer joins with probability $r^{n}$ and leaves otherwise (where $r \in(0,1]$). For which $\lambda>0, \mu>0$ and $r \in(0,1]$ is there a stationary distribution?

(b) A supermarket allows a maximum of $N$ customers to shop at the same time. Customers arrive at the event times of a Poisson process of rate 1, they enter the supermarket when possible, and they leave forever for another supermarket otherwise. Customers already in the supermarket pay and leave at the event times of an independent Poisson process of rate $\mu$. When is there a unique stationary distribution for the number of customers in the supermarket? If it exists, find it.

(c) In the situation of part (b), started from equilibrium, show that the departure process is Poissonian.
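For part (a), the queue length is a birth-death chain (up-rate $\lambda r^n$ from length $n$, down-rate $\min(n,2)\mu$), so a candidate stationary distribution can be built from the detailed balance recursion. A minimal numerical sketch with illustrative parameters:

```python
# Detailed-balance sketch for the two-server queue with balking of part (a),
# treated as a birth-death chain: up-rate lambda * r^n from length n, down-rate
# min(n, 2) * mu. Parameters below are illustrative, not given by the question.
lam, mu, r = 3.0, 1.0, 0.5

pi = [1.0]
for n in range(60):
    pi.append(pi[-1] * lam * r ** n / (min(n + 1, 2) * mu))
Z = sum(pi)
pi = [p / Z for p in pi]

# For r < 1 the successive ratios decay super-geometrically, so the normalising
# sum converges for any lambda, mu > 0; the r = 1 case is M/M/2, needing
# lambda < 2 * mu.
residual = max(abs(pi[n] * lam * r ** n - pi[n + 1] * min(n + 1, 2) * mu)
               for n in range(59))
print(Z, residual)  # residual should be at machine-precision level
```

The truncation at 60 states is harmless here because the tail probabilities underflow to zero long before that.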

• # Paper 4, Section II, 27K

Let $(X(t))_{t \geqslant 0}$ be a continuous-time Markov process with state space $I=\{1, \ldots, n\}$ and generator $Q=\left(q_{i j}\right)_{i, j \in I}$ satisfying $q_{i j}=q_{j i}$ for all $i, j \in I$. The local time up to time $t>0$ of $X$ is the random vector $L(t)=\left(L_{i}(t)\right)_{i \in I} \in \mathbb{R}^{n}$ defined by

$L_{i}(t)=\int_{0}^{t} 1_{X(s)=i} d s \quad(i \in I)$

(a) Let $f: I \times \mathbb{R}^{n} \rightarrow \mathbb{R}$ be any function that is differentiable with respect to its second argument, and set

$f_{t}(i, \ell)=\mathbb{E}_{i} f(X(t), \ell+L(t)), \quad\left(i \in I, \ell \in \mathbb{R}^{n}\right)$

Show that

$\frac{\partial}{\partial t} f_{t}(i, \ell)=M f_{t}(i, \ell)$

where

$M f(i, \ell)=\sum_{j \in I} q_{i j} f(j, \ell)+\frac{\partial}{\partial \ell_{i}} f(i, \ell)$

(b) For $y \in \mathbb{R}^{n}$, write $y^{2}=\left(y_{i}^{2}\right)_{i \in I} \in[0, \infty)^{n}$ for the vector of squares of the components of $y$. Let $f: I \times \mathbb{R}^{n} \rightarrow \mathbb{R}$ be a function such that $f(i, \ell)=0$ whenever $\sum_{j}\left|\ell_{j}\right| \geqslant T$ for some fixed $T$. Using integration by parts, or otherwise, show that for all $i$

$-\int_{\mathbb{R}^{n}} \exp \left(\frac{1}{2} y^{T} Q y\right) y_{i} \sum_{j=1}^{n} y_{j} M f\left(j, \frac{1}{2} y^{2}\right) d y=\int_{\mathbb{R}^{n}} \exp \left(\frac{1}{2} y^{T} Q y\right) f\left(i, \frac{1}{2} y^{2}\right) d y,$

where $y^{T} Q y$ denotes $\sum_{k, m \in I} y_{k} q_{k m} y_{m}$.

(c) Let $g: \mathbb{R}^{n} \rightarrow \mathbb{R}$ be a function with $g(\ell)=0$ whenever $\sum_{j}\left|\ell_{j}\right| \geqslant T$ for some fixed $T$. Given $t>0, j \in I$, now let

$f(i, \ell)=\mathbb{E}_{i}\left[g(\ell+L(t)) 1_{X(t)=j}\right]$

in part (b) and deduce, using part (a), that

\begin{aligned} &\int_{\mathbb{R}^{n}} \exp \left(\frac{1}{2} y^{T} Q y\right) y_{i} y_{j} g\left(\frac{1}{2} y^{2}\right) d y \\ &=\int_{\mathbb{R}^{n}} \exp \left(\frac{1}{2} y^{T} Q y\right)\left(\int_{0}^{\infty} \mathbb{E}_{i}\left[1_{X(t)=j} g\left(\frac{1}{2} y^{2}+L(t)\right)\right] d t\right) d y \end{aligned}

[You may exchange the order of integrals and derivatives without justification.]


• # Paper 1, Section II, 28K

(a) What is meant by a birth process $N=(N(t): t \geqslant 0)$ with strictly positive rates $\lambda_{0}, \lambda_{1}, \ldots ?$ Explain what is meant by saying that $N$ is non-explosive.

(b) Show that $N$ is non-explosive if and only if

$\sum_{n=0}^{\infty} \frac{1}{\lambda_{n}}=\infty$

(c) Suppose $N(0)=0$, and $\lambda_{n}=\alpha n+\beta$ where $\alpha, \beta>0$. Show that

$\mathbb{E}(N(t))=\frac{\beta}{\alpha}\left(e^{\alpha t}-1\right) .$
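The closed form in part (c) can be checked by simulating the birth process by its jump-hold construction (from state $n$, wait an $\mathrm{Exp}(\alpha n+\beta)$ time and jump to $n+1$). A seeded Monte Carlo sketch with illustrative parameters:

```python
import random
import math

def simulate_birth(rng, alpha, beta, t_max):
    # Jump-hold construction: from state n, wait Exp(alpha*n + beta), jump to n+1.
    n, t = 0, 0.0
    while True:
        t += rng.expovariate(alpha * n + beta)
        if t > t_max:
            return n
        n += 1

rng = random.Random(1)
alpha, beta, t_max, trials = 0.5, 1.0, 2.0, 20000
mean = sum(simulate_birth(rng, alpha, beta, t_max) for _ in range(trials)) / trials
exact = beta / alpha * (math.exp(alpha * t_max) - 1)  # = 2(e - 1)
print(mean, exact)
```

The rates are linear, so the process is non-explosive by part (b) and the simulation terminates.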

• # Paper 2, Section II, 27K

(i) Let $X$ be a Markov chain in continuous time on the integers $\mathbb{Z}$ with generator $\mathbf{G}=\left(g_{i, j}\right)$. Define the corresponding jump chain $Y$.

Define the terms irreducibility and recurrence for $X$. If $X$ is irreducible, show that $X$ is recurrent if and only if $Y$ is recurrent.

(ii) Suppose

$g_{i, i-1}=3^{|i|}, \quad g_{i, i}=-3^{|i|+1}, \quad g_{i, i+1}=2 \cdot 3^{|i|}, \quad i \in \mathbb{Z} .$

Show that $X$ is transient, find an invariant distribution, and show that $X$ is explosive. [Any general results may be used without proof but should be stated clearly.]

• # Paper 3, Section II, 27K

Define a renewal-reward process, and state the renewal-reward theorem.

A machine $M$ is repaired at time $t=0$. After any repair, it functions without intervention for a time that is exponentially distributed with parameter $\lambda$, at which point it breaks down (assume the usual independence). Following any repair at time $T$, say, it is inspected at times $T, T+m, T+2 m, \ldots$, and instantly repaired if found to be broken (the inspection schedule is then restarted). Find the long run proportion of time that $M$ is working. [You may express your answer in terms of an integral.]

• # Paper 4, Section II, K

(i) Explain the notation $\mathrm{M}(\lambda) / \mathrm{M}(\mu) / 1$ in the context of queueing theory. [In the following, you may use without proof the fact that $\pi_{n}=(1-\lambda / \mu)(\lambda / \mu)^{n}$ is the invariant distribution of such a queue when $0<\lambda<\mu$.]

(ii) In a shop queue, some customers rejoin the queue after having been served. Let $\lambda, \beta \in(0, \infty)$ and $\delta \in(0,1)$. Consider a $\mathrm{M}(\lambda) / \mathrm{M}(\mu) / 1$ queue subject to the modification that, on completion of service, each customer leaves the shop with probability $\delta$, or rejoins the shop queue with probability $1-\delta$. Different customers behave independently of one another, and all service times are independent random variables.

Find the distribution of the total time a given customer spends being served by the server. Hence show that equilibrium is possible if $\lambda<\delta \mu$, and find the invariant distribution of the queue-length in this case.

(iii) Show that, in equilibrium, the departure process is Poissonian, whereas, assuming the rejoining customers go to the end of the queue, the process of customers arriving at the queue (including the rejoining ones) is not Poissonian.
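The key fact behind part (ii) is that a geometric number (success probability $\delta$) of independent $\mathrm{Exp}(\mu)$ services sums to an $\mathrm{Exp}(\delta\mu)$ variable. A seeded simulation sketch, with illustrative parameters, checking the first two moments:

```python
import random

def total_service(rng, mu, delta):
    # A customer is served repeatedly; after each Exp(mu) service they leave
    # with probability delta, otherwise they rejoin and are served again.
    total = 0.0
    while True:
        total += rng.expovariate(mu)
        if rng.random() < delta:
            return total

rng = random.Random(2)
mu, delta, trials = 2.0, 0.4, 40000
samples = [total_service(rng, mu, delta) for _ in range(trials)]
mean = sum(samples) / trials
var = sum((s - mean) ** 2 for s in samples) / trials
# Exp(delta * mu) = Exp(0.8): mean 1/0.8 = 1.25, variance 1/0.64 = 1.5625.
print(mean, var)
```

Matching mean and variance is only evidence, not proof; the exam argument computes the moment generating function of the geometric sum directly.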


• # Paper 1, Section II, K

Let $S$ be a countable set, and let $P=\left(p_{i, j}: i, j \in S\right)$ be a Markov transition matrix with $p_{i, i}=0$ for all $i$. Let $Y=\left(Y_{n}: n=0,1,2, \ldots\right)$ be a discrete-time Markov chain on the state space $S$ with transition matrix $P$.

The continuous-time process $X=\left(X_{t}: t \geqslant 0\right)$ is constructed as follows. Let $\left(U_{m}: m=0,1,2, \ldots\right)$ be independent, identically distributed random variables having the exponential distribution with mean 1. Let $g$ be a function on $S$ such that $\varepsilon<g(i)<\varepsilon^{-1}$ for all $i \in S$ and some constant $\varepsilon>0$. Let $V_{m}=U_{m} / g\left(Y_{m}\right)$ for $m \geqslant 0$. Let $T_{0}=0$ and $T_{n}=\sum_{m=0}^{n-1} V_{m}$ for $n \geqslant 1$. Finally, let $X_{t}=Y_{n}$ for $T_{n} \leqslant t<T_{n+1}$.

(a) Explain briefly why $X$ is a continuous-time Markov chain on $S$, and write down its generator in terms of $P$ and the vector $g=(g(i): i \in S)$.

(b) What does it mean to say that the chain $X$ is irreducible? What does it mean to say a state $i \in S$ is (i) recurrent and (ii) positive recurrent?

(c) Show that

(i) $X$ is irreducible if and only if $Y$ is irreducible;

(ii) $X$ is recurrent if and only if $Y$ is recurrent.

(d) Suppose $Y$ is irreducible and positive recurrent with invariant distribution $\pi$. Express the invariant distribution of $X$ in terms of $\pi$ and $g$.
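The relation in part (d) can be checked on a small example: if $\pi$ is invariant for the jump chain $Y$, then $\mu_i \propto \pi_i/g(i)$ should annihilate the generator $Q_{ij} = g(i)(P_{ij}-\delta_{ij})$. A numerical sketch on a hypothetical 3-state chain:

```python
# Hypothetical 3-state example: P has zero diagonal, g is the vector of rates.
P = [[0.0, 0.5, 0.5],
     [0.3, 0.0, 0.7],
     [0.6, 0.4, 0.0]]
g = [1.0, 2.0, 0.5]

# Invariant distribution of the jump chain by power iteration (P is irreducible
# and aperiodic, so this converges).
pi = [1 / 3] * 3
for _ in range(10000):
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

# Generator of X and the candidate invariant distribution mu_i ∝ pi_i / g(i).
Q = [[g[i] * (P[i][j] - (1.0 if i == j else 0.0)) for j in range(3)]
     for i in range(3)]
mu = [pi[i] / g[i] for i in range(3)]
s = sum(mu)
mu = [m / s for m in mu]
residual = max(abs(sum(mu[i] * Q[i][j] for i in range(3))) for j in range(3))
print(mu, residual)  # residual should be at machine-precision level
```

The cancellation $\sum_i (\pi_i/g(i))\, g(i)(P_{ij}-\delta_{ij}) = (\pi P)_j - \pi_j = 0$ is exactly what the code verifies numerically.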

• # Paper 2, Section II, K

Let $X=\left(X_{t}: t \geqslant 0\right)$ be a Markov chain on the non-negative integers with generator $G=\left(g_{i, j}\right)$ given by

\begin{aligned} g_{i, i+1} &=\lambda_{i}, & & i \geqslant 0 \\ g_{i, 0} &=\lambda_{i} \rho_{i}, & & i>0 \\ g_{i, j} &=0, & & j \neq 0, i, i+1 \end{aligned}

for a given collection of positive numbers $\lambda_{i}, \rho_{i}$.

(a) State the transition matrix of the jump chain $Y$ of $X$.

(b) Why is $X$ not reversible?

(c) Prove that $X$ is transient if and only if $\prod_{i}\left(1+\rho_{i}\right)<\infty$.

(d) Assume that $\prod_{i}\left(1+\rho_{i}\right)<\infty$. Derive a necessary and sufficient condition on the parameters $\lambda_{i}, \rho_{i}$ for $X$ to be explosive.

(e) Derive a necessary and sufficient condition on the parameters $\lambda_{i}, \rho_{i}$ for the existence of an invariant measure for $X$.

[You may use any general results from the course concerning Markov chains and pure birth processes so long as they are clearly stated.]

• # Paper 3, Section II, K

(a) What does it mean to say that a continuous-time Markov chain $X=\left(X_{t}: 0 \leqslant t \leqslant T\right)$ with state space $S$ is reversible in equilibrium? State the detailed balance equations, and show that any probability distribution on $S$ satisfying them is invariant for the chain.

(b) Customers arrive in a shop in the manner of a Poisson process with rate $\lambda>0$. There are $s$ servers, and capacity for up to $N$ people waiting for service. Any customer arriving when the shop is full (in that the total number of customers present is $N+s$ ) is not admitted and never returns. Service times are exponentially distributed with parameter $\mu>0$, and they are independent of one another and of the arrivals process. Describe the number $X_{t}$ of customers in the shop at time $t$ as a Markov chain.

Calculate the invariant distribution $\pi$ of $X=\left(X_{t}: t \geqslant 0\right)$, and explain why $\pi$ is the unique invariant distribution. Show that $X$ is reversible in equilibrium.

[Any general result from the course may be used without proof, but must be stated clearly.]

• # Paper 4, Section II, K

(a) Let $\lambda: \mathbb{R}^{d} \rightarrow[0, \infty)$ be such that $\Lambda(A):=\int_{A} \lambda(\mathbf{x}) d \mathbf{x}$ is finite for any bounded measurable set $A \subseteq \mathbb{R}^{d}$. State the properties which define a (non-homogeneous) Poisson process $\Pi$ on $\mathbb{R}^{d}$ with intensity function $\lambda$.

(b) Let $\Pi$ be a Poisson process on $\mathbb{R}^{d}$ with intensity function $\lambda$, and let $f: \mathbb{R}^{d} \rightarrow \mathbb{R}^{s}$ be a given function. Give a clear statement of the necessary conditions on the pair $\Lambda, f$ subject to which $f(\Pi)$ is a Poisson process on $\mathbb{R}^{s}$. When these conditions hold, express the mean measure of $f(\Pi)$ in terms of $\Lambda$ and $f$.

(c) Let $\Pi$ be a homogeneous Poisson process on $\mathbb{R}^{2}$ with constant intensity 1 , and let $f: \mathbb{R}^{2} \rightarrow[0, \infty)$ be given by $f\left(x_{1}, x_{2}\right)=x_{1}^{2}+x_{2}^{2}$. Show that $f(\Pi)$ is a homogeneous Poisson process on $[0, \infty)$ with constant intensity $\pi$.

Let $R_{1}, R_{2}, \ldots$ be an increasing sequence of positive random variables such that the points of $f(\Pi)$ are $R_{1}^{2}, R_{2}^{2}, \ldots$ Show that $R_{k}$ has density function

$h_{k}(r)=\frac{1}{(k-1) !} 2 \pi r\left(\pi r^{2}\right)^{k-1} e^{-\pi r^{2}}, \quad r>0$
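As a consistency check on the stated density (this is a verification sketch, not the requested proof): $\pi R_k^2$ is the $k$-th point of a rate-1 Poisson process on $[0,\infty)$, i.e. $\Gamma(k,1)$, and $h_k$ is the corresponding change of variables, so each $h_k$ should integrate to 1.

```python
import math

def h(k, r):
    # Candidate density of R_k from the question.
    return (2 * math.pi * r / math.factorial(k - 1)) \
        * (math.pi * r ** 2) ** (k - 1) * math.exp(-math.pi * r ** 2)

def integral(k, upper=10.0, n=200000):
    # Midpoint rule; the density decays like exp(-pi r^2), so upper = 10 suffices.
    dr = upper / n
    return sum(h(k, (i + 0.5) * dr) for i in range(n)) * dr

for k in (1, 2, 3):
    print(k, integral(k))  # each should be close to 1
```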


• # Paper 1, Section II, J

Let $\lambda:[0, \infty) \rightarrow(0, \infty)$ be a continuous function. Explain what is meant by an inhomogeneous Poisson process with intensity function $\lambda$.

Let $\left(N_{t}: t \geqslant 0\right)$ be such an inhomogeneous Poisson process, and let $M_{t}=N_{g(t)}$ where $g:[0, \infty) \rightarrow[0, \infty)$ is strictly increasing, differentiable and satisfies $g(0)=0$. Show that $M$ is a homogeneous Poisson process with intensity 1 if $\Lambda(g(t))=t$ for all $t$, where $\Lambda(t)=\int_{0}^{t} \lambda(u) d u$. Deduce that $N_{t}$ has the Poisson distribution with mean $\Lambda(t)$.

Bicycles arrive at the start of a long road in the manner of a Poisson process $N=\left(N_{t}: t \geqslant 0\right)$ with constant intensity $\lambda$. The $i$ th bicycle has constant velocity $V_{i}$, where $V_{1}, V_{2}, \ldots$ are independent, identically distributed random variables, which are independent of $N$. Cyclists can overtake one another freely. Show that the number of bicycles on the first $x$ miles of the road at time $t$ has the Poisson distribution with parameter $\lambda \mathbb{E}\left(V^{-1} \min \{x, V t\}\right)$.

• # Paper 2, Section II, J

Let $X=\left(X_{t}: t \geqslant 0\right)$ be a continuous-time Markov chain on the finite state space $S$. Define the terms generator (or Q-matrix) and invariant distribution, and derive an equation that links the generator $G$ and any invariant distribution $\pi$. Comment on the possible non-uniqueness of invariant distributions.

Suppose $X$ is irreducible, and let $N=\left(N_{t}: t \geqslant 0\right)$ be a Poisson process with intensity $\lambda$, that is independent of $X$. Let $Y_{n}$ be the value of $X$ immediately after the $n$th arrival-time of $N$ (and $\left.Y_{0}=X_{0}\right)$. Show that $\left(Y_{n}: n=0,1, \ldots\right)$ is a discrete-time Markov chain, state its transition matrix and prove that it has the same invariant distribution as $X$.

• # Paper 3, Section II, J

Individuals arrive in a shop in the manner of a Poisson process with intensity $\lambda$, and they await service in the order of their arrival. Their service times are independent, identically distributed random variables $S_{1}, S_{2}, \ldots$. For $n \geqslant 1$, let $Q_{n}$ be the number remaining in the shop immediately after the $n$th departure. Show that

$Q_{n+1}=A_{n}+Q_{n}-h\left(Q_{n}\right)$

where $A_{n}$ is the number of arrivals during the $(n+1)$th service period, and $h(x)=\min \{1, x\}$.

Show that

$\mathbb{E}\left(A_{n}\right)=\rho, \quad \mathbb{E}\left(A_{n}^{2}\right)=\rho+\lambda^{2} \mathbb{E}\left(S^{2}\right),$

where $S$ is a typical service period, and $\rho=\lambda \mathbb{E}(S)$ is the traffic intensity of the queue.

Suppose $\rho<1$, and the queue is in equilibrium in the sense that $Q_{n}$ and $Q_{n+1}$ have the same distribution for all $n$. Express $\mathbb{E}\left(Q_{n}\right)$ in terms of $\lambda, \mathbb{E}(S), \mathbb{E}\left(S^{2}\right)$. Deduce that the mean waiting time (prior to service) of a typical individual is $\frac{1}{2} \lambda \mathbb{E}\left(S^{2}\right) /(1-\rho)$.
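The mean queue length one obtains from the recursion is $\mathbb{E}(Q)=\rho+\lambda^{2}\mathbb{E}(S^{2})/(2(1-\rho))$ (stated here as an assumption; it is the Pollaczek-Khinchine value the question is driving at). A sketch that checks it against a seeded simulation of the embedded chain in the exponential-service case, where it should reduce to the familiar $\rho/(1-\rho)$:

```python
import random

def pk_mean(lam, ES, ES2):
    # Mean queue length at departures implied by the recursion (assumed formula).
    rho = lam * ES
    return rho + lam ** 2 * ES2 / (2 * (1 - rho))

# Simulate Q_{n+1} = A_n + Q_n - min(1, Q_n) with Exp(mu) service times,
# illustrative parameters lam = 1, mu = 2 (so rho = 1/2).
rng = random.Random(3)
lam, mu = 1.0, 2.0
q, total, steps = 0, 0, 200000
for _ in range(steps):
    s = rng.expovariate(mu)        # service period
    a, t = 0, rng.expovariate(lam)  # arrivals during it: Poisson(lam * s)
    while t < s:
        a += 1
        t += rng.expovariate(lam)
    q = a + q - min(1, q)
    total += q
sim_mean = total / steps
exact = pk_mean(lam, 1 / mu, 2 / mu ** 2)  # = rho / (1 - rho) = 1 here
print(sim_mean, exact)
```

The waiting-time claim then follows from this mean via Little's law, as the question indicates.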

• # Paper 4, Section II, J

Let $X_{1}, X_{2}, \ldots$ be independent, identically distributed random variables with finite mean $\mu$. Explain what is meant by saying that the random variable $M$ is a stopping time with respect to the sequence $\left(X_{i}: i=1,2, \ldots\right)$.

Let $M$ be a stopping time with finite mean $\mathbb{E}(M)$. Prove Wald's equation:

$\mathbb{E}\left(\sum_{i=1}^{M} X_{i}\right)=\mu \mathbb{E}(M)$

[Here and in the following, you may use any standard theorem about integration.]

Suppose the $X_{i}$ are strictly positive, and let $N$ be the renewal process with interarrival times $\left(X_{i}: i=1,2, \ldots\right)$. Prove that $m(t)=\mathbb{E}\left(N_{t}\right)$ satisfies the elementary renewal theorem:

$\frac{1}{t} m(t) \rightarrow \frac{1}{\mu} \quad \text { as } t \rightarrow \infty .$

A computer keyboard contains 100 different keys, including the lower and upper case letters, the usual symbols, and the space bar. A monkey taps the keys uniformly at random. Find the mean number of keys tapped until the first appearance of the sequence 'lava' as a sequence of 4 consecutive characters.

Find the mean number of keys tapped until the first appearance of the sequence 'aa' as a sequence of 2 consecutive characters.
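The two answers are instances of the martingale ("team of gamblers") rule: the mean hitting time is $\sum |A|^{k}$ over every $k$ for which the length-$k$ prefix of the pattern equals its length-$k$ suffix. The sketch below states that rule as an assumption and cross-checks it by a seeded simulation on a small alphabet where simulation is cheap.

```python
import random

def expected_hits(pattern, alphabet_size):
    # Assumed general rule: sum |A|^k over every k such that the length-k
    # prefix of the pattern equals its length-k suffix.
    n = len(pattern)
    return sum(alphabet_size ** k for k in range(1, n + 1)
               if pattern[:k] == pattern[n - k:])

# 'lava' has no proper prefix equal to a suffix; 'aa' has the border 'a'.
print(expected_hits("lava", 100))  # 100**4
print(expected_hits("aa", 100))    # 100 + 100**2 = 10100

# Monte Carlo cross-check on a 3-letter alphabet: expected_hits("aa", 3) = 12.
rng = random.Random(4)
letters = "abc"
def first_hit(pattern):
    s, taps = "", 0
    while not s.endswith(pattern):
        s = (s + rng.choice(letters))[-len(pattern):]
        taps += 1
    return taps
trials = 20000
sim = sum(first_hit("aa") for _ in range(trials)) / trials
print(sim)
```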


• # Paper 1, Section II, 27K

(a) Define a continuous time Markov chain $X$ with infinitesimal generator $Q$ and jump chain $Y$.

(b) Let $i$ be a transient state of a continuous-time Markov chain $X$ with $X(0)=i$. Show that the time spent in state $i$ has an exponential distribution and explicitly state its parameter.

[You may use the fact that if $S \sim \operatorname{Exp}(\lambda)$, then $\mathbb{E}\left[e^{\theta S}\right]=\lambda /(\lambda-\theta)$ for $\theta<\lambda$.]

(c) Let $X$ be an asymmetric random walk in continuous time on the non-negative integers with reflection at 0 , so that

$q_{i, j}= \begin{cases}\lambda & \text { if } \quad j=i+1, i \geqslant 0 \\ \mu & \text { if } \quad j=i-1, i \geqslant 1\end{cases}$

Suppose that $X(0)=0$ and $\lambda>\mu$. Show that for all $r \geqslant 1$, the total time $T_{r}$ spent in state $r$ is exponentially distributed with parameter $\lambda-\mu$.

Assume now that $X(0)$ has some general distribution with probability generating function $G$. Find the expected amount of time spent at 0 in terms of $G$.

• # Paper 2, Section II, K

(a) Give the definition of a Poisson process on $\mathbb{R}_{+}$. Let $X$ be a Poisson process on $\mathbb{R}_{+}$. Show that conditional on $\left\{X_{t}=n\right\}$, the jump times $J_{1}, \ldots, J_{n}$ have joint density function

$f\left(t_{1}, \ldots, t_{n}\right)=\frac{n !}{t^{n}} \mathbf{1}\left(0 \leqslant t_{1} \leqslant \ldots \leqslant t_{n} \leqslant t\right)$

where $\mathbf{1}(A)$ is the indicator of the set $A$.

(b) Let $N$ be a Poisson process on $\mathbb{R}_{+}$with intensity $\lambda$ and jump times $\left\{J_{k}\right\}$. If $g: \mathbb{R}_{+} \rightarrow \mathbb{R}$ is a real function, we define for all $t>0$

$\mathcal{R}(g)[0, t]=\left\{g\left(J_{k}\right): k \in \mathbb{N}, J_{k} \leqslant t\right\}$

Show that for all $t>0$ the following is true

$\mathbb{P}(0 \in \mathcal{R}(g)[0, t])=1-\exp \left(-\lambda \int_{0}^{t} \mathbf{1}(g(s)=0) d s\right)$

• # Paper 3, Section II, K

(a) Define the Moran model and Kingman's $n$-coalescent. Define Kingman's infinite coalescent.

Show that Kingman's infinite coalescent comes down from infinity. In other words, with probability one, the number of blocks of $\Pi_{t}$ is finite at any time $t>0$.

(b) Give the definition of a renewal process.

Let $\left(X_{i}\right)$ denote the sequence of inter-arrival times of the renewal process $N$. Suppose that $\mathbb{E}\left[X_{1}\right]>0$.

Prove that $\mathbb{P}(N(t) \rightarrow \infty \text { as } t \rightarrow \infty)=1$.

Prove that $\mathbb{E}\left[e^{\theta N(t)}\right]<\infty$ for some strictly positive $\theta$.

[Hint: Consider the renewal process with inter-arrival times $X_{k}^{\prime}=\varepsilon \mathbf{1}\left(X_{k} \geqslant \varepsilon\right)$ for some suitable $\varepsilon>0$.]

• # Paper 4, Section II, 26K

(a) Give the definition of an $M / M / 1$ queue. Prove that if $\lambda$ is the arrival rate and $\mu$ the service rate and $\lambda<\mu$, then the length of the queue is a positive recurrent Markov chain. What is the equilibrium distribution?

If the queue is in equilibrium and a customer arrives at some time $t$, what is the distribution of the waiting time (time spent waiting in the queue plus service time)?

(b) We now modify the above queue: on completion of service a customer leaves with probability $\delta$, or goes to the back of the queue with probability $1-\delta$. Find the distribution of the total time a customer spends being served.

Hence show that equilibrium is possible if $\lambda<\delta \mu$ and find the stationary distribution.

Show that, in equilibrium, the departure process is Poisson.

[You may use relevant theorems provided you state them clearly.]


• # Paper 1, Section II, J

(a) Define a continuous-time Markov chain $X$ with infinitesimal generator $Q$ and jump chain $Y$.

(b) Prove that if a state $x$ is transient for $Y$, then it is transient for $X$.

(c) Prove or provide a counterexample to the following: if $x$ is positive recurrent for $X$, then it is positive recurrent for $Y$.

(d) Consider the continuous-time Markov chain $\left(X_{t}\right)_{t \geqslant 0}$ on $\mathbb{Z}$ with non-zero transition rates given by

$q(i, i+1)=2 \cdot 3^{|i|}, \quad q(i, i)=-3^{|i|+1} \quad \text { and } \quad q(i, i-1)=3^{|i|}$

Determine whether $X$ is transient or recurrent. Let $T_{0}=\inf \left\{t \geqslant J_{1}: X_{t}=0\right\}$, where $J_{1}$ is the first jump time. Does $X$ have an invariant distribution? Justify your answer. Calculate $\mathbb{E}_{0}\left[T_{0}\right]$.

(e) Let $X$ be a continuous-time random walk on $\mathbb{Z}^{d}$ with $q(x)=\|x\|^{\alpha} \wedge 1$ and $q(x, y)=q(x) /(2 d)$ for all $y \in \mathbb{Z}^{d}$ with $\|y-x\|=1$. Determine for which values of $\alpha$ the walk is transient and for which it is recurrent. In the recurrent case, determine the range of $\alpha$ for which it is also positive recurrent. [Here $\|x\|$ denotes the Euclidean norm of $x$.]

• # Paper 2, Section II, J

(a) Define an $M / M / \infty$ queue and write down, without proof, its stationary distribution. State Burke's theorem for an $M / M / \infty$ queue.

(b) Let $X$ be an $M / M / \infty$ queue with arrival rate $\lambda$ and service rate $\mu$ started from the stationary distribution. For each $t$, denote by $A_{1}(t)$ the last time before $t$ that a customer departed the queue and $A_{2}(t)$ the first time after $t$ that a customer departed the queue. If there is no arrival before time $t$, then we set $A_{1}(t)=0$. What is the limit as $t \rightarrow \infty$ of $\mathbb{E}\left[A_{2}(t)-A_{1}(t)\right]$ ? Explain.

(c) Consider a system of $N$ queues serving a finite number $K$ of customers in the following way: at station $1 \leqslant i \leqslant N$, customers are served immediately and the service times are independent exponentially distributed with parameter $\mu_{i}$; after service, each customer goes to station $j$ with probability $p_{i j}>0$. We assume here that the system is closed, i.e., $\sum_{j} p_{i j}=1$ for all $1 \leqslant i \leqslant N$.

Let $S=\left\{\left(n_{1}, \ldots, n_{N}\right): n_{i} \in \mathbb{N}, \sum_{i=1}^{N} n_{i}=K\right\}$ be the state space of the Markov chain. Write down its $Q$-matrix. Also write down the $Q$-matrix $R$ corresponding to the position in the network of one customer (that is, when $K=1$ ). Show that there is a unique distribution $\left(\lambda_{i}\right)_{1 \leqslant i \leqslant N}$ such that $\lambda R=0$. Show that

$\pi(n)=C_{N} \prod_{i=1}^{N} \frac{\lambda_{i}^{n_{i}}}{n_{i} !}, \quad n=\left(n_{1}, \ldots, n_{N}\right) \in S$

defines an invariant measure for the chain. Are the queue lengths independent at equilibrium?
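The invariance of the product-form $\pi$ can be verified by brute force on a small instance: build the state space, compute $\lambda$ from $\lambda R=0$, and check global balance $\pi Q=0$ state by state. A sketch on a hypothetical 3-station, 2-customer network:

```python
from itertools import product
from math import factorial

# Hypothetical small instance: N = 3 stations ("served immediately", so the
# total service rate at station i with n_i customers is n_i * mu_i), K = 2.
N, K = 3, 2
mu = [1.0, 2.0, 3.0]
p = [[0.0, 0.5, 0.5],
     [0.4, 0.0, 0.6],
     [0.7, 0.3, 0.0]]

# lambda R = 0 with R_ij = mu_i (p_ij - delta_ij) means w_i = lambda_i * mu_i
# is invariant for the routing matrix p; find w by power iteration.
w = [1 / N] * N
for _ in range(10000):
    w = [sum(w[i] * p[i][j] for i in range(N)) for j in range(N)]
lam = [w[i] / mu[i] for i in range(N)]

states = [n for n in product(range(K + 1), repeat=N) if sum(n) == K]
def pi(n):
    out = 1.0
    for i in range(N):
        out *= lam[i] ** n[i] / factorial(n[i])
    return out

def flow_balance(m):
    # Net probability flow at state m for moves n -> n - e_i + e_j
    # occurring at rate n_i * mu_i * p_ij; should vanish if pi is invariant.
    net = 0.0
    for i in range(N):
        for j in range(N):
            if i == j:
                continue
            if m[i] > 0:                       # flow out of m
                net -= pi(m) * m[i] * mu[i] * p[i][j]
            src = list(m)                      # flow into m from m + e_i - e_j
            src[i] += 1
            src[j] -= 1
            if src[j] >= 0:
                net += pi(tuple(src)) * src[i] * mu[i] * p[i][j]
    return net

residual = max(abs(flow_balance(m)) for m in states)
print(residual)  # should be at machine-precision level
```

The check confirms invariance, but (unlike an open network) $\pi$ does not factorise over stations because of the constraint $\sum_i n_i = K$, which is the point of the final part.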

• # Paper 3, Section II, J

(a) State the thinning and superposition properties of a Poisson process on $\mathbb{R}_{+}$. Prove the superposition property.

(b) A bi-infinite Poisson process $\left(N_{t}: t \in \mathbb{R}\right)$ with $N_{0}=0$ is a process with independent and stationary increments over $\mathbb{R}$. Moreover, for all $-\infty<s<t<\infty$, the increment $N_{t}-N_{s}$ has the Poisson distribution with parameter $\lambda(t-s)$. Prove that such a process exists.

(c) Let $N$ be a bi-infinite Poisson process on $\mathbb{R}$ of intensity $\lambda$. We identify it with the set of points $\left(S_{n}\right)$ of discontinuity of $N$, i.e., $N[s, t]=\sum_{n} \mathbf{1}\left(S_{n} \in[s, t]\right)$. Show that if we shift all the points of $N$ by the same constant $c$, then the resulting process is also a Poisson process of intensity $\lambda$.

Now suppose we shift every point of $N$ by $+1$ or $-1$ with equal probability. Show that the final collection of points is still a Poisson process of intensity $\lambda$. [You may assume the thinning and superposition properties for the bi-infinite Poisson process.]

• # Paper 4, Section II, J

(a) Give the definition of a renewal process. Let $\left(N_{t}\right)_{t \geqslant 0}$ be a renewal process associated with $\left(\xi_{i}\right)$ with $\mathbb{E} \xi_{1}=1 / \lambda<\infty$. Show that almost surely

$\frac{N_{t}}{t} \rightarrow \lambda \quad \text { as } t \rightarrow \infty$

(b) Give the definition of Kingman's $n$-coalescent. Let $\tau$ be the first time that all blocks have coalesced. Find an expression for $\mathbb{E} e^{-q \tau}$. Let $L_{n}$ be the total length of the branches of the tree, i.e., if $\tau_{i}$ is the first time there are $i$ lineages, then $L_{n}=\sum_{i=2}^{n} i\left(\tau_{i-1}-\tau_{i}\right)$. Show that $\mathbb{E} L_{n} \sim 2 \log n$ as $n \rightarrow \infty$. Show also that $\operatorname{Var}\left(L_{n}\right) \leqslant C$ for all $n$, where $C$ is a positive constant, and that in probability

$\frac{L_{n}}{\mathbb{E} L_{n}} \rightarrow 1 \quad \text { as } n \rightarrow \infty$
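The mean is explicit: with $i$ lineages the holding time is $\mathrm{Exp}\binom{i}{2}$, so each level contributes $i \cdot 2/(i(i-1)) = 2/(i-1)$ to $\mathbb{E}L_n$, giving $\mathbb{E}L_n = 2H_{n-1} \sim 2\log n$. A short numerical illustration of that computation:

```python
import math

def mean_total_length(n):
    # E L_n = sum over i = 2..n of i * E[Exp(i(i-1)/2)] = 2 * H_{n-1}.
    return sum(i * 2.0 / (i * (i - 1)) for i in range(2, n + 1))

print(mean_total_length(10))  # 2 * H_9
for n in (10, 100, 10000):
    # Ratio to 2 log n approaches 1 (slowly, at rate O(1/log n)).
    print(n, mean_total_length(n) / (2 * math.log(n)))
```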


• # Paper 1, Section II, K

(a) Give the definition of a birth and death chain in terms of its generator. Show that a measure $\pi$ is invariant for a birth and death chain if and only if it solves the detailed balance equations.

(b) There are $s$ servers in a post office and a single queue. Customers arrive as a Poisson process of rate $\lambda$ and the service times at each server are independent and exponentially distributed with parameter $\mu$. Let $X_{t}$ denote the number of customers in the post office at time $t$. Find conditions on $\lambda, \mu$ and $s$ for $X$ to be positive recurrent, null recurrent and transient, justifying your answers.

• # Paper 2, Section II, 24K

(i) Define a Poisson process on $\mathbb{R}_{+}$ with rate $\lambda$. Let $N$ and $M$ be two independent Poisson processes on $\mathbb{R}_{+}$ of rates $\lambda$ and $\mu$ respectively. Prove that $N+M$ is also a Poisson process and find its rate.

(ii) Let $X$ be a discrete time Markov chain with transition matrix $K$ on the finite state space $S$. Find the generator of the continuous time Markov chain $Y_{t}=X_{N_{t}}$ in terms of $K$ and $\lambda$. Show that if $\pi$ is an invariant distribution for $X$, then it is also invariant for $Y$.

Suppose that $X$ has an absorbing state $a$. If $\tau_{a}$ and $T_{a}$ are the absorption times for $X$ and $Y$ respectively, write an equation that relates $\mathbb{E}_{x}\left[\tau_{a}\right]$ and $\mathbb{E}_{x}\left[T_{a}\right]$, where $x \in S$.

[Hint: You may want to prove that if $\xi_{1}, \xi_{2}, \ldots$ are i.i.d. non-negative random variables with $\mathbb{E}\left[\xi_{1}\right]<\infty$ and $M$ is an independent non-negative random variable, then $\mathbb{E}\left[\sum_{i=1}^{M} \xi_{i}\right]=\mathbb{E}[M] \mathbb{E}\left[\xi_{1}\right]$.]

• # Paper 3, Section II, K

(i) Let $X$ be a Poisson process of parameter $\lambda$. Let $Y$ be obtained by taking each point of $X$ and, independently of the other points, keeping it with probability $p$. Show that $Y$ is another Poisson process and find its intensity. Show that for every fixed $t$ the random variables $Y_{t}$ and $X_{t}-Y_{t}$ are independent.

(ii) Suppose we have $n$ bins, and balls arrive according to a Poisson process of rate 1 . Upon arrival we choose a bin uniformly at random and place the ball in it. We let $M_{n}$ be the maximum number of balls in any bin at time $n$. Show that

$\mathbb{P}\left(M_{n} \geqslant(1+\epsilon) \frac{\log n}{\log \log n}\right) \rightarrow 0 \quad \text { as } n \rightarrow \infty$

[You may use the fact that if $\xi$ is a Poisson random variable of mean 1 , then

$\mathbb{P}(\xi \geqslant x) \leqslant \exp (x-x \log x) .]$

• # Paper 4, Section II, K

(i) Let $X$ be a Markov chain on $S$ and $A \subset S$. Let $T_{A}$ be the hitting time of $A$ and $\tau_{y}$ denote the total time spent at $y \in S$ by the chain before hitting $A$. Show that if $h(x)=\mathbb{P}_{x}\left(T_{A}<\infty\right)$, then $\mathbb{E}_{x}\left[\tau_{y} \mid T_{A}<\infty\right]=[h(y) / h(x)] \mathbb{E}_{x}\left(\tau_{y}\right) .$

(ii) Define the Moran model and show that if $X_{t}$ is the number of individuals carrying allele $a$ at time $t \geqslant 0$ and $\tau$ is the fixation time of allele $a$, then

$\mathbb{P}\left(X_{\tau}=N \mid X_{0}=i\right)=\frac{i}{N}$

Show that conditionally on fixation of an allele $a$ being present initially in $i$ individuals,

$\mathbb{E}[\tau \mid \text { fixation }]=N-i+\frac{N-i}{i} \sum_{j=1}^{i-1} \frac{j}{N-j}$


• # Paper 1, Section II, J

(i) Explain what a $Q$-matrix is. Let $Q$ be a $Q$-matrix. Define the notion of a Markov chain $\left(X_{t}, t \geqslant 0\right)$ in continuous time with $Q$-matrix given by $Q$, and give a construction of $\left(X_{t}, t \geqslant 0\right)$. [You are not required to justify this construction.]

(ii) A population consists of $N_{t}$ individuals at time $t \geqslant 0$. We assume that each individual gives birth to a new individual at constant rate $\lambda>0$. As the population is competing for resources, we assume that for each $n \geqslant 1$, if $N_{t}=n$, then any individual in the population at time $t$ dies in the time interval $[t, t+h)$ with probability $\delta_{n} h+o(h)$, where $\left(\delta_{n}\right)_{n=1}^{\infty}$ is a given sequence satisfying $\delta_{1}=0, \delta_{n}>0$ for $n \geqslant 2$. Formulate a Markov chain model for $\left(N_{t}, t \geqslant 0\right)$ and write down the $Q$-matrix explicitly. Then find a necessary and sufficient condition on $\left(\delta_{n}\right)_{n=1}^{\infty}$ so that the Markov chain has an invariant distribution. Compute the invariant distribution in the case where $\delta_{n}=\mu(n-1)$ and $\mu>0$.
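In the case $\delta_{n}=\mu(n-1)$ the chain is a birth-death chain (total birth rate $n\lambda$, total death rate $n\delta_n$), so detailed balance suggests $\pi_{n} \propto (\lambda/\mu)^{n}/n!$ on $n \geqslant 1$ (state 0 is never reached since $\delta_1=0$), i.e. a zero-truncated Poisson. A numerical sketch with illustrative parameters checking detailed balance and normalisation:

```python
import math

# Illustrative parameters; delta_n = mu * (n - 1), so from state n the birth
# rate is n * lambda and the total death rate is n * delta_n = n * mu * (n - 1).
lam, mu = 1.5, 2.0
r = lam / mu
Z = math.exp(r) - 1.0  # normaliser of the zero-truncated Poisson on n >= 1
pi = {n: r ** n / math.factorial(n) / Z for n in range(1, 60)}

# Detailed balance: pi_n * n * lam == pi_{n+1} * (n + 1) * mu * n.
db_residual = max(abs(pi[n] * n * lam - pi[n + 1] * (n + 1) * mu * n)
                  for n in range(1, 58))
mass = sum(pi.values())
print(db_residual, mass)  # residual near 0, mass near 1
```

The necessary and sufficient condition asked for amounts to the normalising sum $\sum_n \prod_{k} (k\lambda)/((k+1)\delta_{k+1}(k+1))$-type products being finite, which the code's convergent $Z$ illustrates in this special case.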

• # Paper 2, Section II, J

(i) Explain what the Moran model and the infinite alleles model are. State Ewens' sampling formula for the distribution of the allelic frequency spectrum $\left(a_{1}, \ldots, a_{n}\right)$ in terms of $\theta$ where $\theta=N u$ with $u$ denoting the mutation rate per individual and $N$ the population size.

Let $K_{n}$ be the number of allelic types in a sample of size $n$. Give, without justification, an expression for $\mathbb{E}\left(K_{n}\right)$ in terms of $\theta$.

(ii) Let $K_{n}$ and $\theta$ be as above. Show that for $1 \leqslant k \leqslant n$ we have that

$P\left(K_{n}=k\right)=C \frac{\theta^{k}}{\theta(\theta+1) \cdots(\theta+n-1)}$

for some constant $C$ that does not depend on $\theta$.

Show that, given $\left\{K_{n}=k\right\}$, the distribution of the allelic frequency spectrum $\left(a_{1}, \ldots, a_{n}\right)$ does not depend on $\theta$.

Show that the value of $\theta$ which maximises $\mathbb{P}\left(K_{n}=k\right)$ is the one for which $k=\mathbb{E}\left(K_{n}\right)$.
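An editorial sketch of the last part (not part of the question): maximising $\log \mathbb{P}(K_n=k) = k\log\theta - \sum_{i=0}^{n-1}\log(\theta+i) + \text{const}$ over $\theta$ gives the first-order condition $k = \sum_{i=0}^{n-1}\theta/(\theta+i) = \mathbb{E}(K_n)$, which can be solved numerically by bisection since $\mathbb{E}(K_n)$ is increasing in $\theta$. The sample values $k=5$, $n=50$ are illustrative.

```python
def expected_K(theta, n):
    # Standard Ewens expression: E(K_n) = sum_{i=0}^{n-1} theta / (theta + i)
    return sum(theta / (theta + i) for i in range(n))

def mle_theta(k, n, lo=1e-9, hi=1e6):
    # E(K_n) is strictly increasing in theta, so bisect E(K_n) = k
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if expected_K(mid, n) < k:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

theta_hat = mle_theta(k=5, n=50)
print(theta_hat, expected_K(theta_hat, 50))   # E(K_n) at theta_hat equals k = 5
```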

comment
• # Paper 3, Section II, J

(i) Define a Poisson process $\left(N_{t}, t \geqslant 0\right)$ with intensity $\lambda$. Specify without justification the distribution of $N_{t}$. Let $T_{1}, T_{2}, \ldots$ denote the jump times of $\left(N_{t}, t \geqslant 0\right)$. Derive the joint distribution of $\left(T_{1}, \ldots, T_{n}\right)$ given $\left\{N_{t}=n\right\}$.

(ii) Let $\left(N_{t}, t \geqslant 0\right)$ be a Poisson process with intensity $\lambda>0$ and let $X_{1}, X_{2}, \ldots$ be a sequence of i.i.d. random variables, independent of $\left(N_{t}, t \geqslant 0\right)$, all having the same distribution as a random variable $X$. Show that if $g(s, x)$ is a real-valued function of real variables $s, x$, and $T_{j}$ are the jump times of $\left(N_{t}, t \geqslant 0\right)$ then

$\mathbb{E}\left[\exp \left\{\theta \sum_{j=1}^{N_{t}} g\left(T_{j}, X_{j}\right)\right\}\right]=\exp \left\{\lambda \int_{0}^{t}\left(\mathbb{E}\left(e^{\theta g(s, X)}\right)-1\right) d s\right\}$

for all $\theta \in \mathbb{R}$. [Hint: Condition on $\left\{N_{t}=n\right\}$ and $T_{1}, \ldots, T_{n}$, using (i).]

(iii) A university library is open from 9 am to $5 \mathrm{pm}$. Students arrive at times of a Poisson process with intensity $\lambda$. Each student spends a random amount of time in the library, independently of the other students. These times are identically distributed for all students and have the same distribution as a random variable $X$. Show that the number of students in the library at $5 \mathrm{pm}$ is a Poisson random variable with a mean that you should specify.
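An illustrative Monte Carlo check of part (iii) (an editorial sketch, not part of the question): by part (i), given the number of arrivals in $[0,t]$ the arrival times are i.i.d. uniform, so the count at closing can be simulated directly and compared with the Poisson mean $\lambda \int_0^t \mathbb{P}(X>u)\,du$. Exponential stay times and the parameter values are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, t = 5.0, 8.0            # arrivals at 5/hour; library open t = 8 hours
mu = 1.0                     # assumed stay times X ~ Exp(mu)

def students_at_closing():
    n = rng.poisson(lam * t)                 # total arrivals in [0, t]
    arrivals = rng.uniform(0.0, t, size=n)   # given N_t = n, arrivals are i.i.d. U[0,t]
    stays = rng.exponential(1.0 / mu, size=n)
    return np.sum(arrivals + stays > t)      # still inside at closing time

samples = np.array([students_at_closing() for _ in range(20000)])

# Theory: count is Poisson with mean lam * int_0^t P(X > u) du
#        = (lam/mu) * (1 - exp(-mu*t)) for Exp(mu) stays
mean_theory = lam / mu * (1.0 - np.exp(-mu * t))
print(samples.mean(), samples.var(), mean_theory)
```

Both the sample mean and the sample variance should match `mean_theory`, consistent with a Poisson law.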

comment
• # Paper 4, Section II, J

(i) Define the $M / M / 1$ queue with arrival rate $\lambda$ and service rate $\mu$. Find conditions on the parameters $\lambda$ and $\mu$ for the queue to be transient, null recurrent, and positive recurrent, briefly justifying your answers. In the last case give with justification the invariant distribution explicitly. Answer the same questions for an $M / M / \infty$ queue.

(ii) At a taxi station, customers arrive at a rate of 3 per minute, and taxis at a rate of 2 per minute. Suppose that a taxi will wait no matter how many other taxis are present. However, if a person arriving does not find a taxi waiting he or she leaves to find alternative transportation.

Find the long-run proportion of arriving customers who get taxis, and find the average number of taxis waiting in the long run.

An agent helps to assign customers to taxis, and so long as there are taxis waiting he is unable to have his coffee. Once a taxi arrives, how long will it take on average before he can have another sip of his coffee?
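An editorial simulation sketch for part (ii) (not part of the question): the number of waiting taxis can be viewed as an $M/M/1$ queue with arrival rate 2 (taxis) and service rate 3 (customers, active only while a taxi is present). This reduction predicts that a fraction $\rho=2/3$ of customers find a taxi and that on average $\rho/(1-\rho)=2$ taxis wait; the simulation horizon below is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
taxi_rate, cust_rate = 2.0, 3.0       # taxis and customers per minute
total_rate = taxi_rate + cust_rate

n, T, horizon = 0, 0.0, 50000.0       # n = number of taxis waiting
taxi_time = 0.0                       # time-integral of n
served = arrived = 0
while T < horizon:
    dt = rng.exponential(1.0 / total_rate)
    taxi_time += n * dt
    T += dt
    if rng.random() < taxi_rate / total_rate:
        n += 1                         # a taxi arrives and waits
    else:
        arrived += 1
        if n > 0:
            n -= 1                     # the customer takes a waiting taxi
            served += 1                # otherwise the customer leaves

print(served / arrived)    # long-run fraction served, approx rho = 2/3
print(taxi_time / T)       # mean number of taxis waiting, approx 2
```

The same reduction answers the coffee question: once a taxi arrives, the time until no taxis remain is an $M/M/1$ busy period, with mean $1/(\mu-\lambda) = 1/(3-2) = 1$ minute.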

comment

• # Paper 1, Section II, J

Let $\left(X_{t}, t \geqslant 0\right)$ be a Markov chain on $\{0,1, \ldots\}$ with $Q$-matrix given by

\begin{aligned} q_{n, n+1} &=\lambda_{n} \\ q_{n, 0} &=\lambda_{n} \varepsilon_{n} \quad(n>0) \\ q_{n, m} &=0 \quad \text { if } m \notin\{0, n, n+1\} \end{aligned}

where $\varepsilon_{n}, \lambda_{n}>0$.

(i) Show that $X$ is transient if and only if $\sum_{n} \varepsilon_{n}<\infty$. [You may assume without proof that $x(1-\delta) \leqslant \log (1+x) \leqslant x$ for all $\delta>0$ and all sufficiently small positive $x$.]

(ii) Assume that $\sum_{n} \varepsilon_{n}<\infty$. Find a necessary and sufficient condition for $X$ to be almost surely explosive. [You may assume without proof standard results about pure birth processes, provided that they are stated clearly.]

(iii) Find a stationary measure for $X$. For the case $\lambda_{n}=\lambda$ and $\varepsilon_{n}=\alpha /(n+1)(\lambda, \alpha>0)$, show that $X$ is positive recurrent if and only if $\alpha>1$.

comment
• # Paper 2, Section II, J

(i) Define a Poisson process as a Markov chain on the non-negative integers and state three other characterisations.

(ii) Let $\lambda(s)(s \geqslant 0)$ be a continuous positive function. Let $\left(X_{t}, t \geqslant 0\right)$ be a right-continuous process with independent increments, such that

\begin{aligned} \mathbb{P}\left(X_{t+h}=X_{t}+1\right) &=\lambda(t) h+o(h) \\ \mathbb{P}\left(X_{t+h}=X_{t}\right) &=1-\lambda(t) h+o(h) \end{aligned}

where the $o(h)$ terms are uniform in $t \in[0, \infty)$. Show that $X_{t}$ is a Poisson random variable with parameter $\Lambda(t)=\int_{0}^{t} \lambda(s) d s$.
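An illustrative check of part (ii) by thinning (an editorial sketch; the intensity $\lambda(s)=2+\cos s$ and horizon are arbitrary choices): points of a homogeneous Poisson process of rate $\lambda_{\max}$ on $[0,t]$, retained independently with probability $\lambda(s)/\lambda_{\max}$, form an inhomogeneous Poisson process with intensity $\lambda$, so the resulting count should be Poisson with mean $\Lambda(t)$.

```python
import numpy as np

rng = np.random.default_rng(2)
t, lam_max = 5.0, 3.0
lam = lambda s: 2.0 + np.cos(s)          # continuous intensity, bounded by lam_max

def count_by_thinning():
    # simulate a rate-lam_max Poisson process on [0, t], then keep each
    # point s independently with probability lam(s) / lam_max
    n = rng.poisson(lam_max * t)
    pts = rng.uniform(0.0, t, size=n)
    return np.sum(rng.uniform(size=n) < lam(pts) / lam_max)

samples = np.array([count_by_thinning() for _ in range(20000)])
Lambda = 2.0 * t + np.sin(t)             # Lambda(t) = int_0^t lam(s) ds
print(samples.mean(), samples.var(), Lambda)
```

Mean and variance should both approximate $\Lambda(t)$, as for a Poisson random variable.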

(iii) Let $X=\left(X_{n}: n=1,2, \ldots\right)$ be a sequence of independent and identically distributed positive random variables with continuous density function $f$. We define the sequence of successive records, $\left(K_{n}, n=0,1, \ldots\right)$, by $K_{0}:=0$ and, for $n \geqslant 0$,

$K_{n+1}:=\inf \left\{m>K_{n}: X_{m}>X_{K_{n}}\right\}$

The record process, $\left(R_{t}, t \geqslant 0\right)$, is then defined by

$R_{t}:=\#\left\{n \geqslant 1: X_{K_{n}} \leqslant t\right\}$

Explain why the increments of $R$ are independent. Show that $R_{t}$ is a Poisson random variable with parameter $-\log \{1-F(t)\}$ where $F(t)=\int_{0}^{t} f(s) d s$.

[You may assume the following without proof: For fixed $t>0$, let $Y$ (respectively, $Z$ ) be the subsequence of $X$ obtained by retaining only those elements that are greater than (respectively, smaller than) $t$. Then $Y$ (respectively, $Z$ ) is a sequence of independent variables each having the distribution of $X_{1}$ conditioned on $X_{1}>t$ (respectively, $X_{1}<t$ ); and $Y$ and $Z$ are independent.]
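An editorial simulation of the record process above (not part of the question): for Exp(1) samples, $F(t)=1-e^{-t}$, so the claimed Poisson parameter $-\log\{1-F(t)\}$ is simply $t$, which makes a convenient check. Counting the first draw as a record is an assumed convention of the sketch.

```python
import numpy as np

rng = np.random.default_rng(3)
t = 2.0   # with Exp(1) samples, -log(1 - F(t)) = t

def records_below(t):
    # draw i.i.d. Exp(1) variables until a record exceeds level t,
    # counting the records whose value is at most t
    # (the first draw is always a record)
    best, count = -np.inf, 0
    while best <= t:
        x = rng.exponential()
        if x > best:
            best = x
            if x <= t:
                count += 1
    return count

samples = np.array([records_below(t) for _ in range(20000)])
print(samples.mean(), samples.var())   # both approximately t = 2
```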

comment
• # Paper 3, Section II, J

Define the Moran model. Describe briefly the infinite sites model of mutations.

We henceforth consider a population with $N$ individuals evolving according to the rules of the Moran model. In addition we assume:

• the allelic type of any individual at any time lies in a given countable state space $S$;

• individuals are subject to mutations at constant rate $u=\theta / N$, independently of the population dynamics;

• each time a mutation occurs, if the allelic type of the individual was $x \in S$, it changes to $y \in S$ with probability $P(x, y)$, where $P(x, y)$ is a given Markovian transition matrix on $S$ that is symmetric:

$P(x, y)=P(y, x) \quad(x, y \in S)$

(i) Show that, if two individuals are sampled at random from the population at some time $t$, then the time to their most recent common ancestor has an exponential distribution, with a parameter that you should specify.

(ii) Let $\Delta+1$ be the total number of mutations that accumulate on the two branches separating these individuals from their most recent common ancestor. Show that $\Delta+1$ is a geometric random variable, and specify its probability parameter $p$.

(iii) The first individual is observed to be of type $x \in S$. Explain why the probability that the second individual is also of type $x$ is

$\mathbb{P}\left(X_{\Delta}=x \mid X_{0}=x\right),$

where $\left(X_{n}, n \geqslant 0\right)$ is a Markov chain on $S$ with transition matrix $P$ and is independent of $\Delta$.

comment
• # Paper 4, Section II, J

(i) Define an $M / M / 1$ queue. Justifying briefly your answer, specify when this queue has a stationary distribution, and identify that distribution. State and prove Burke's theorem for this queue.

(ii) Let $\left(L_{1}(t), \ldots, L_{N}(t), t \geqslant 0\right)$ denote a Jackson network of $N$ queues, where the entrance and service rates for queue $i$ are respectively $\lambda_{i}$ and $\mu_{i}$, and each customer leaving queue $i$ moves to queue $j$ with probability $p_{i j}$ after service. We assume $\sum_{j} p_{i j}<1$ for each $i=1, \ldots, N$; with probability $1-\sum_{j} p_{i j}$ a customer leaving queue $i$ departs from the system. State Jackson's theorem for this network. [You are not required to prove it.] Are the processes $\left(L_{1}(t), \ldots, L_{N}(t), t \geqslant 0\right)$ independent at equilibrium? Justify your answer.

(iii) Let $D_{i}(t)$ be the process of final departures from queue $i$. Show that, at equilibrium, $\left(L_{1}(t), \ldots, L_{N}(t)\right)$ is independent of $\left(D_{i}(s), 1 \leqslant i \leqslant N, 0 \leqslant s \leqslant t\right)$. Show that, for each fixed $i=1, \ldots, N,\left(D_{i}(t), t \geqslant 0\right)$ is a Poisson process, and specify its rate.

comment

• # Paper 1, Section II, $27 \mathrm{~K}$

(a) Give the definition of a Poisson process $\left(N_{t}, t \geqslant 0\right)$ with rate $\lambda$, using its transition rates. Show that for each $t \geqslant 0$, the distribution of $N_{t}$ is Poisson with a parameter to be specified.

Let $J_{0}=0$ and let $J_{1}, J_{2}, \ldots$ denote the jump times of $\left(N_{t}, t \geqslant 0\right)$. What is the distribution of $\left(J_{n+1}-J_{n}, n \geqslant 0\right) ?$ (You do not need to justify your answer.)

(b) Let $n \geqslant 1$. Compute the joint probability density function of $\left(J_{1}, J_{2}, \ldots, J_{n}\right)$ given $\left\{N_{t}=n\right\}$. Deduce that, given $\left\{N_{t}=n\right\},\left(J_{1}, \ldots, J_{n}\right)$ has the same distribution as the nondecreasing rearrangement of $n$ independent uniform random variables on $[0, t]$.

(c) Starting from time 0, passengers arrive on platform $9 \mathrm{~B}$ at King's Cross station, with constant rate $\lambda>0$, in order to catch a train due to depart at time $t>0$. Using the above results, or otherwise, find the expected total time waited by all passengers (the sum of all passengers' waiting times).
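A quick numerical check of part (c) (an editorial sketch with arbitrary parameters): a passenger arriving at time $s$ waits $t-s$, and by part (b) the arrival times are conditionally uniform on $[0,t]$, so the expected total wait is $\lambda \int_0^t (t-s)\,ds = \lambda t^2/2$.

```python
import numpy as np

rng = np.random.default_rng(4)
lam, t = 2.0, 10.0   # illustrative arrival rate and departure time

def total_wait():
    n = rng.poisson(lam * t)
    arrivals = rng.uniform(0.0, t, size=n)   # given N_t = n, arrivals are i.i.d. U[0,t]
    return np.sum(t - arrivals)              # each passenger waits t - arrival time

samples = np.array([total_wait() for _ in range(20000)])
print(samples.mean(), lam * t ** 2 / 2)      # both approximately 100
```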

comment
• # Paper 2, Section II, K

(a) A colony of bacteria evolves as follows. Let $X$ be a random variable with values in the positive integers. Each bacterium splits into $X$ copies of itself after an exponentially distributed time of parameter $\lambda>0$. Each of the $X$ daughters then splits in the same way but independently of everything else. This process keeps going forever. Let $Z_{t}$ denote the number of bacteria at time $t$. Specify the $Q$-matrix of the Markov chain $Z=\left(Z_{t}, t \geqslant 0\right)$. [It will be helpful to introduce $p_{n}=\mathbb{P}(X=n)$, and you may assume for simplicity that $\left.p_{0}=p_{1}=0 .\right]$

(b) Using the Kolmogorov forward equation, or otherwise, show that if $u(t)=$ $\mathbb{E}\left(Z_{t} \mid Z_{0}=1\right)$, then $u^{\prime}(t)=\alpha u(t)$ for some $\alpha$ to be explicitly determined in terms of $X$. Assuming that $\mathbb{E}(X)<\infty$, deduce the value of $u(t)$ for all $t \geqslant 0$, and show that $Z$ does not explode. [You may differentiate series term by term and exchange the order of summation without justification.]

(c) We now assume that $X=2$ with probability 1. Fix $0<q<1$ and let $\phi(t)=\mathbb{E}\left(q^{Z_{t}} \mid Z_{0}=1\right)$. Show that $\phi$ satisfies

$\phi(t)=q e^{-\lambda t}+\int_{0}^{t} \lambda e^{-\lambda s} \phi(t-s)^{2} d s$

By making the change of variables $u=t-s$, show that $d \phi / d t=\lambda \phi(\phi-1)$. Deduce that for all $n \geqslant 1, \mathbb{P}\left(Z_{t}=n \mid Z_{0}=1\right)=\beta^{n-1}(1-\beta)$ where $\beta=1-e^{-\lambda t}$.
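An illustrative simulation of part (c) (not part of the question; the values of $\lambda$ and $t$ are arbitrary): with $X=2$ the colony is a Yule process, which can be simulated by Gillespie's method, and the geometric law $\mathbb{P}(Z_t=n)=\beta^{n-1}(1-\beta)$ with $\beta=1-e^{-\lambda t}$ predicts mean $e^{\lambda t}$ and $\mathbb{P}(Z_t=1)=e^{-\lambda t}$.

```python
import numpy as np

rng = np.random.default_rng(5)
lam, t = 1.0, 1.5

def yule_population():
    # Gillespie simulation: with k particles the next split occurs after
    # an Exp(k*lam) time and increases the population by one
    k, s = 1, 0.0
    while True:
        s += rng.exponential(1.0 / (k * lam))
        if s > t:
            return k
        k += 1

samples = np.array([yule_population() for _ in range(20000)])
beta = 1.0 - np.exp(-lam * t)
print(samples.mean(), 1.0 / (1.0 - beta))      # geometric mean e^{lam*t}
print((samples == 1).mean(), np.exp(-lam * t)) # P(Z_t = 1) = 1 - beta
```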

comment
• # Paper 3, Section II, K

We consider a system of two queues in tandem, as follows. Customers arrive in the first queue at rate $\lambda$. Each arriving customer is immediately served by one of infinitely many servers at rate $\mu_{1}$. Immediately after service, customers join a single-server second queue which operates on a first-come, first-served basis, and has a service rate $\mu_{2}$. After service in this second queue, each customer returns to the first queue with probability $0<1-p<1$, and otherwise leaves the system forever. A schematic representation is given below:

(a) Let $M_{t}$ and $N_{t}$ denote the number of customers at time $t$ in queues number 1 and 2 respectively, including those currently in service at time $t$. Give the transition rates of the Markov chain $\left(M_{t}, N_{t}\right)_{t \geqslant 0}$.

(b) Write down an equation satisfied by any invariant measure $\pi$ for this Markov chain. Let $\alpha>0$ and $\beta \in(0,1)$. Define a measure $\pi$ by

$\pi(m, n):=e^{-\alpha} \frac{\alpha^{m}}{m !} \beta^{n}(1-\beta), \quad m, n \in\{0,1, \ldots\}$

Show that it is possible to find $\alpha>0, \beta \in(0,1)$ so that $\pi$ is an invariant measure of $\left(M_{t}, N_{t}\right)_{t \geqslant 0}$, if and only if $\lambda<\mu_{2} p$. Give the values of $\alpha$ and $\beta$ in this case.

(c) Assume now that $\lambda p>\mu_{2}$. Show that the number of customers is not positive recurrent.

[Hint. One way to solve the problem is as follows. Assume it is positive recurrent. Observe that $M_{t}$ is greater than an $M / M / \infty$ queue with arrival rate $\lambda$. Deduce that $N_{t}$ is greater than an $M / M / 1$ queue with arrival rate $\lambda p$ and service rate $\mu_{2}$. You may use without proof the fact that the departure process from the first queue is then, at equilibrium, a Poisson process with rate $\lambda$, and you may use without proof properties of thinned Poisson processes.]

comment
• # Paper 4, Section II, $26 K$

(a) Define the Moran model and Kingman's $n$-coalescent. State and prove a theorem which describes the relationship between them. [You may use without proof a construction of the Moran model for all $-\infty<t<\infty$.]

(b) Let $\theta>0$. Suppose that a population of $N \geqslant 2$ individuals evolves according to the rules of the Moran model. Assume also that each individual in the population undergoes a mutation at constant rate $u=\theta /(N-1)$. Each time a mutation occurs, we assume that the allelic type of the corresponding individual changes to an entirely new type, never seen before in the population. Let $p(\theta)$ be the homozygosity probability, i.e., the probability that two individuals sampled without replacement from the population have the same genetic type. Give an expression for $p(\theta)$.

(c) Let $q(\theta)$ denote the probability that a sample of size $n$ consists of one allelic type (monomorphic population). Show that $q(\theta)=\mathbb{E}\left(\exp \left\{-(\theta / 2) L_{n}\right\}\right)$, where $L_{n}$ denotes the sum of all the branch lengths in the genealogical tree of the sample - that is, $L_{n}=\sum_{i=2}^{n} i\left(\tau_{i}-\tau_{i-1}\right)$, where $\tau_{i}$