
# Part IA, 2013, Paper 2


Paper 2, Section I, A

Use the transformation $z=\ln x$ to solve

$\ddot{z}=-\dot{z}^{2}-1-e^{-z}$

subject to the conditions $z=0$ and $\dot{z}=V$ at $t=0$, where $V$ is a positive constant.

Show that when $\dot{z}(t)=0$

$z=\ln \left(\sqrt{V^{2}+4}-1\right)$
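As a sanity check (not part of the question), one can integrate the equation numerically for a sample value of $V$ and compare $z$ at the instant $\dot{z}=0$ with the closed form above; the value $V=1$ below is an arbitrary choice.

```python
import numpy as np
from scipy.integrate import solve_ivp

V = 1.0  # arbitrary sample value of the positive constant V

def rhs(t, s):
    z, zdot = s
    return [zdot, -zdot**2 - 1 - np.exp(-z)]

def zdot_vanishes(t, s):
    return s[1]
zdot_vanishes.terminal = True
zdot_vanishes.direction = -1  # zdot starts at V > 0 and decreases

sol = solve_ivp(rhs, (0, 10), [0.0, V], events=zdot_vanishes,
                rtol=1e-10, atol=1e-12)
z_at_stop = sol.y_events[0][0][0]
print(z_at_stop, np.log(np.sqrt(V**2 + 4) - 1))  # the two agree
```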

Paper 2, Section I, A

Solve the equation

$\ddot{y}-\dot{y}-2 y=3 e^{2 t}+3 e^{-t}+3+6 t$

subject to the conditions $y=\dot{y}=0$ at $t=0$.
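A symbolic cross-check with `sympy.dsolve` (a sketch, not the intended hand solution by complementary function and particular integral):

```python
import sympy as sp

t = sp.symbols('t')
y = sp.Function('y')
ode = sp.Eq(y(t).diff(t, 2) - y(t).diff(t) - 2*y(t),
            3*sp.exp(2*t) + 3*sp.exp(-t) + 3 + 6*t)
sol = sp.dsolve(ode, y(t), ics={y(0): 0, y(t).diff(t).subs(t, 0): 0})
# the right-hand side simplifies to (1 + t)*exp(2*t) - (1 + t)*exp(-t) - 3*t
print(sp.expand(sol.rhs))
```

Note that both forcing terms $e^{2t}$ and $e^{-t}$ resonate with the homogeneous solutions, which is why factors of $t$ appear.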

Paper 2, Section II, 6A

Consider the function

$f(x, y)=\left(x^{2}-y^{4}\right)\left(1-x^{2}-y^{4}\right)$

Determine the type of each of the nine critical points.

Sketch contours of constant $f(x, y)$.
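The partial derivatives factor nicely, which is what produces a $3 \times 3$ grid of critical points; a small `sympy` sketch (illustrative, not required for the question):

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)
f = (x**2 - y**4) * (1 - x**2 - y**4)
fx, fy = sp.diff(f, x), sp.diff(f, y)
print(sp.factor(fx))  # proportional to x*(1 - 2*x**2)
print(sp.factor(fy))  # proportional to y**3*(1 - 2*y**4)
crit = sp.solve([fx, fy], [x, y], dict=True)
print(len(crit))  # 9 real critical points
```

Beware that the second-derivative test is inconclusive at some of these points (e.g. the Hessian at the origin is degenerate), so the local behaviour of $f$ must be examined directly there.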

Paper 2, Section II, A

The function $y(x)$ satisfies the equation

$y^{\prime \prime}+p(x) y^{\prime}+q(x) y=0 .$

Give the definitions of the terms ordinary point, singular point, and regular singular point for this equation.

For the equation

$x y^{\prime \prime}+y=0$

classify the point $x=0$ according to your definitions. Find the series solution about $x=0$ which satisfies

$y=0 \quad \text { and } \quad y^{\prime}=1 \quad \text { at } x=0$

For a second solution with $y=1$ at $x=0$, consider an expansion

$y(x)=y_{0}(x)+y_{1}(x)+y_{2}(x)+\ldots,$

where $y_{0}=1$ and $x y_{n+1}^{\prime \prime}=-y_{n}$. Find $y_{1}$ and $y_{2}$ which have $y_{n}(0)=0$ and $y_{n}^{\prime}(1)=0$. Comment on $y^{\prime}$ near $x=0$ for this second solution.
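The iteration for the second solution can be checked symbolically; the sketch below fixes the integration constants exactly as stated, using $y_n'(1)=0$ and $y_n(0)=0$. The $-\ln x$ term that appears in $y_1'$ is the source of the behaviour of $y'$ near $x=0$.

```python
import sympy as sp

x = sp.symbols('x', positive=True)

def next_term(yn):
    # solve x * y_{n+1}'' = -y_n, fixing constants by y'(1) = 0, y(0) = 0
    yp = sp.integrate(-yn / x, x)
    yp = yp - yp.subs(x, 1)               # enforce y_{n+1}'(1) = 0
    yterm = sp.integrate(yp, x)
    yterm = yterm - sp.limit(yterm, x, 0)  # enforce y_{n+1}(0) = 0
    return sp.expand(yterm)

y1 = next_term(sp.Integer(1))
y2 = next_term(y1)
print(y1)  # x - x*log(x), so y1' = -log(x) diverges as x -> 0+
print(y2)
```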

Paper 2, Section II, A

Medical equipment is sterilised by placing it in a hot oven for a time $T$ and then removing it and letting it cool for the same time. The equipment at temperature $\theta(t)$ warms and cools at a rate equal to the product of a constant $\alpha$ and the difference between its temperature and its surroundings, $\theta_{1}$ when warming in the oven and $\theta_{0}$ when cooling outside. The equipment starts the sterilisation process at temperature $\theta_{0}$.

Bacteria are killed by the heat treatment. Their number $N(t)$ decreases at a rate equal to the product of the current number and a destruction factor $\beta$. This destruction factor varies linearly with temperature, vanishing at $\theta_{0}$ and having a maximum $\beta_{\max }$ at $\theta_{1}$.

Find an implicit equation for $T$ such that the number of bacteria is reduced by a factor of $10^{-20}$ by the sterilisation process.

A second hardier species of bacteria requires the oven temperature to be increased to achieve the same destruction factor $\beta_{\max }$. How is the sterilisation time $T$ affected?
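The model can be explored numerically. The sketch below uses arbitrary illustrative values of $\alpha$ and $\beta_{\max}$ (and works with the temperature fraction $(\theta-\theta_0)/(\theta_1-\theta_0)$, so the temperatures themselves drop out): it integrates the destruction factor over one warm–cool cycle and solves for $T$ with a root finder, then compares with the closed form of the same condition, $\beta_{\max}\bigl(T - u(1-u)/\alpha\bigr) = 20\ln 10$ with $u = e^{-\alpha T}$.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

alpha, beta_max = 1.0, 5.0     # hypothetical illustrative parameters
target = 20 * np.log(10)       # N reduced to 1e-20 of its initial value

def beta_integral(T):
    # (theta - theta0)/(theta1 - theta0) while warming, then while cooling
    warm = lambda t: 1 - np.exp(-alpha * t)
    cool = lambda t: (1 - np.exp(-alpha * T)) * np.exp(-alpha * (t - T))
    I1, _ = quad(warm, 0, T)
    I2, _ = quad(cool, T, 2 * T)
    return beta_max * (I1 + I2)

T = brentq(lambda T: beta_integral(T) - target, 1e-6, 100)
u = np.exp(-alpha * T)
print(T, beta_max * (T - u * (1 - u) / alpha), target)
```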

Paper 2, Section II, A

Find $x(t)$ and $y(t)$ which satisfy

$\begin{aligned} &3 \dot{x}+\dot{y}+5 x-y=2 e^{-t}+4 e^{-3 t} \\ &\dot{x}+4 \dot{y}-2 x+7 y=-3 e^{-t}+5 e^{-3 t} \end{aligned}$

subject to $x=y=0$ at $t=0$.
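`sympy` can solve the coupled system directly as a cross-check (a sketch; the exam expects hand elimination or decoupling):

```python
import sympy as sp

t = sp.symbols('t')
x, y = sp.Function('x'), sp.Function('y')
eqs = [
    sp.Eq(3*x(t).diff(t) + y(t).diff(t) + 5*x(t) - y(t),
          2*sp.exp(-t) + 4*sp.exp(-3*t)),
    sp.Eq(x(t).diff(t) + 4*y(t).diff(t) - 2*x(t) + 7*y(t),
          -3*sp.exp(-t) + 5*sp.exp(-3*t)),
]
sol = sp.dsolve(eqs, [x(t), y(t)], ics={x(0): 0, y(0): 0})
res = {s.lhs: sp.simplify(s.rhs) for s in sol}
# x(t) simplifies to exp(-t) - exp(-3*t), and y(t) to 0
print(res)
```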

Paper 2, Section I, F

(i) Let $X$ be a random variable. Use Markov's inequality to show that

$\mathbb{P}(X \geqslant k) \leqslant \mathbb{E}\left(e^{t X}\right) e^{-k t}$

for all $t \geqslant 0$ and real $k$.

(ii) Calculate $\mathbb{E}\left(e^{t X}\right)$ in the case where $X$ is a Poisson random variable with parameter $\lambda=1$. Using the inequality from part (i) with a suitable choice of $t$, prove that

$\frac{1}{k !}+\frac{1}{(k+1) !}+\frac{1}{(k+2) !}+\ldots \leqslant\left(\frac{e}{k}\right)^{k}$

for all $k \geqslant 1$.
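The bound in (ii) is easy to spot-check numerically (an illustrative check, not part of the proof):

```python
import math

# spot-check: sum_{j >= k} 1/j!  <=  (e/k)^k  for small k
for k in range(1, 10):
    tail = sum(1 / math.factorial(j) for j in range(k, k + 200))
    bound = (math.e / k) ** k
    assert tail <= bound
    print(k, tail, bound)
```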

Paper 2, Section I, F

Let $X$ be a random variable with mean $\mu$ and variance $\sigma^{2}$. Let

$G(a)=\mathbb{E}\left[(X-a)^{2}\right]$

Show that $G(a) \geqslant \sigma^{2}$ for all $a$. For what value of $a$ is there equality?

Let

$H(a)=\mathbb{E}[|X-a|]$

Supposing that $X$ has probability density function $f$, express $H(a)$ in terms of $f$. Show that $H$ is minimised when $a$ is such that $\int_{-\infty}^{a} f(x) d x=1 / 2$.
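An empirical illustration of the two minimisers (the mean for squared loss, the median for absolute loss), using an arbitrary Exponential(1) sample:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(size=100_000)   # any distribution would do here
a_grid = np.linspace(0, 3, 301)
G = [np.mean((x - a) ** 2) for a in a_grid]
H = [np.mean(np.abs(x - a)) for a in a_grid]
print(a_grid[np.argmin(G)], x.mean())       # minimiser of G is near the mean
print(a_grid[np.argmin(H)], np.median(x))   # minimiser of H is near the median
```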

Paper 2, Section II, F

Let $\Omega$ be the sample space of a probabilistic experiment, and suppose that the sets $B_{1}, B_{2}, \ldots, B_{k}$ are a partition of $\Omega$ into events of positive probability. Show that

$\mathbb{P}\left(B_{i} \mid A\right)=\frac{\mathbb{P}\left(A \mid B_{i}\right) \mathbb{P}\left(B_{i}\right)}{\sum_{j=1}^{k} \mathbb{P}\left(A \mid B_{j}\right) \mathbb{P}\left(B_{j}\right)}$

for any event $A$ of positive probability.

A drawer contains two coins. One is an unbiased coin, which when tossed, is equally likely to turn up heads or tails. The other is a biased coin, which will turn up heads with probability $p$ and tails with probability $1-p$. One coin is selected (uniformly) at random from the drawer. Two experiments are performed:

(a) The selected coin is tossed $n$ times. Given that the coin turns up heads $k$ times and tails $n-k$ times, what is the probability that the coin is biased?

(b) The selected coin is tossed repeatedly until it turns up heads $k$ times. Given that the coin is tossed $n$ times in total, what is the probability that the coin is biased?
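In both experiments the combinatorial factors in the two likelihoods are identical and so cancel in Bayes' formula, leaving the same posterior as a function of $n$ and $k$. A small exact-arithmetic sketch (the values $n=10$, $k=7$, $p=3/4$ are arbitrary illustrations):

```python
from fractions import Fraction

def posterior_biased(n, k, p):
    # likelihoods up to a common combinatorial factor, which cancels;
    # the 1/2 prior on each coin also cancels
    biased = p**k * (1 - p)**(n - k)
    fair = Fraction(1, 2)**n
    return biased / (biased + fair)

print(posterior_biased(10, 7, Fraction(3, 4)))
```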

Paper 2, Section II, F

Let $X$ be a geometric random variable with $\mathbb{P}(X=1)=p$. Derive formulae for $\mathbb{E}(X)$ and $\operatorname{Var}(X)$ in terms of $p$.

A jar contains $n$ balls. Initially, all of the balls are red. Every minute, a ball is drawn at random from the jar, and then replaced with a green ball. Let $T$ be the number of minutes until the jar contains only green balls. Show that the expected value of $T$ is $n \sum_{i=1}^{n} 1 / i$. What is the variance of $T ?$
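Here $T$ is a sum of independent geometric random variables with success probabilities $i/n$ for $i=n, n-1, \ldots, 1$, which is where the harmonic-sum mean comes from. A quick simulation check with an arbitrary small $n$:

```python
import random

def simulate_T(n, rng):
    balls = ['r'] * n
    t = 0
    while 'r' in balls:
        t += 1
        balls[rng.randrange(n)] = 'g'  # drawn ball replaced by a green one
    return t

rng = random.Random(0)
n = 5
samples = [simulate_T(n, rng) for _ in range(100_000)]
mean = sum(samples) / len(samples)
expected = n * sum(1 / i for i in range(1, n + 1))   # n * H_n
var_formula = sum((1 - i/n) / (i/n)**2 for i in range(1, n + 1))
print(mean, expected)
print(var_formula)   # sum of geometric variances (1-p)/p^2
```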

Paper 2, Section II, F

Let $X$ be a random variable taking values in the non-negative integers, and let $G$ be the probability generating function of $X$. Assuming $G$ is everywhere finite, show that

$G^{\prime}(1)=\mu \text { and } G^{\prime \prime}(1)=\sigma^{2}+\mu^{2}-\mu$

where $\mu$ is the mean of $X$ and $\sigma^{2}$ is its variance. [You may interchange differentiation and expectation without justification.]
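The two identities can be verified on a concrete pgf; below, the Poisson($\lambda$) generating function $G(s)=e^{\lambda(s-1)}$, for which $\mu=\sigma^{2}=\lambda$, so $G''(1)$ should equal $\lambda + \lambda^{2} - \lambda = \lambda^{2}$:

```python
import sympy as sp

s, lam = sp.symbols('s lam', positive=True)
G = sp.exp(lam * (s - 1))          # pgf of Poisson(lam)
mu = sp.diff(G, s).subs(s, 1)      # G'(1) = lam
second = sp.diff(G, s, 2).subs(s, 1)
print(mu, sp.simplify(second - (lam + mu**2 - mu)))   # lam, 0
```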

Consider a branching process where individuals produce independent random numbers of offspring with the same distribution as $X$. Let $X_{n}$ be the number of individuals in the $n$-th generation, and let $G_{n}$ be the probability generating function of $X_{n}$. Explain carefully why

$G_{n+1}(t)=G_{n}(G(t))$

Assuming $X_{0}=1$, compute the mean of $X_{n}$. Show that

$\operatorname{Var}\left(X_{n}\right)=\sigma^{2} \frac{\mu^{n-1}\left(\mu^{n}-1\right)}{\mu-1}$

Suppose $\mathbb{P}(X=0)=3 / 7$ and $\mathbb{P}(X=3)=4 / 7$. Compute the probability that the population will eventually become extinct. You may use standard results on branching processes as long as they are clearly stated.
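For the last part, the standard result is that the extinction probability is the smallest non-negative root of $G(q)=q$; a symbolic sketch of that computation:

```python
import sympy as sp

q = sp.symbols('q')
G = sp.Rational(3, 7) + sp.Rational(4, 7) * q**3   # pgf of the offspring law
roots = sp.solve(sp.Eq(G, q), q)
extinction = min(r for r in roots if r.is_real and 0 <= r <= 1)
print(roots, extinction)   # smallest root in [0, 1] is 1/2
```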

Paper 2, Section II, F

Let $Z$ be an exponential random variable with parameter $\lambda=1$. Show that

$\mathbb{P}(Z>s+t \mid Z>s)=\mathbb{P}(Z>t)$

for any $s, t \geqslant 0$.

Let $Z_{\text {int }}=\lfloor Z\rfloor$ be the greatest integer less than or equal to $Z$. What is the probability mass function of $Z_{\text {int }}$ ? Show that $\mathbb{E}\left(Z_{\text {int }}\right)=\frac{1}{e-1}$.

Let $Z_{\mathrm{frac}}=Z-Z_{\mathrm{int}}$ be the fractional part of $Z$. What is the density of $Z_{\mathrm{frac}}$ ?

Show that $Z_{\text {int }}$ and $Z_{\text {frac }}$ are independent.
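A simulation sketch of the last three parts (the seed and sample size are arbitrary); $Z_{\text{int}}$ should have mean $1/(e-1) \approx 0.582$, and independence shows up as a near-zero sample correlation:

```python
import numpy as np

rng = np.random.default_rng(1)
z = rng.exponential(size=1_000_000)
z_int = np.floor(z)
z_frac = z - z_int
print(z_int.mean(), 1 / (np.e - 1))         # both close to 0.582
print((z_int == 0).mean(), 1 - np.exp(-1))  # P(Z_int = 0) = 1 - e^{-1}
print(np.corrcoef(z_int, z_frac)[0, 1])     # near 0, consistent with independence
```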