# Part IA, 2016, Paper 2

Paper 2, Section I, A

(a) For each non-negative integer $n$ and positive constant $\lambda$, let

$I_{n}(\lambda)=\int_{0}^{\infty} x^{n} e^{-\lambda x} d x$

By differentiating $I_{n}$ with respect to $\lambda$, find its value in terms of $n$ and $\lambda$.
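For reference, the closed form obtained this way is $I_n(\lambda)=n!/\lambda^{n+1}$. A quick numerical sanity check of that value (illustrative only; the bounds and step count below are arbitrary):

```python
import math

def I(n, lam, upper=60.0, steps=200_000):
    """Midpoint-rule approximation of the integral of x^n * exp(-lam*x) over (0, infinity),
    truncated at `upper` (the tail beyond it is negligible for these parameters)."""
    h = upper / steps
    return sum(((i + 0.5) * h) ** n * math.exp(-lam * (i + 0.5) * h) for i in range(steps)) * h

# Compare with n! / lam^(n+1): for n = 3, lam = 2 this is 6/16 = 0.375.
approx = I(3, 2.0)
print(approx)  # close to 0.375
```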

(b) By making the change of variables $x=u+v, y=u-v$, transform the differential equation

$\frac{\partial^{2} f}{\partial x \partial y}=1$

into a differential equation for $g$, where $g(u, v)=f(x, y)$.

Paper 2, Section I, A

(a) Find the solution of the differential equation

$y^{\prime \prime}-y^{\prime}-6 y=0$

that is bounded as $x \rightarrow \infty$ and satisfies $y=1$ when $x=0$.

(b) Solve the difference equation

$\left(y_{n+1}-2 y_{n}+y_{n-1}\right)-\frac{h}{2}\left(y_{n+1}-y_{n-1}\right)-6 h^{2} y_{n}=0 .$

Show that if $0<h \ll 1$, the solution that is bounded as $n \rightarrow \infty$ and satisfies $y_{0}=1$ is approximately $(1-2 h)^{n}$.

(c) By setting $x=n h$, explain the relation between parts (a) and (b).
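For reference, the auxiliary equation in part (a) is $\lambda^2-\lambda-6=0$ with roots $3$ and $-2$, so the bounded solution is $e^{-2x}$. The relation in part (c) can be illustrated numerically: with $x=nh$ held fixed, $(1-2h)^n\to e^{-2x}$ as $h\to 0$ (a sketch, checked at the arbitrary point $x=1$):

```python
import math

# With x = n*h fixed, the discrete bounded solution (1 - 2h)^n should approach
# the continuum bounded solution e^(-2x) as h -> 0.
x = 1.0
gaps = []
for h in (0.1, 0.01, 0.001):
    n = round(x / h)
    gaps.append(abs((1 - 2 * h) ** n - math.exp(-2 * x)))
print(gaps)  # the gap shrinks as h decreases
```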

Paper 2, Section II, 6A

(a) The function $y(x)$ satisfies

$y^{\prime \prime}+p(x) y^{\prime}+q(x) y=0$

(i) Define the Wronskian $W(x)$ of two linearly independent solutions $y_{1}(x)$ and $y_{2}(x)$. Derive a linear first-order differential equation satisfied by $W(x)$.

(ii) Suppose that $y_{1}(x)$ is known. Use the Wronskian to write down a first-order differential equation for $y_{2}(x)$. Hence express $y_{2}(x)$ in terms of $y_{1}(x)$ and $W(x)$.

(b) Verify that $y_{1}(x)=\cos \left(x^{\gamma}\right)$ is a solution of

$a x^{\alpha} y^{\prime \prime}+b x^{\alpha-1} y^{\prime}+y=0,$

where $a, b, \alpha$ and $\gamma$ are constants, provided that these constants satisfy certain conditions which you should determine.

Use the method that you described in part (a) to find a solution which is linearly independent of $y_{1}(x)$.
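As an illustrative instance (a particular choice, not the general conditions the question asks for): $\gamma=\tfrac12$, $\alpha=1$, $a=4$, $b=2$ is one consistent set of constants, giving $4xy''+2y'+y=0$ with $y_1=\cos(\sqrt{x})$. A finite-difference residual check of this instance:

```python
import math

def y1(x):
    return math.cos(math.sqrt(x))

def residual(x, h=1e-4):
    """Finite-difference residual of 4x y'' + 2 y' + y at x, for y = cos(sqrt(x))."""
    d1 = (y1(x + h) - y1(x - h)) / (2 * h)
    d2 = (y1(x + h) - 2 * y1(x) + y1(x - h)) / (h * h)
    return 4 * x * d2 + 2 * d1 + y1(x)

print(residual(1.0), residual(2.5))  # both close to 0
```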

Paper 2, Section II, A

(a) Find and sketch the solution of

$y^{\prime \prime}+y=\delta(x-\pi / 2)$

where $\delta$ is the Dirac delta function, subject to $y(0)=1$ and $y^{\prime}(0)=0$.

(b) A bowl of soup, which Sam has just warmed up, cools down at a rate equal to the product of a constant $k$ and the difference between its temperature $T(t)$ and the temperature $T_{0}$ of its surroundings. Initially the soup is at temperature $T(0)=\alpha T_{0}$, where $\alpha>2$.

(i) Write down and solve the differential equation satisfied by $T(t)$.

(ii) At time $t_{1}$, when the temperature reaches half of its initial value, Sam quickly adds some hot water to the soup, so the temperature increases instantaneously by $\beta$, where $\beta>\alpha T_{0} / 2$. Find $t_{1}$ and $T(t)$ for $t>t_{1}$.

(iii) Sketch $T(t)$ for $t>0$.

(iv) Sam wants the soup to be at temperature $\alpha T_{0}$ at time $t_{2}$, where $t_{2}>t_{1}$. What value of $\beta$ should Sam choose to achieve this? Give your answer in terms of $\alpha$, $k, t_{2}$ and $T_{0}$.
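Part (i) of the soup problem is Newton's law of cooling, $\dot T=-k(T-T_0)$, whose solution has the form $T(t)=T_0+(T(0)-T_0)e^{-kt}$. A quick check that this form satisfies the equation (the numerical values of $k$, $T_0$, $\alpha$ below are arbitrary, with $\alpha>2$ as in the question):

```python
import math

# Newton's law of cooling: dT/dt = -k (T - T0), with T(0) = alpha * T0.
k, T0, alpha = 0.3, 20.0, 3.0

def T(t):
    # Solution form T(t) = T0 + (alpha - 1) * T0 * exp(-k t).
    return T0 + (alpha - 1) * T0 * math.exp(-k * t)

def residual(t, h=1e-6):
    """Finite-difference check that dT/dt + k (T - T0) = 0."""
    return (T(t + h) - T(t - h)) / (2 * h) + k * (T(t) - T0)

print(T(0.0))         # 60.0 = alpha * T0
print(residual(1.0))  # ~0
```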

Paper 2, Section II, A

(a) By considering eigenvectors, find the general solution of the equations

$\tag{†} \begin{aligned} &\frac{d x}{d t}=2 x+5 y \\ &\frac{d y}{d t}=-x-2 y \end{aligned}$

and show that it can be written in the form

$\left(\begin{array}{l} x \\ y \end{array}\right)=\alpha\left(\begin{array}{c} 5 \cos t \\ -2 \cos t-\sin t \end{array}\right)+\beta\left(\begin{array}{c} 5 \sin t \\ \cos t-2 \sin t \end{array}\right)$

where $\alpha$ and $\beta$ are constants.

(b) For any square matrix $M$, $\exp (M)$ is defined by

$\exp (M)=\sum_{n=0}^{\infty} \frac{M^{n}}{n !}$

Show that if $M$ has constant elements, the vector equation $\frac{d \mathbf{x}}{d t}=M \mathbf{x}$ has a solution $\mathbf{x}=\exp (M t) \mathbf{x}_{0}$, where $\mathbf{x}_{0}$ is a constant vector. Hence solve $(†)$ and show that your solution is consistent with the result of part (a).
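A useful observation for reconciling parts (a) and (b): for the coefficient matrix $M=\left(\begin{smallmatrix}2&5\\-1&-2\end{smallmatrix}\right)$ of $(†)$ one has $M^2=-I$, so the series collapses to $\exp(Mt)=\cos t\,I+\sin t\,M$, matching the trigonometric form above. A pure-Python sketch comparing a truncated series for $\exp(Mt)$ with this closed form:

```python
import math

# Coefficient matrix of the system (dx/dt, dy/dt) = M (x, y).
M = [[2.0, 5.0], [-1.0, -2.0]]

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

def mat_exp(A, terms=30):
    """exp(A) via the truncated power series sum_n A^n / n! (converges fast here)."""
    result = [[1.0, 0.0], [0.0, 1.0]]
    power = [[1.0, 0.0], [0.0, 1.0]]
    for n in range(1, terms):
        power = mat_mul(power, A)
        result = [[result[i][j] + power[i][j] / math.factorial(n) for j in range(2)]
                  for i in range(2)]
    return result

t = 0.7
E = mat_exp([[t * M[i][j] for j in range(2)] for i in range(2)])
# Since M^2 = -I, exp(Mt) = cos(t) I + sin(t) M:
closed = [[math.cos(t) + 2 * math.sin(t), 5 * math.sin(t)],
          [-math.sin(t), math.cos(t) - 2 * math.sin(t)]]
print(E)
print(closed)
```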

Paper 2, Section II, A

The function $y(x)$ satisfies

$y^{\prime \prime}+p(x) y^{\prime}+q(x) y=0$

What does it mean to say that the point $x=0$ is (i) an ordinary point and (ii) a regular singular point of this differential equation? Explain what is meant by the indicial equation at a regular singular point. What can be said about the nature of the solutions in the neighbourhood of a regular singular point in the different cases that arise according to the values of the roots of the indicial equation?

State the nature of the point $x=0$ of the equation

$\tag{*} x y^{\prime \prime}+(x-m+1) y^{\prime}-(m-1) y=0$

Set $y(x)=x^{\sigma} \sum_{n=0}^{\infty} a_{n} x^{n}$, where $a_{0} \neq 0$, and find the roots of the indicial equation.

(a) Show that one solution of $(*)$ with $m \neq 0,-1,-2, \cdots$ is

$y(x)=x^{m}\left(1+\sum_{n=1}^{\infty} \frac{(-1)^{n} x^{n}}{(m+n)(m+n-1) \cdots(m+1)}\right)$

and find a linearly independent solution in the case when $m$ is not an integer.

(b) If $m$ is a positive integer, show that $(*)$ has a polynomial solution.

(c) What is the form of the general solution of $(*)$ in the case $m=0$? [You do not need to find the general solution explicitly.]

Paper 2, Section I, 4F

Define the moment-generating function $m_{Z}$ of a random variable $Z$. Let $X_{1}, \ldots, X_{n}$ be independent and identically distributed random variables with distribution $\mathcal{N}(0,1)$, and let $Z=X_{1}^{2}+\cdots+X_{n}^{2}$. For $\theta<1 / 2$, show that

$m_{Z}(\theta)=(1-2 \theta)^{-n / 2} .$
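The result can be sanity-checked numerically: $Z$ is $\chi^2$ with $n$ degrees of freedom, with density $x^{n/2-1}e^{-x/2}/(2^{n/2}\Gamma(n/2))$ on $(0,\infty)$, and integrating $e^{\theta x}$ against it should reproduce $(1-2\theta)^{-n/2}$. A midpoint-rule sketch (the particular $n$, $\theta$ and truncation below are arbitrary):

```python
import math

def chi2_mgf_numeric(n, theta, upper=200.0, steps=400_000):
    """Midpoint-rule approximation of E[exp(theta * Z)] for Z ~ chi-squared, n d.o.f."""
    c = 1.0 / (2 ** (n / 2) * math.gamma(n / 2))
    h = upper / steps
    total = 0.0
    for i in range(steps):
        x = (i + 0.5) * h
        total += math.exp(theta * x) * c * x ** (n / 2 - 1) * math.exp(-x / 2)
    return total * h

n, theta = 3, 0.2
val = chi2_mgf_numeric(n, theta)
print(val, (1 - 2 * theta) ** (-n / 2))  # both near 2.15
```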

Paper 2, Section I, F

Let $X_{1}, \ldots, X_{n}$ be independent random variables, all with uniform distribution on $[0,1]$. What is the probability of the event $\left\{X_{1}>X_{2}>\cdots>X_{n-1}>X_{n}\right\}$?

Paper 2, Section II, F

A random graph with $n$ nodes $v_{1}, \ldots, v_{n}$ is drawn by placing an edge with probability $p$ between $v_{i}$ and $v_{j}$ for all distinct $i$ and $j$, independently. A triangle is a set of three distinct nodes $v_{i}, v_{j}, v_{k}$ that are all connected: there are edges between $v_{i}$ and $v_{j}$, between $v_{j}$ and $v_{k}$ and between $v_{i}$ and $v_{k}$.

(a) Let $T$ be the number of triangles in this random graph. Compute the maximum value and the expectation of $T$.

(b) State the Markov inequality. Show that if $p=1 / n^{\alpha}$ for some $\alpha>1$, then $\mathbb{P}(T=0) \rightarrow 1$ as $n \rightarrow \infty$.

(c) State the Chebyshev inequality. Show that if $p$ is such that $\operatorname{Var}[T] / \mathbb{E}[T]^{2} \rightarrow 0$ as $n \rightarrow \infty$, then $\mathbb{P}(T=0) \rightarrow 0$ as $n \rightarrow \infty$.
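For part (a), the maximum is $\binom{n}{3}$ and, by linearity of expectation, $\mathbb{E}[T]=\binom{n}{3}p^3$. The expectation can be verified exactly for small $n$ by enumerating every graph (illustrative brute-force check for $n=4$, all $2^6$ graphs):

```python
from itertools import combinations, product

def expected_triangles(n, p):
    """Exact E[T]: enumerate every graph on n nodes, weighted by its probability."""
    edges = list(combinations(range(n), 2))
    triples = list(combinations(range(n), 3))
    total = 0.0
    for present in product([False, True], repeat=len(edges)):
        prob = 1.0
        for on in present:
            prob *= p if on else 1.0 - p
        edge_set = {e for e, on in zip(edges, present) if on}
        t = sum(all(pair in edge_set for pair in combinations(tri, 2)) for tri in triples)
        total += prob * t
    return total

exact = expected_triangles(4, 0.3)
print(exact, 4 * 0.3 ** 3)  # both close to 0.108 = C(4,3) p^3
```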

Paper 2, Section II, F

Let $X$ be a non-negative random variable such that $\mathbb{E}\left[X^{2}\right]>0$ is finite, and let $\theta \in[0,1]$.

(a) Show that

$\mathbb{E}[X \mathbb{I}[\{X>\theta \mathbb{E}[X]\}]] \geqslant(1-\theta) \mathbb{E}[X]$

(b) Let $Y_{1}$ and $Y_{2}$ be random variables such that $\mathbb{E}\left[Y_{1}^{2}\right]$ and $\mathbb{E}\left[Y_{2}^{2}\right]$ are finite. State and prove the Cauchy-Schwarz inequality for these two variables.

(c) Show that

$\mathbb{P}(X>\theta \mathbb{E}[X]) \geqslant(1-\theta)^{2} \frac{\mathbb{E}[X]^{2}}{\mathbb{E}\left[X^{2}\right]}$
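Part (c) is the Paley-Zygmund inequality. A small sanity check on a two-point distribution, $X=c$ with probability $q$ and $X=0$ otherwise (the values of $c$, $q$, $\theta$ below are arbitrary):

```python
# Paley-Zygmund check: for X = c w.p. q (else 0), E[X] = c q and E[X^2] = c^2 q,
# and P(X > theta E[X]) = q whenever theta * q < 1.
c = 2.0
checks = []
for q in (0.1, 0.5, 0.9):
    for theta in (0.0, 0.3, 0.7):
        ex, ex2 = c * q, c * c * q                # E[X], E[X^2]
        prob = q if c > theta * ex else 0.0       # P(X > theta E[X])
        bound = (1 - theta) ** 2 * ex ** 2 / ex2  # (1-theta)^2 E[X]^2 / E[X^2]
        checks.append(prob >= bound - 1e-12)
print(all(checks))  # True
```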

Paper 2, Section II, F

We randomly place $n$ balls in $m$ bins independently and uniformly. For each $i$ with $1 \leqslant i \leqslant m$, let $B_{i}$ be the number of balls in bin $i$.

(a) What is the distribution of $B_{i}$ ? For $i \neq j$, are $B_{i}$ and $B_{j}$ independent?

(b) Let $E$ be the number of empty bins, $C$ the number of bins with two or more balls, and $S$ the number of bins with exactly one ball. What are the expectations of $E$, $C$ and $S$?

(c) Let $m=a n$, for an integer $a \geqslant 2$. What is $\mathbb{P}(E=0)$? What is the limit of $\mathbb{E}[E] / m$ when $n \rightarrow \infty$?

(d) Instead, let $n=d m$, for an integer $d \geqslant 2$. What is $\mathbb{P}(C=0)$? What is the limit of $\mathbb{E}[C] / m$ when $n \rightarrow \infty$?
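For part (b), the expectations are $\mathbb{E}[E]=m(1-1/m)^n$ and $\mathbb{E}[S]=n(1-1/m)^{n-1}$, with $\mathbb{E}[C]=m-\mathbb{E}[E]-\mathbb{E}[S]$. For small $n$ and $m$ these can be verified exactly by enumerating all placements (illustrative check for $n=3$, $m=2$):

```python
from itertools import product

def exact_expectations(n, m):
    """Exact E[#empty bins] and E[#singleton bins] over all m^n equally likely placements."""
    total_empty = total_single = 0
    for placement in product(range(m), repeat=n):
        counts = [placement.count(b) for b in range(m)]
        total_empty += sum(c == 0 for c in counts)
        total_single += sum(c == 1 for c in counts)
    return total_empty / m ** n, total_single / m ** n

n, m = 3, 2
e, s = exact_expectations(n, m)
print(e, m * (1 - 1 / m) ** n)         # E[E] = m (1 - 1/m)^n = 0.25
print(s, n * (1 - 1 / m) ** (n - 1))   # E[S] = n (1 - 1/m)^(n-1) = 0.75
```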

Paper 2, Section II, F

For any positive integer $n$ and positive real number $\theta$, the Gamma distribution $\Gamma(n, \theta)$ has density $f_{\Gamma}$ defined on $(0, \infty)$ by

$f_{\Gamma}(x)=\frac{\theta^{n}}{(n-1) !} x^{n-1} e^{-\theta x} .$

For any positive integers $a$ and $b$, the Beta distribution $B(a, b)$ has density $f_{B}$ defined on $(0,1)$ by

$f_{B}(x)=\frac{(a+b-1) !}{(a-1) !(b-1) !} x^{a-1}(1-x)^{b-1} .$

Let $X$ and $Y$ be independent random variables with respective distributions $\Gamma(n, \theta)$ and $\Gamma(m, \theta)$. Show that the random variables $X /(X+Y)$ and $X+Y$ are independent and give their distributions.
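The independence claim reduces to a density factorisation: with $s=x/(x+y)$ and $t=x+y$, the map $(x,y)\mapsto(s,t)$ has Jacobian factor $t$, and one checks that $f_{\Gamma(n,\theta)}(x)\,f_{\Gamma(m,\theta)}(y)\,t=f_{B(n,m)}(s)\,f_{\Gamma(n+m,\theta)}(t)$ pointwise. A numeric spot check at arbitrary points:

```python
import math

def f_gamma(x, n, theta):
    """Gamma(n, theta) density as defined above (integer n)."""
    return theta ** n / math.factorial(n - 1) * x ** (n - 1) * math.exp(-theta * x)

def f_beta(x, a, b):
    """Beta(a, b) density as defined above (integer a, b)."""
    return (math.factorial(a + b - 1) / (math.factorial(a - 1) * math.factorial(b - 1))
            * x ** (a - 1) * (1 - x) ** (b - 1))

n, m, theta = 3, 2, 1.5
x, y = 0.8, 1.7                     # arbitrary sample point
s, t = x / (x + y), x + y
lhs = f_gamma(x, n, theta) * f_gamma(y, m, theta) * t  # Jacobian of (x,y) -> (s,t) is t
rhs = f_beta(s, n, m) * f_gamma(t, n + m, theta)
print(lhs, rhs)  # equal
```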