Probability
Paper 2, Section I, D
A coin has probability of landing heads. Let be the probability that the number of heads after tosses is even. Give an expression for in terms of . Hence, or otherwise, find .
Paper 2, Section I, F
Let be a continuous random variable taking values in . Let the probability density function of be
where is a constant.
Find the value of and calculate the mean, variance and median of .
[Recall that the median of is the number such that .]
Paper 2, Section II, 10E
(a) Alanya repeatedly rolls a fair six-sided die. What is the probability that the first number she rolls is a 1, given that she rolls a 1 before she rolls a
(b) Let be a simple symmetric random walk on the integers starting at , that is,
where is a sequence of IID random variables with . Let be the time that the walk first hits 0.
(i) Let be a positive integer. For , calculate the probability that the walk hits 0 before it hits .
(ii) Let and let be the event that the walk hits 0 before it hits 3 . Find . Hence find .
(iii) Let and let be the event that the walk hits 0 before it hits 4 . Find .
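Part (b)(i) is the classical gambler's-ruin computation. Since the original symbols were lost in extraction, the sketch below assumes the walk starts at an integer k with 0 < k < n; for a simple symmetric walk the probability of hitting 0 before n is (n − k)/n, which a short Monte Carlo check confirms.

```python
import random

def hit_zero_before_n(k, n, trials=100_000, seed=0):
    """Estimate P(a simple symmetric walk started at k hits 0 before n)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        x = k
        while 0 < x < n:
            x += rng.choice((-1, 1))
        if x == 0:
            hits += 1
    return hits / trials

# Theory (gambler's ruin): the answer is (n - k)/n.
```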
Paper 2, Section II, 12F
State and prove Chebyshev's inequality.
Let be a sequence of independent, identically distributed random variables such that
for some , and let be a continuous function.
(i) Prove that
is a polynomial function of , for any natural number .
(ii) Let . Prove that
where is the set of natural numbers such that .
(iii) Show that
as . [You may use without proof that, for any , there is a such that for all with .]
Paper 2, Section II, 9E
(a) (i) Define the conditional probability of the event given the event . Let be a partition of the sample space such that for all . Show that, if ,
(ii) There are urns, the th of which contains red balls and blue balls. Alice picks an urn (uniformly) at random and removes two balls without replacement. Find the probability that the first ball is blue, and the conditional probability that the second ball is blue, given that the first is blue. [You may assume, if you wish, that .]
(b) (i) What is meant by saying that two events and are independent? Two fair (6-sided) dice are rolled. Let be the event that the sum of the numbers shown is , and let be the event that the first die shows . For what values of and are the two events and independent?
(ii) The casino at Monte Corona features the following game: three coins each show heads with probability and tails otherwise. The first counts 10 points for a head and 2 for a tail; the second counts 4 points for both a head and a tail; and the third counts 3 points for a head and 20 for a tail. You and your opponent each choose a coin. You cannot both choose the same coin. Each of you tosses your coin once and the person with the larger score wins the jackpot. Would you prefer to be the first or the second to choose a coin?
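The Monte Corona game can be analysed exhaustively. The coin's head probability did not survive extraction, so the sketch below (with assumed names; a coin is a tuple of head points and tail points) takes it as a parameter; the point values 10/2, 4/4 and 3/20 are as stated in the question.

```python
from itertools import product

# Each coin is (points for a head, points for a tail); the head probability p
# did not survive extraction, so it is left as a parameter.
A, B, C = (10, 2), (4, 4), (3, 20)

def win_prob(coin_x, coin_y, p):
    """P(coin_x scores strictly more than coin_y) when heads come up w.p. p."""
    total = 0.0
    for (sx, px), (sy, py) in product(
        [(coin_x[0], p), (coin_x[1], 1 - p)],
        [(coin_y[0], p), (coin_y[1], 1 - p)],
    ):
        if sx > sy:
            total += px * py
    return total
```

Comparing win_prob(A, B, p), win_prob(B, C, p) and win_prob(C, A, p) for the relevant p shows whether the coins are non-transitive, which is the heart of the choose-first-or-second question.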
Paper 2, Section II, D
Let be the disc of radius 1 with centre at the origin . Let be a random point uniformly distributed in . Let be the polar coordinates of . Show that and are independent and find their probability density functions and .
Let and be three random points selected independently and uniformly in . Find the expected area of triangle and hence find the probability that lies in the interior of triangle .
Find the probability that and are the vertices of a convex quadrilateral.
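The convex-quadrilateral probability here is Sylvester's four-point problem for the disc, whose known answer is 1 − 35/(12π²) ≈ 0.7045. A Monte Carlo sketch (four points in general position form a convex quadrilateral iff none lies inside the triangle of the other three; all function names are assumptions):

```python
import random

def sample_disc(rng):
    """Uniform point in the unit disc, by rejection from the square."""
    while True:
        x, y = rng.uniform(-1, 1), rng.uniform(-1, 1)
        if x * x + y * y <= 1:
            return (x, y)

def cross(o, a, b):
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def in_triangle(p, a, b, c):
    """True if p lies in triangle abc (signs of the three cross products agree)."""
    s1, s2, s3 = cross(a, b, p), cross(b, c, p), cross(c, a, p)
    return (s1 >= 0 and s2 >= 0 and s3 >= 0) or (s1 <= 0 and s2 <= 0 and s3 <= 0)

def convex_quad_prob(trials=20_000, seed=0):
    rng = random.Random(seed)
    convex = 0
    for _ in range(trials):
        pts = [sample_disc(rng) for _ in range(4)]
        if not any(in_triangle(pts[i], *[pts[j] for j in range(4) if j != i])
                   for i in range(4)):
            convex += 1
    return convex / trials
```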
Paper 1, Section I, F
A robot factory begins with a single generation-0 robot. Each generation- robot independently builds some number of generation- robots before breaking down. The number of generation- robots built by a generation- robot is or 3 with probabilities and respectively. Find the expectation of the total number of generation- robots produced by the factory. What is the probability that the factory continues producing robots forever?
[Standard results about branching processes may be used without proof as long as they are carefully stated.]
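The extinction probability asked for above is the smallest non-negative root of G(s) = s, where G is the offspring probability generating function. The original offspring probabilities were lost in extraction, so the distribution used in this sketch is purely illustrative.

```python
def extinction_prob(pgf, iterations=1000):
    """Smallest non-negative root of pgf(s) = s, via fixed-point iteration from 0."""
    s = 0.0
    for _ in range(iterations):
        s = pgf(s)
    return s

# Illustrative offspring law (the original probabilities were lost):
# 0, 1 or 3 children with probabilities 1/4, 1/4, 1/2, so G(s) = 1/4 + s/4 + s**3/2.
q = extinction_prob(lambda s: 0.25 + 0.25 * s + 0.5 * s ** 3)
# Exact smallest root of that cubic: (sqrt(3) - 1)/2.
```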
Paper 1, Section II, F
(a) Let be a random variable. Write down the probability density function (pdf) of , and verify that it is indeed a pdf. Find the moment generating function (mgf) of and hence, or otherwise, verify that has mean 0 and variance 1.
(b) Let be a sequence of IID random variables. Let and let . Find the distribution of .
(c) Let . Find the mean and variance of . Let and let .
If is a sequence of random variables and is a random variable, what does it mean to say that in distribution? State carefully the continuity theorem and use it to show that in distribution.
[You may not assume the central limit theorem.]
Paper 1, Section II, F
Let be events in some probability space. State and prove the inclusion-exclusion formula for the probability . Show also that
Suppose now that and that whenever we have . Show that there is a constant independent of such that .
Paper 2, Section I, 3F
(a) Prove that as .
(b) State Stirling's approximation for !.
(c) A school party of boys and girls travel on a red bus and a green bus. Each bus can hold children. The children are distributed at random between the buses.
Let be the event that the boys all travel on the red bus and the girls all travel on the green bus. Show that
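Part (b)'s Stirling approximation, n! ≈ √(2πn)(n/e)^n, is easy to sanity-check numerically; the helper name below is an assumption.

```python
import math

def stirling(n):
    """Stirling's approximation to n!."""
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

# The relative error is roughly 1/(12n), so stirling(n)/n! -> 1 as n grows.
```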
Paper 2, Section I, F
Let and be independent exponential random variables each with parameter 1. Write down the joint density function of and .
Let and . Find the joint density function of and .
Are and independent? Briefly justify your answer.
Paper 2, Section II, F
Let be events in some probability space. Let be the number of that occur (so is a random variable). Show that
and
[Hint: Write where .]
A collection of lightbulbs are arranged in a circle. Each bulb is on independently with probability . Let be the number of bulbs such that both that bulb and the next bulb clockwise are on. Find and .
Let be the event that there is at least one pair of adjacent bulbs that are both on.
Use Markov's inequality to show that if then as .
Use Chebyshev's inequality to show that if then as .
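For the lightbulb count, linearity of expectation over the n adjacent pairs gives E[N] = np². A Monte Carlo sketch (the parameter values are illustrative, since the originals were lost):

```python
import random

def mean_adjacent_pairs(n, p, trials=20_000, seed=0):
    """Monte Carlo estimate of E[N] for n bulbs in a circle, each on w.p. p."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        on = [rng.random() < p for _ in range(n)]
        total += sum(on[i] and on[(i + 1) % n] for i in range(n))
    return total / trials

# Linearity of expectation over the n adjacent pairs gives E[N] = n * p**2.
```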
Paper 2, Section II, F
Recall that a random variable in is bivariate normal or Gaussian if is normal for all . Let be bivariate normal.
(a) (i) Show that if is a real matrix then is bivariate normal.
(ii) Let and . Find the moment generating function of and deduce that the distribution of a bivariate normal random variable is uniquely determined by and .
(iii) Let and for . Let be the correlation of and . Write down in terms of some or all of and . If , why must and be independent?
For each , find . Hence show that for some normal random variable in that is independent of and some that should be specified.
(b) A certain species of East Anglian goblin has left arm of mean length with standard deviation , and right arm of mean length with standard deviation . The correlation of left- and right-arm-length of a goblin is . You may assume that the distribution of left- and right-arm-lengths can be modelled by a bivariate normal distribution. What is the probability that a randomly selected goblin has longer right arm than left arm?
[You may give your answer in terms of the distribution function of a random variable . That is, .]
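The goblin question's numerical parameters were lost in extraction, but the required probability follows from part (a): a difference of jointly normal variables is normal. With assumed symbols μ_L, μ_R, σ_L, σ_R and correlation ρ,

```latex
R - L \sim N\!\left(\mu_R - \mu_L,\; \sigma_R^2 + \sigma_L^2 - 2\rho\,\sigma_L\sigma_R\right),
\qquad
\mathbb{P}(R > L) = \Phi\!\left(\frac{\mu_R - \mu_L}{\sqrt{\sigma_R^2 + \sigma_L^2 - 2\rho\,\sigma_L\sigma_R}}\right).
```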
Paper 2, Section II, F
Let and be positive integers with and let be a real number. A random walk on the integers starts at . At each step, the walk moves up 1 with probability and down 1 with probability . Find, with proof, the probability that the walk hits before it hits 0.
Patricia owes a very large sum (!) of money to a member of a violent criminal gang. She must return the money this evening to avoid terrible consequences but she only has (!). She goes to a casino and plays a game with the probability of her winning being . If she bets on the game and wins then her is returned along with a further ; if she loses then her is lost.
The rules of the casino allow Patricia to play the game repeatedly until she runs out of money. She may choose the amount that she bets to be any integer a with , but it must be the same amount each time. What choice of would be best and why?
What choice of would be best, and why, if instead the probability of her winning the game is ?
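The hitting probability in the first paragraph is the classical biased gambler's-ruin formula; this sketch uses assumed symbols (start k, target n, up-step probability p), since the originals were lost.

```python
def hits_target_first(k, n, p):
    """P(a walk from k, stepping +1 w.p. p and -1 w.p. 1-p, reaches n before 0)."""
    if p == 0.5:
        return k / n
    r = (1 - p) / p
    return (r ** k - 1) / (r ** n - 1)

# For p < 1/2 this probability improves when the fortune is gambled in the
# largest possible chunks (fewer effective steps): the intuition behind bold play.
```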
Paper 2, Section II, F
(a) State the axioms that must be satisfied by a probability measure on a probability space .
Let and be events with . Define the conditional probability .
Let be pairwise disjoint events with for all and . Starting from the axioms, show that
and deduce Bayes' theorem.
(b) Two identical urns contain white balls and black balls. Urn I contains 45 white balls and 30 black balls. Urn II contains 12 white balls and 36 black balls. You do not know which urn is which.
(i) Suppose you select an urn and draw one ball at random from it. The ball is white. What is the probability that you selected Urn I?
(ii) Suppose instead you draw one ball at random from each urn. One of the balls is white and one is black. What is the probability that the white ball came from Urn I?
(c) Now suppose there are identical urns containing white balls and black balls, and again you do not know which urn is which. Each urn contains 1 white ball. The th urn contains black balls . You select an urn and draw one ball at random from it. The ball is white. Let be the probability that if you replace this ball and again draw a ball at random from the same urn then the ball drawn on the second occasion is also white. Show that as
Paper 2, Section I, F
(a) State the Cauchy-Schwarz inequality and Markov's inequality. State and prove Jensen's inequality.
(b) For a discrete random variable , show that implies that is constant, i.e. there is such that .
Paper 2, Section I, F
Let and be independent Poisson random variables with parameters and respectively.
(i) Show that is Poisson with parameter .
(ii) Show that the conditional distribution of given is binomial, and find its parameters.
Paper 2, Section II, 10F
(a) Let and be independent random variables taking values , each with probability , and let . Show that and are pairwise independent. Are they independent?
(b) Let and be discrete random variables with mean 0 , variance 1 , covariance . Show that .
(c) Let be discrete random variables. Writing , show that .
Paper 2, Section II, F
For a symmetric simple random walk on starting at 0, let .
(i) For and , show that
(ii) For , show that and that
(iii) Prove that .
Paper 2, Section II, F
(a) Consider a Galton-Watson process . Prove that the extinction probability is the smallest non-negative solution of the equation where . [You should prove any properties of Galton-Watson processes that you use.]
In the case of a Galton-Watson process with
find the mean population size and compute the extinction probability.
(b) For each , let be a random variable with distribution . Show that
in distribution, where is a standard normal random variable.
Deduce that
Paper 2, Section II, F
(a) Let and be independent discrete random variables taking values in sets and respectively, and let be a function.
Let . Show that
Let . Show that
(b) Let be independent Bernoulli random variables. For any function , show that
Let denote the set of all sequences of length . By induction, or otherwise, show that for any function ,
where and .
Paper 2, Section I, F
Let and be real-valued random variables with joint density function
(i) Find the conditional probability density function of given .
(ii) Find the expectation of given .
Paper 2, Section I, F
Let be a non-negative integer-valued random variable such that .
Prove that
[You may use any standard inequality.]
Paper 2, Section II, 10F
(a) For any random variable and and , show that
For a standard normal random variable , compute and deduce that
(b) Let . For independent random variables and with distributions and , respectively, compute the probability density functions of and .
Paper 2, Section II, 12F
(a) Let . For , let be the first time at which a simple symmetric random walk on with initial position at time 0 hits 0 or . Show . [If you use a recursion relation, you do not need to prove that its solution is unique.]
(b) Let be a simple symmetric random walk on starting at 0 at time . For , let be the first time at which has visited distinct vertices. In particular, . Show for . [You may use without proof that, conditional on , the random variables have the distribution of a simple symmetric random walk starting at .]
(c) For , let be the circle graph consisting of vertices and edges between and where is identified with 0 . Let be a simple random walk on starting at time 0 from 0 . Thus and conditional on the random variable is with equal probability (identifying with ).
The cover time of the simple random walk on is the first time at which the random walk has visited all vertices. Show that .
Paper 2, Section II, F
Let . The Curie-Weiss Model of ferromagnetism is the probability distribution defined as follows. For , define random variables with values in such that the probabilities are given by
where is the normalisation constant
(a) Show that for any .
(b) Show that . [You may use for all without proof. ]
(c) Let . Show that takes values in , and that for each the number of possible values of such that is
Find for any .
Paper 2, Section II, F
For a positive integer , and , let
(a) For fixed and , show that is a probability mass function on and that the corresponding probability distribution has mean and variance .
(b) Let . Show that, for any ,
Show that the right-hand side of is a probability mass function on .
(c) Let and let with . For all , find integers and such that
[You may use the Central Limit Theorem.]
Paper 2, Section I,
Define the moment-generating function of a random variable . Let be independent and identically distributed random variables with distribution , and let . For , show that
Paper 2, Section I, F
Let be independent random variables, all with uniform distribution on . What is the probability of the event ?
Paper 2, Section II, F
A random graph with nodes is drawn by placing an edge with probability between and for all distinct and , independently. A triangle is a set of three distinct nodes that are all connected: there are edges between and , between and and between and .
(a) Let be the number of triangles in this random graph. Compute the maximum value and the expectation of .
(b) State the Markov inequality. Show that if , for some , then when
(c) State the Chebyshev inequality. Show that if is such that when , then when
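For part (a), linearity of expectation over the C(n, 3) triples of nodes gives E[T] = C(n, 3)p³. A small simulation sketch (parameter values illustrative; names are assumptions):

```python
import random
from itertools import combinations

def mean_triangles(n, p, trials=2000, seed=0):
    """Monte Carlo estimate of E[number of triangles] in G(n, p)."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        adj = {e: rng.random() < p for e in combinations(range(n), 2)}
        total += sum(adj[(i, j)] and adj[(i, k)] and adj[(j, k)]
                     for i, j, k in combinations(range(n), 3))
    return total / trials

# E[T] = C(n, 3) * p**3; e.g. n = 6, p = 1/2 gives 20/8 = 2.5.
```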
Paper 2, Section II, F
Let be a non-negative random variable such that is finite, and let .
(a) Show that
(b) Let and be random variables such that and are finite. State and prove the Cauchy-Schwarz inequality for these two variables.
(c) Show that
Paper 2, Section II, F
We randomly place balls in bins independently and uniformly. For each with , let be the number of balls in bin .
(a) What is the distribution of ? For , are and independent?
(b) Let be the number of empty bins, the number of bins with two or more balls, and the number of bins with exactly one ball. What are the expectations of and ?
(c) Let , for an integer . What is ? What is the limit of when ?
(d) Instead, let , for an integer . What is ? What is the limit of when ?
Paper 2, Section II, F
For any positive integer and positive real number , the Gamma distribution has density defined on by
For any positive integers and , the Beta distribution has density defined on by
Let and be independent random variables with respective distributions and . Show that the random variables and are independent and give their distributions.
Paper 2, Section I, F
Let be events in the sample space such that and . The event is said to attract if the conditional probability is greater than , otherwise it is said that repels . Show that if attracts , then attracts . Does repel ?
Paper 2, Section I, F
Let be a uniform random variable on , and let .
(a) Find the distribution of the random variable .
(b) Define a new random variable as follows: suppose a fair coin is tossed, and if it lands heads we set whereas if it lands tails we set . Find the probability density function of .
Paper 2, Section II, F
When coin is tossed it comes up heads with probability , whereas coin comes up heads with probability . Suppose one of these coins is randomly chosen and is tossed twice. If both tosses come up heads, what is the probability that coin was tossed? Justify your answer.
In each draw of a lottery, an integer is picked independently at random from the first integers , with replacement. What is the probability that in a sample of successive draws the numbers are drawn in a non-decreasing sequence? Justify your answer.
Paper 2, Section II, F
State and prove Markov's inequality and Chebyshev's inequality, and deduce the weak law of large numbers.
If is a random variable with mean zero and finite variance , prove that for any ,
[Hint: Show first that for every .]
Paper 2, Section II, F
Consider the function
Show that defines a probability density function. If a random variable has probability density function , find the moment generating function of , and find all moments , .
Now define
Show that for every ,
Paper 2, Section II, F
Lionel and Cristiana have and million pounds, respectively, where . They play a series of independent football games in each of which the winner receives one million pounds from the loser (a draw cannot occur). They stop when one player has lost his or her entire fortune. Lionel wins each game with probability and Cristiana wins with probability , where . Find the expected number of games before they stop playing.
Paper 2, Section I, F
Consider independent discrete random variables and assume exists for all .
Show that
If the are also positive, show that
Paper 2, Section I, F
Consider a particle situated at the origin of . At successive times a direction is chosen independently by picking an angle uniformly at random in the interval , and the particle then moves a Euclidean unit length in this direction. Find the expected squared Euclidean distance of the particle from the origin after such movements.
Paper 2, Section II, 9F
State the axioms of probability.
State and prove Boole's inequality.
Suppose you toss a sequence of coins, the -th of which comes up heads with probability , where . Calculate the probability of the event that infinitely many heads occur.
Suppose you repeatedly and independently roll a pair of fair dice and each time record the sum of the dice. What is the probability that an outcome of 5 appears before an outcome of 7? Justify your answer.
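The dice question is the classic "five before seven" computation: conditioning on the first decisive roll gives P(5)/(P(5) + P(7)) = (4/36)/(10/36) = 2/5. A quick simulated check (names are assumptions):

```python
import random

def five_before_seven(trials=100_000, seed=0):
    """Estimate P(a sum of 5 appears before a sum of 7) with two fair dice."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        while True:
            s = rng.randint(1, 6) + rng.randint(1, 6)
            if s == 5:
                wins += 1
                break
            if s == 7:
                break
    return wins / trials

# Theory: P = (4/36) / (4/36 + 6/36) = 2/5.
```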
Paper 2, Section II, F
Give the definition of an exponential random variable with parameter . Show that is memoryless.
Now let be independent exponential random variables, each with parameter . Find the probability density function of the random variable and the probability .
Suppose the random variables are independent and each has probability density function given by
Find the probability density function of . [You may use standard results without proof provided they are clearly stated.]
Paper 2, Section II, F
For any function and random variables , the "tower property" of conditional expectations is
Provide a proof of this property when both are discrete.
Let be a sequence of independent uniform -random variables. For find the expected number of 's needed such that their sum exceeds , that is, find where
[Hint: Write .]
Paper 2, Section II, F
Define what it means for a random variable to have a Poisson distribution, and find its moment generating function.
Suppose are independent Poisson random variables with parameters . Find the distribution of .
If are independent Poisson random variables with parameter , find the distribution of . Hence or otherwise, find the limit of the real sequence
[Standard results may be used without proof provided they are clearly stated.]
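On the standard reading of this exercise, the final limit is e^{−n} Σ_{k=0}^{n} n^k/k! = P(Poisson(n) ≤ n), which tends to 1/2 by the central limit theorem applied to a sum of n independent Poisson(1) variables. A numerically stable check (working in logs to avoid overflow; names are assumptions):

```python
import math

def poisson_cdf_at_mean(n):
    """P(Poisson(n) <= n) = exp(-n) * sum_{k=0}^{n} n**k / k!, computed in logs."""
    log_n = math.log(n)
    return sum(math.exp(k * log_n - math.lgamma(k + 1) - n) for k in range(n + 1))

# By the CLT this tends to 1/2 as n -> infinity.
```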
Paper 2, Section I, F
(i) Let be a random variable. Use Markov's inequality to show that
for all and real .
(ii) Calculate in the case where is a Poisson random variable with parameter . Using the inequality from part (i) with a suitable choice of , prove that
for all .
Paper 2, Section I, F
Let be a random variable with mean and variance . Let
Show that for all . For what value of is there equality?
Let
Supposing that has probability density function , express in terms of . Show that is minimised when is such that .
Paper 2, Section II, F
Let be the sample space of a probabilistic experiment, and suppose that the sets are a partition of into events of positive probability. Show that
for any event of positive probability.
A drawer contains two coins. One is an unbiased coin, which when tossed, is equally likely to turn up heads or tails. The other is a biased coin, which will turn up heads with probability and tails with probability . One coin is selected (uniformly) at random from the drawer. Two experiments are performed:
(a) The selected coin is tossed times. Given that the coin turns up heads times and tails times, what is the probability that the coin is biased?
(b) The selected coin is tossed repeatedly until it turns up heads times. Given that the coin is tossed times in total, what is the probability that the coin is biased?
Paper 2, Section II, F
Let be a geometric random variable with . Derive formulae for and in terms of .
A jar contains balls. Initially, all of the balls are red. Every minute, a ball is drawn at random from the jar, and then replaced with a green ball. Let be the number of minutes until the jar contains only green balls. Show that the expected value of is . What is the variance of ?
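The jar question is a coupon-collector argument: the waiting times between successive "new green" draws are independent geometric variables, giving E[T] = n(1 + 1/2 + ⋯ + 1/n). A simulation sketch with an illustrative jar size (the original n was lost):

```python
import random

def mean_time_all_green(n, trials=20_000, seed=0):
    """Simulate the jar: n red balls; each draw replaces the drawn ball by green."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        red, t = n, 0
        while red > 0:
            t += 1
            if rng.random() < red / n:   # drew one of the remaining red balls
                red -= 1
        total += t
    return total / trials

# Coupon-collector: E[T] = n * (1 + 1/2 + ... + 1/n); for n = 5 this is 137/12.
```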
Paper 2, Section II, F
Let be a random variable taking values in the non-negative integers, and let be the probability generating function of . Assuming is everywhere finite, show that
where is the mean of and is its variance. [You may interchange differentiation and expectation without justification.]
Consider a branching process where individuals produce independent random numbers of offspring with the same distribution as . Let be the number of individuals in the -th generation, and let be the probability generating function of . Explain carefully why
Assuming , compute the mean of . Show that
Suppose and . Compute the probability that the population will eventually become extinct. You may use standard results on branching processes as long as they are clearly stated.
Paper 2, Section II, F
Let be an exponential random variable with parameter . Show that
for any .
Let be the greatest integer less than or equal to . What is the probability mass function of ? Show that .
Let be the fractional part of . What is the density of ?
Show that and are independent.
Paper 2, Section I, F
Define the probability generating function of a random variable taking values in the non-negative integers.
A coin shows heads with probability on each toss. Let be the number of tosses up to and including the first appearance of heads, and let . Find the probability generating function of .
Show that where .
Paper 2, Section I, F
Given two events and with and , define the conditional probability .
Show that
A random number of fair coins are tossed, and the total number of heads is denoted by . If for , find .
Paper 2, Section II, F
Let be independent random variables with distribution functions . Show that have distribution functions
Now let be independent random variables, each having the exponential distribution with parameter 1. Show that has the exponential distribution with parameter 2 , and that is independent of .
Hence or otherwise show that has the same distribution as , and deduce the mean and variance of .
[You may use without proof that has mean 1 and variance 1.]
Paper 2, Section II, F
(i) Let be the size of the generation of a branching process with family-size probability generating function , and let . Show that the probability generating function of satisfies for .
(ii) Suppose the family-size mass function is Find , and show that
Deduce the value of