Heads, Tails and Bumpsdaisy, revisited
At a recent East Dorset Mathsjam, the puzzle of two heads resurfaced: if you repeatedly flip a fair coin, how long (on average) do you have to wait until you get two heads in a row?
Two fine answers are available here.
However, the estimable Barney Maunder-Taylor went down a different track, noting that there was a probability of:
- $\frac{1}{4}$ of taking two tosses;
- $\frac{1}{8}$ of taking three tosses;
- $\frac{2}{16}$ of taking four tosses;
- … and the sequence continued $\frac{3}{32}$, $\frac{5}{64}$, $\frac{8}{128}$, and so on.
It’s the Fibonacci sequence multiplied by powers of $\frac{1}{2}$.
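If you'd like a sanity check on that pattern, here's a rough Monte Carlo sketch in Python (not part of Barney's argument; the function name and trial count are just illustrative choices):

```python
import random
from collections import Counter

def tosses_until_two_heads():
    """Flip a fair coin until two heads appear in a row; return the toss count."""
    run, tosses = 0, 0
    while run < 2:
        tosses += 1
        run = run + 1 if random.random() < 0.5 else 0
    return tosses

trials = 200_000
counts = Counter(tosses_until_two_heads() for _ in range(trials))

# Compare the empirical probabilities with Fibonacci(k-1) / 2^k
fib = [1, 1, 2, 3, 5, 8]
for k, f in enumerate(fib, start=2):
    print(k, round(counts[k] / trials, 4), f / 2**k)
```

The empirical frequencies should land close to $\frac{1}{4}$, $\frac{1}{8}$, $\frac{2}{16}$ and so on.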
My favourite toy
My favourite toy just now is the probability generating function: a way of keeping track of discrete distributions using polynomials.
For example, the generating function corresponding to Barney’s distribution could be written as $b(x) = \frac{1}{4}x^2 + \frac{1}{8}x^3 + \frac{2}{16}x^4 + \frac{3}{32}x^5 + \frac{5}{64}x^6 + \frac{8}{128}x^7 + …$
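Here's a quick sketch, not from the original discussion, of how a computer might keep track of those coefficients using the Fibonacci recurrence (the function name is just an illustrative choice):

```python
from fractions import Fraction

def pgf_coefficients(n_terms):
    """Yield (k, p) pairs where p = P(X = k) = Fib(k-1) / 2^k."""
    a, b = 1, 1                      # consecutive Fibonacci numbers
    for k in range(2, 2 + n_terms):
        yield k, Fraction(a, 2**k)
        a, b = b, a + b

coeffs = dict(pgf_coefficients(50))
print(coeffs[2], coeffs[3], coeffs[4])    # 1/4, 1/8, 1/8 (= 2/16)
print(float(sum(coeffs.values())))        # partial sum of probabilities, very close to 1
```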
One of the things that makes generating functions so useful is that there are frequently different ways to write them. This is an infinite series - can we find a compact way to write it?
I wouldn’t be asking if we couldn’t
Using the intuition that the Fibonacci sequence can be formed by adding terms together, and noting that in this case there's a factor of two involved, we find:
- $\frac{x}{2} b(x) = \frac{1}{8}x^3 + \frac{1}{16}x^4 + \frac{2}{32}x^5 + \frac{3}{64}x^6 + …$
- $\frac{x^2}{4}b(x) = \frac{1}{16}x^4 + \frac{1}{32}x^5 + \frac{2}{64}x^6 + …$
Adding these together gives $\br{\frac{x}{2} + \frac{x^2}{4}}b(x) = \frac{1}{8}x^3 + \frac{2}{16}x^4 + \frac{3}{32}x^5 + \frac{5}{64}x^6 + …$, which differs from $b(x)$ only by $\frac{1}{4}x^2$.
So, we have $b(x) = \br{\frac{x}{2} + \frac{x^2}{4}}b(x) + \frac{1}{4}x^2$, or $b(x)\br{1- \frac{1}{2}x - \frac{1}{4}x^2} = \frac{1}{4}x^2$.
Multiplying by 4 and dividing through by the quadratic factor gives $b(x) = \frac{x^2}{4 - 2x - x^2}$.
(Since $b(1) = 1$, we’re reasonably happy that this is a PGF.)
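If you'd like a machine to double-check, here's a sympy sketch (my own, not part of the original argument) that expands the closed form back into the series and confirms $b(1) = 1$:

```python
import sympy as sp

x = sp.symbols('x')
b = x**2 / (4 - 2*x - x**2)

# Expanding the closed form should give back 1/4 x^2 + 1/8 x^3 + 2/16 x^4 + ...
print(sp.series(b, x, 0, 8))

# ... and b(1) = 1, as any PGF must.
print(b.subs(x, 1))    # -> 1
```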
But what about the expected value?
That’s simple: because $E(X) = \sum_k k \, p(X=k)$, and $p(X=k)$ is the coefficient of $x^k$ in the expansion of $b(x)$, $E(X)$ is $b’(x)$ evaluated at $x=1$!
The obvious way to work out that derivative is with the quotient rule and a lot of algebra. You can do it piratically without the algebra, though:
- $u = x^2$ so $u’ = 2x$
- $v = 4 - 2x - x^2$ so $v’ = -2 - 2x$.
Evaluated at $x=1$, $u = 1$, $u’ = 2$, $v = 1$ and $v’ = -4$.
Using the quotient rule, $b’(1) = \frac{(1)(2) - (1)(-4)}{1^2} = 6$, the answer we found elsewhere.
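And if you'd rather not take my arithmetic on trust, the same sympy setup confirms it in a couple of lines (again, just a sketch):

```python
import sympy as sp

x = sp.symbols('x')
b = x**2 / (4 - 2*x - x**2)

# E(X) = b'(1): differentiate the closed form and evaluate at x = 1.
print(sp.diff(b, x).subs(x, 1))    # -> 6
```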