Math Horizons - Volume 30, Issue 2, November 2022

Penney’s Game: Winning and Waiting

Robert Vallin 2022-10-26 15:59:53

In 1969, Walter Penney submitted the following problem to the Journal of Recreational Mathematics:

Although in a sequence of coin flips, any given consecutive set of, say, three flips is equally likely to be one of the eight possible, i.e., HHH, HHT, HTH, HTT, THH, THT, TTH, or TTT, it is rather peculiar that one sequence of three is not necessarily equally likely to appear first as another set of three. This fact can be illustrated by the following game: you and your opponent each ante a penny. Each selects a pattern of three, and the umpire tosses a coin until one of the two patterns appears, awarding the antes to the player who chose that pattern. Your opponent picks HHH; you pick HTH. The odds, you will find, are in your favor. By how much?

This problem is one of many paradoxical-seeming problems in probability such as the Monty Hall problem and the two-envelope paradox. We tend to think that there are two equally likely possible outcomes from a flip of a fair coin and that’s all you need to know. It doesn’t seem like different choices of three flips would affect the odds, but they do. We leave it to you to figure out why the odds are 3:2 in favor of the player who chooses HTH.
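The claim is easy to check empirically. Below is a quick Monte Carlo sketch (the function names are our own, not from the article) that pits HHH against HTH; it assumes both players pick length-3 patterns, as in Penney's original problem.

```python
import random

def play(seq_a, seq_b, rng):
    """Flip a fair coin until seq_a or seq_b appears; return the winner.
    Assumes both patterns have length 3, so only the last 3 flips matter."""
    recent = ""
    while True:
        recent = (recent + rng.choice("HT"))[-3:]
        if recent.endswith(seq_a):
            return seq_a
        if recent.endswith(seq_b):
            return seq_b

def estimate(seq_a, seq_b, trials=100_000, seed=1969):
    """Fraction of games in which seq_b (Player II) beats seq_a (Player I)."""
    rng = random.Random(seed)
    return sum(play(seq_a, seq_b, rng) == seq_b for _ in range(trials)) / trials

print(estimate("HHH", "HTH"))  # close to 0.6, i.e., odds of roughly 3:2
```

Changing the two sequences lets you test any of the matchups discussed below.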

This counterintuitive result leads to what is today called Penney’s game, a two-player game played via the flipping of a fair coin. Player I picks a sequence of heads (H) or tails (T) of length k (it can be any length, but typically the game is played with length k = 3), making the choice known. Player II then chooses another sequence of length k. An umpire tosses the coin until one of the two sequences appears as a consecutive subsequence of the coin flips. The player whose sequence appears first wins.

The game looks innocent enough, but the paradox is lurking. Notice that Player II always states their preference after knowing Player I’s decision. This aspect of the game is truly important because it turns out that no matter which sequence Player I chooses, Player II can pick a sequence that puts the odds in their favor. Player II cannot guarantee a win, but they can guarantee a theoretical advantage.

We typically expect transitivity of choices in a game: if Choice A is better than Choice B and Choice B is superior to Choice C, then Choice A should be a better option than Choice C. This property does not hold in Penney’s game, so the game is an example of a nontransitive game. Many popular nontransitive games utilize nontransitive dice (see, for example, “What Nontransitive Dice Exist Among Us” by Heilman and Pasciuto from the April 2017 issue of Math Horizons).

Figure 1 depicts the eight possible outcomes of three flips of a coin, and the tail of an arrow indicates a choice with better odds. Notice that the original problem had the opponent choosing HHH and you choosing HTH. However, the diagram shows THH as the choice over HHH. It turns out that the odds in favor of THH over HHH are better than the odds in favor of HTH over HHH. The diagram shows only the best choice for Player II given Player I’s choice.

Figure 1. Best choices for Player II in Penney’s game.

Table 1. Optimal game play and odds in favor of Player II for these choices.

Table 1 lists the best choices for Player II and the corresponding odds for each of the eight possible choices that can be made by Player I. At first glance, there may seem to be no pattern to Player II’s choice, but in fact there is a way for the second player to know what to choose. Can you figure out the pattern?

Here’s the hidden strategy. If Player I chooses the sequence (O1, O2, O3), then Player II should choose the sequence (the opposite of O2, O1, O2). For example, if Player I picks HTH, Player II should select HHT.
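The strategy is simple enough to mechanize. This small helper (hypothetical, for illustration) computes Player II's reply from Player I's choice:

```python
def player_two_choice(seq):
    """Player II's optimal reply in Penney's game: the opposite of
    Player I's middle flip, followed by Player I's first two flips."""
    opposite = {"H": "T", "T": "H"}
    return opposite[seq[1]] + seq[0] + seq[1]

print(player_two_choice("HTH"))  # HHT, as in the example above
print(player_two_choice("HHH"))  # THH, the best reply from figure 1
```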

May the Odds be Ever in Your Favor

With just a bit of algebraic and probabilistic reasoning, we can explain the calculations behind three of these matchups: HHH versus THH, HHT versus THH, and finally HTH versus HHT.

The HHH versus THH odds are the quickest to explain. Player I can only win if the first three flips are HHH. Once a T appears, a sequence of HHH cannot appear because THH would have appeared first, giving Player II the win. Thus, the probability that Player I wins is the probability of flipping three Hs in a row to start the game: HHH is one of eight sequences of three flips, so the probability that it occurs first is 1/8. This results in odds of 7:1 in favor of Player II.

To calculate the odds for HHT versus THH, we look from Player II’s point of view. The only way Player II can lose is if HH appears before the first T. But this happens only if HH appears as the first two flips. Because there are four equally likely pairs of two flips, there is a 1/4 probability that HH appears as the first two flips, giving odds of 3:1 in favor of Player II. Note that from Player I’s point of view, they win if the first three tosses are HHT, or if the first four tosses are HHHT, or the first five are HHHHT, etc. This means that the probability that Player I wins is

1/8 + 1/16 + 1/32 + ⋯,

because the outcomes are disjoint and each flip in an outcome is independent of the rest. These two calculations prove that

1/8 + 1/16 + 1/32 + ⋯ = 1/4,

so Penney’s game proves this instance of the geometric series formula!

The final situation, HTH versus HHT, takes a bit more work than the other two cases. Because both outcomes start with H, a run of Ts at the start of the game has no effect, so we can ignore any leading Ts. Similarly, the first H to appear gives neither player an advantage; what happens after that first H is what matters. We again look from Player I’s perspective. After the first occurrence of H, Player I needs the next flip to be a T; if instead a second H appears, then HH has occurred and Player II is certain to win, because the first T thereafter completes HHT before HTH can possibly appear. The T alone is not enough to guarantee a win, though. After the T, if an H appears, Player I wins; if a second T appears, then we are back to a string of two or more Ts in which neither player has an advantage. To recap: once the first H appears, if T and then H appear next, Player I wins; if two Ts appear, the game resets, as if we had started over.

If X represents the probability that Player I wins, we have determined that X = (1/4) + (1/4)X, that is, (3/4)X = 1/4. Solving this equation shows that the probability that Player I wins is X = 1/3, so the odds in Player II’s favor are 2:1.

There are other methods to compute these odds. The late John Horton Conway presented a method that converted the players’ choices into binary representations and used those to compute odds. Conway’s method generalizes to versions of the game with more than three flips and even allows comparisons in which the two players choose outcomes of different lengths. In addition, Aaron Montgomery and I used martingales for these problems, which enabled us to extend the situation from coin flips to spins of a roulette wheel in which there are three outcomes (red, black, and green), and the probabilities of the outcomes are not the same (the probabilities of black and red are each 9/19, and the probability of green is 1/19). (See “Penney’s Game from Multiple Perspectives,” Proc. of the Rec. Math. Colloq. V [2017].)
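Conway's method is usually stated in terms of "leading numbers." The sketch below implements the standard formulation (function names are our own): with L(x, y) counting overlaps of the tail of x with the head of y, the odds that Player II's sequence b appears before Player I's sequence a are (L(a,a) − L(a,b)) : (L(b,b) − L(b,a)).

```python
from math import gcd

def leading_number(x, y):
    """Conway's leading number L(x, y): add 2**(k-1) for each k such that
    the last k flips of x equal the first k flips of y."""
    return sum(2 ** (k - 1)
               for k in range(1, min(len(x), len(y)) + 1)
               if x[-k:] == y[:k])

def odds_for_b(a, b):
    """Odds (b_wins : a_wins) that sequence b appears before sequence a."""
    p = leading_number(a, a) - leading_number(a, b)
    q = leading_number(b, b) - leading_number(b, a)
    g = gcd(p, q)
    return p // g, q // g

print(odds_for_b("HHH", "THH"))  # (7, 1)
print(odds_for_b("HHH", "HTH"))  # (3, 2), the odds from Penney's original problem
```

Because the formula works for sequences of any (even unequal) lengths, it reflects the generality of Conway's method mentioned above.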

The Wait Times Problem

Penney’s Game has many other hidden questions. Rather than having two players challenge each other, we can instead pick one outcome and ask the following question: How many flips would it take (on average) for the chosen sequence to appear? This is known as the Wait Times problem. Solving this problem motivated Conway to come up with his binary method for determining odds.

Let’s explore the Wait Times problem for the sequence HTH to get a feel for how to solve the problem in general. More specifically, we want to determine how many times, on average, the umpire will have to flip a fair coin until HTH appears. For ease of notation, we will call this outcome X and let E(X) denote the expected number of tosses until HTH appears. Now, we can compute E(X) by considering the first flip:

E(X) = (1/2)E(X | H1) + (1/2)E(X | T1),

where E(X | H1) is the expected time to wait for X to occur given that the first flip is H and E(X | T1) is the expected time to wait for X to occur given that the first flip is T. Because T is not the first outcome in our chosen sequence, the wait time in this situation is one more flip than the typical wait time for HTH, so that E(X | T1) = 1 + E(X).

Figure 2. Modeling back-substitution in the Wait Times problem leads to a Markov chain.

When the first flip is H, as needed for our chosen sequence, we then look at the second flip given that the first flip is H. Half the time we expect a second occurrence of H and half the time the second outcome is a T, that is,

E(X | H1) = (1/2)E(X | H1H2) + (1/2)E(X | H1T2).

Again, because we want only the first flip to be H, the second H returns us to the situation in which our first flip is H, and it took us one extra toss to get there. This implies that E(X | H1H2) = 1 + E(X | H1).

If the second flip is T, like we need, then we move on to the third flip to see that

E(X | H1T2) = (1/2)E(X | H1T2H3) + (1/2)E(X | H1T2T3).

If the first three outcomes are HTT, the process of waiting for HTH starts over again and we have used three tosses more than expected. Thus E(X | H1T2T3) = 3 + E(X). Meanwhile, E(X | H1T2H3) = 3, as this represents how many total tosses it takes for HTH to appear if the first three tosses are HTH.

Using the process of back-substitution, we obtain

E(X | H1T2) = (1/2)(3) + (1/2)(3 + E(X)) = 3 + (1/2)E(X).

Further back-substitution yields

E(X | H1) = (1/2)(1 + E(X | H1)) + (1/2)(3 + (1/2)E(X)),

so that

E(X | H1) = 4 + (1/2)E(X).

Coupling this with E(X | T1) = 1 + E(X), we get E(X) = (1/2)(4 + (1/2)E(X)) + (1/2)(1 + E(X)), and solving gives E(X) = 10. On average, we expect to wait 10 flips for HTH to appear.
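The back-substitution above is really a small linear system, and exact rational arithmetic can do the bookkeeping. In this sketch (the affine-pair notation is ours), each conditional expectation is stored as a pair (a, b) representing a + b·E(X):

```python
from fractions import Fraction as F

def average(p, q):
    """A fair flip chooses between two affine expressions (a, b) ~ a + b*E(X)."""
    return ((p[0] + q[0]) / 2, (p[1] + q[1]) / 2)

win = (F(3), F(0))        # E(X | H1T2H3) = 3
restart = (F(3), F(1))    # E(X | H1T2T3) = 3 + E(X)
ht = average(win, restart)          # E(X | H1T2) = 3 + (1/2)E(X)
h = (1 + ht[0], ht[1])              # from h = (1/2)(1 + h) + (1/2)ht, so h = 1 + ht
# E(X) satisfies e = (1/2)h + (1/2)(1 + e); collect constant and coefficient:
a = h[0] / 2 + F(1, 2)
b = h[1] / 2 + F(1, 2)
e = a / (1 - b)
print(e)  # 10
```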

By similar reasoning, the average wait time for THT is also 10 flips. Both TTT and HHH have the longest average wait time, 14 flips, while the remaining four sequences each have an average wait time of 8 tosses. We encourage the reader to verify these using this back-substitution technique.
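One way to check all eight wait times at once is the standard correlation formula (the fair-coin case of Conway's wait-time result, not derived in this article): the expected wait for a sequence is the sum of 2^k over every k for which the length-k suffix of the sequence equals its length-k prefix. A sketch:

```python
from itertools import product

def wait_time(seq):
    """Expected number of fair-coin flips until seq first appears:
    sum 2**k over each k where seq's length-k suffix equals its prefix."""
    return sum(2 ** k for k in range(1, len(seq) + 1) if seq[-k:] == seq[:k])

for seq in map("".join, product("HT", repeat=3)):
    print(seq, wait_time(seq))  # HHH and TTT give 14; HTH and THT give 10; the rest give 8
```

Self-overlap is exactly what makes HHH and TTT slow: every prefix of HHH is also a suffix, so a near miss wastes the most progress.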

A Deeper Cut

If we model the back-substitution process graphically, we obtain a Markov chain (figure 2). In the diagram, each edge represents a transition after a flip (keeping track of at most the last three flips). The edges labeled H or T indicate an H or T flip and each occurs with probability 1/2. The edge labeled W signifies a “win” and occurs with probability 1. The final node, with the W edge, is known as an absorbing state because we never leave that state (the game is over). Each node has one or two outgoing edges, and the probabilities of those edges sum to 1.

We can embed the information from the graph in figure 2 in the 4 × 4 matrix

[ 1/2  1/2   0    0  ]
[  0   1/2  1/2   0  ]
[ 1/2   0    0   1/2 ]
[  0    0    0    1  ]

Each row and column correspond to a vertex (ordered: start, H, HT, win) and the entry in row i, column j is the probability of moving from the vertex corresponding to i to the vertex corresponding to j in a flip. Note that this matrix has the block form

[ Q  V ]
[ 0  I ]

where I is an identity matrix, 0 is a zero matrix, and Q is the 3 × 3 matrix

[ 1/2  1/2   0  ]
[  0   1/2  1/2 ]
[ 1/2   0    0  ]

The matrix V is an irrelevant matrix for our purposes (but see “Horseshoe” by Dave Richeson from this issue to see a related computation that utilizes the analogous matrix).

We can then use any computer algebra system, such as SageMath (sagemath.org), to compute the matrix A = (I3 – Q)−1, where I3 is the 3 × 3 identity matrix, yielding

[ 4  4  2 ]
[ 2  4  2 ]
[ 2  2  2 ]

An amazing fact about the matrix A = (I3 – Q)−1 is that the sum of the entries in the first row gives you the wait time: 4 + 4 + 2 = 10! If you’d like to know more about this, we recommend “Chutes and Ladders for the Impatient” by Cheteyan, Hengeveld, and Jones from the January 2011 issue of The College Mathematics Journal. Try modifying this Markov approach to find the average wait time when choosing either TTT or HHH.
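The same computation takes only a few lines of Python with NumPy instead of SageMath. This is a sketch assuming the transient states are ordered start, H, HT, matching the matrix Q above:

```python
import numpy as np

# Transient part of the transition matrix; states ordered: start, H, HT.
Q = np.array([[0.5, 0.5, 0.0],
              [0.0, 0.5, 0.5],
              [0.5, 0.0, 0.0]])

# A = (I - Q)^(-1); the first-row sum is the expected wait starting from scratch.
A = np.linalg.inv(np.eye(3) - Q)
print(A[0].sum())  # approximately 10, the expected wait for HTH
```

Swapping in the Q matrix for another sequence (say, HHH) answers the exercise above.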

Concluding Thoughts

Here, we have seen a simply stated problem with, at first glance, preposterous results. Our investigation led to interesting probability calculations for related games and even a glimpse of the deeper interactions between linear algebra and probability. All of this occurred naturally by starting with some interesting, but not too deep, mathematical ideas. Mathematics is rife with such results and problems for you to work on. It’s just a matter of going out there and finding them.


Robert Vallin is currently a professor at Lamar University in Beaumont, Texas, and is the founder and past chair of the SIGMAA on Recreational Mathematics. He spends his free time learning close-up magic, hanging with his cats and children, and avoiding gators.
