Historical Perspectives: The Role of Gambling in the Development of Probability

From Dice Problems to a New Mathematical Discipline

This spring, the Institute's History of Science working group launches a public lecture series titled "The Wager and the Theorem: How Gambling Forged Probability." The series, open to all, will trace the fascinating and often overlooked origins of probability theory directly to problems posed by gamblers and games of chance. The inaugural lecture, delivered by Professor Henry Glass, focused on the famous 1654 correspondence between Blaise Pascal and Pierre de Fermat, widely regarded as the founding moment of the field.

Professor Glass began by setting the scene: "In the mid-17th century, mathematics was largely the study of the certain—geometry, algebra, the motions of planets. The uncertain was the domain of fortune, fate, or God. There was no mathematics of randomness." This changed when the Chevalier de Méré, a French nobleman and avid gambler, posed two problems to Pascal: the Problem of Points (how to fairly divide the stakes of an interrupted game of chance) and the Problem of Dice (why he seemed to win when betting on at least one six in four rolls of a die, but lost when betting on at least one double-six in 24 rolls of two dice).

Deconstructing the Pascal-Fermat Letters

The lecture dove deep into the surviving fragments of the correspondence. "Pascal and Fermat approached the Problem of Points differently," Glass explained. "Fermat used what we might call an exhaustive enumeration of possible futures—listing all the ways the game could have ended. Pascal, perhaps more elegantly, used recursion and the concept of expectation. They arrived at the same answer, and in doing so, created the concept of expected value." This was revolutionary. It provided a rational, mathematical framework for making decisions under uncertainty, transforming gambling from pure superstition into a calculable enterprise.
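Pascal's recursive argument can be sketched in a few lines of modern code. This is an illustrative reconstruction, not the notation of the letters themselves; the name `share(a, b)` is ours, denoting the fraction of the stakes owed to the player who needs `a` more wins against an opponent who needs `b` more, when each round is an even chance.

```python
from fractions import Fraction
from functools import lru_cache

@lru_cache(maxsize=None)
def share(a, b):
    """Fair share of the stakes for the player needing `a` more wins,
    against an opponent needing `b` more, when each round is 50/50.
    Pascal's recursion: average the two equally likely continuations."""
    if a == 0:
        return Fraction(1)  # this player has already won the whole pot
    if b == 0:
        return Fraction(0)  # the opponent has already won
    return (share(a - 1, b) + share(a, b - 1)) / 2

# The classic case from the correspondence: a game to three wins,
# interrupted with one player at two wins and the other at one.
print(share(1, 2))  # -> 3/4: the leader takes three quarters of the pot
```

Exact rational arithmetic (`Fraction`) mirrors the spirit of the 17th-century calculation; the recursion terminates because each call reduces `a + b` by one.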

The second problem, concerning the dice, led to the first correct calculations of probabilities for compound events. It exposed the fallacy of de Méré's 'linear' reasoning (if 4 rolls suffice for one die, then, since a double-six is one-sixth as likely as a six, 6 × 4 = 24 rolls should suffice for two) and necessitated a proper combinatorial analysis. "De Méré's intuition was wrong, but his question was profound," Glass observed. "It forced a leap from counting favorable cases to calculating ratios of possibilities—the very definition of probability."
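De Méré's two wagers can be checked directly with the complement rule, which is the combinatorial analysis the lecture describes. The variable names in this sketch are ours:

```python
# P(at least one success in n independent trials) = 1 - (1 - p)^n
p_one_six = 1 - (5 / 6) ** 4        # at least one six in 4 rolls of one die
p_double_six = 1 - (35 / 36) ** 24  # at least one double-six in 24 rolls of two

print(f"{p_one_six:.4f}")     # about 0.5177: slightly better than even
print(f"{p_double_six:.4f}")  # about 0.4914: slightly worse than even
```

The near-miss explains why the linear rule felt plausible at the gaming table: the second bet falls just short of even odds, and a 25th roll would tip it back above one half.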

From Gaming Tables to the Modern World

The lecture then traced the rapid development that followed. Christiaan Huygens wrote the first textbook on probability, De Ratiociniis in Ludo Aleae (On Reasoning in Games of Chance), in 1657. Jacob Bernoulli's Ars Conjectandi (The Art of Conjecturing), published posthumously in 1713, introduced the law of large numbers. "Within a few decades," Glass noted, "the toolset invented to solve gambling problems was being applied to annuities, insurance, jurisprudence, and even the assessment of historical evidence."

The series will continue with lectures on: the role of insurance in developing mortality tables; the "St. Petersburg Paradox" and its impact on utility theory; and the statistical analyses of lotteries that funded early American colleges. "We study this history not as a quaint curiosity," Glass concluded, "but to understand the DNA of our discipline. The tension between pure chance and human decision, between mathematical abstraction and messy reality, was baked in from the very first letter between Pascal and Fermat. Here in Las Vegas, surrounded by the modern incarnations of those 17th-century games, we are perhaps in the best place in the world to contemplate that legacy." The series promises to offer a rich, contextual understanding of why probability looks the way it does today.