Saturday, February 1, 2014

Gambler's Ruin Problem Part 1: Odds in Roulette

When I was at MAAthFest (a mathematics conference run by the Mathematical Association of America) last summer, I got to spend the first two days in a short course on the Mathematics of Games and Puzzles put on by Dr. Arthur Benjamin, who is actually my mentor in learning mental math. He recently came out with a video course on the Mathematics of Games and Puzzles through the Great Courses, which you can find on their website.

Last month, we did a four-week series on Bertrand's Postulate. It was a fun and interesting proof, but it did require some hard concentration as well as relatively heavy algebra, number theory, and combinatorics. So, I thought looking at a lighter, more practical problem would be a good idea. It is certainly still deep and thought-provoking, but it won't require as much higher-level mathematics. This problem is called the "Gambler's Ruin Problem," and we spent a lot of time discussing it in that short course. The problem goes as follows:

Suppose that you find yourself in a city with a casino and you have $60 in your pocket. There is a concert in town that you really want to see, but the tickets cost $100. You decide that you will place $1 bets until you either reach $100 or go broke. Which casino game should you play? How likely are you to reach your goal of $100?

This problem can go on forever because there are so many casino games, each one with different betting options and strategies and variations. But it is also a really fun problem, because throughout the process of figuring it out, you will learn a lot about how to improve your skills at the casino. Though probability shows that you likely won't make money, you will learn ways to play for a longer period of time without going broke. And if you use these strategies enough times, you might just get lucky. As a side note, I find it ironic that I as a fourteen-year-old am choosing to write blog posts about casino games.

There are too many casino games to fully analyze in just a few posts, but we do have time to take a look at some of the most popular. This week, we will focus on roulette. Next week will be craps. February 15th will be blackjack. February 22nd will be video poker (I would have probably preferred to do Texas Hold'em, but there is so much strategy, bluffing, reading players, and chance involved that it would be way above my head to analyze odds in that game). Notice that I am choosing to ignore slot machines. This is because they are one of the worst bets one can make in the casino. Just look at the rearrangement of the letters: "slot machines" is an anagram of "cash lost in 'em"!


So now let's start our basic analysis of roulette.

I'm sure many of you know the rules of roulette already, but I will quickly explain them to refresh everyone's memory. Though roulette can be played by many people at once, the game itself is just you against the house. Each spin of the wheel (at the bottom right corner of the picture above) is a game. The wheel is numbered from 1 to 36 with alternating red and black numbers, plus a green zero and a green double-zero. The objective is to correctly guess a characteristic of the slot the ball falls into, whether it be the color, number, size, or parity (odd/even). Bets are placed on the green felt table, as shown more clearly in the picture below.

One of the more popular roulette bets is to bet on a color, say red. This bet pays even money; you risk one dollar to gain one dollar (or whatever amount you choose). When you bet red, there are 18 red numbers out of 38 total numbers, giving an 18/38, or 47.3%, chance of winning.

To figure out how worthwhile this bet is, we do something called finding the expected value, which I discussed in my post about the Saint Petersburg Paradox. Expected value is essentially a weighted average; you average together your winning payoff and your losing payoff with the odds of achieving each one taken into account. For this example, winning a $1 bet would earn you 1 dollar and losing a $1 bet would cost you 1 dollar, or earn you -1 dollars. So, the calculation would be set up as follows:

EV($1 bet on red) = (18/38)(1) + (20/38)(–1) = -2/38 ≈ -0.0526316

In other words, each $1 bet on red will lose you 5.3¢ on average. Let's look at another bet, say betting on the first twelve numbers. This bet pays 2 to 1; you risk one dollar to gain two dollars. But, the odds of landing on one of those twelve are now 12/38 instead of 18/38.

Setting up the expected value equation gives us:

EV($1 bet on 1st 12) = (12/38)(2) + (26/38)(–1) = -2/38 ≈ -0.0526316

Again, the answer comes out to -2/38, or an average 5.3¢ loss. What if we were to place a bet on a single number? Let's say we bet on 26, the only natural number directly between a square (25 = 5²) and a cube (27 = 3³). Casinos pay 35 to 1 on this bet; you risk one dollar to gain thirty-five dollars. The odds of landing on the number 26 are 1/38. The expected value calculation would be:

EV($1 bet on 26) = (1/38)(35) + (37/38)(–1) = -2/38 ≈ -0.0526316

The answer is the same again: an average loss of 5.3¢. In fact, casinos choose the payouts for roulette so that every bet has the same expected value. Because of this, it does not matter to them what the players choose to bet on; they will always be making the same amount of money on average.
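The claim that every bet carries the same expected loss is easy to double-check with a short script. This is just a sketch covering the three bets worked out above (note that Python's Fraction reduces -2/38 to -1/19):

```python
from fractions import Fraction

# Each bet: (number of winning slots out of 38, payout-to-1 on a win)
bets = {
    "red": (18, 1),           # even money
    "first twelve": (12, 2),  # pays 2 to 1
    "single number": (1, 35), # pays 35 to 1
}

evs = {}
for name, (wins, payout) in bets.items():
    p_win = Fraction(wins, 38)  # chance the ball lands in a winning slot
    # Weighted average of the two outcomes: win the payout, or lose the $1
    evs[name] = p_win * payout + (1 - p_win) * (-1)
    print(f"EV of a $1 bet on {name}: {evs[name]} ≈ {float(evs[name]):.4f}")
```

All three lines print the same value, -1/19 ≈ -0.0526, matching the 5.3¢ average loss computed above.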

Let's return to the Gambler's Ruin Problem and see how likely these odds are to get us to $100. We mentioned before that the odds of winning with a bet on red are 47.3%. Though the win probabilities differ for other bets, every bet has the same expected value, so we will analyze the even-money bet on red, with its 47.3% chance of winning each round. In other words, really close to 50-50, but just a hair below.

With the Gambler's Ruin Problem, a game you win 0% of the time gives a 0% chance of success, and a game you win 100% of the time gives a 100% chance of success. Interestingly enough, in a fair game (50% odds), your chance of success is simply your starting money divided by your goal: since you are starting with $60 and aiming for $100, a fair game gives you a 60% chance of success.
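This fair-game fact (chance of success = starting money ÷ goal) can be sanity-checked with a quick simulation of the $1-bet random walk; the seed and trial count here are arbitrary choices:

```python
import random

def play_until_ruin_or_goal(start, goal, p_win, rng):
    """Place $1 bets until the bankroll hits $0 or reaches the goal."""
    bankroll = start
    while 0 < bankroll < goal:
        bankroll += 1 if rng.random() < p_win else -1
    return bankroll == goal  # True if we reached the goal

rng = random.Random(2014)  # fixed seed so the run is repeatable
trials = 1000
wins = sum(play_until_ruin_or_goal(60, 100, 0.5, rng) for _ in range(trials))
print(f"Fair-game success rate over {trials} trials: {wins / trials:.3f}")
```

With a fair coin, the estimate should land near 0.600, as the $60-out-of-$100 rule predicts.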

What about a game with 47.3% odds? Would it be above or below 50%? It sounds like it should be in that neck of the woods, considering that a 50-50 game gives a 60% chance. It turns out that there is a formula for determining the probability of turning $60 into $100.

Let p = probability of winning each bet (in decimal form)
Let q = 1 − p = probability of losing each bet

Then the probability of turning $60 into $100 with $1 bets is:

P(success) = (1 − (q/p)^60) / (1 − (q/p)^100)

Plugging 0.473 in for p and 0.527 in for q predicts the probability of success to be about 1.3% (using the exact values 18/38 and 20/38 gives about 1.5%). This was really surprising to me at first. Just that 2.7% difference between fair and unfavorable costs you so much when it comes to succeeding in the Gambler's Ruin Problem. But probability is full of surprises, as you can see in my past probability posts as well as the posts in the rest of the month.
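That formula, P(success) = (1 − (q/p)^60) / (1 − (q/p)^100) when the game is unfair (and simply 60/100 when it is fair), is easy to put into code; here is a sketch:

```python
def success_probability(start, goal, p):
    """Chance of turning `start` into `goal` with $1 bets won with probability p."""
    q = 1 - p  # probability of losing each bet
    if p == q:
        return start / goal  # fair game: just the ratio of bankrolls
    r = q / p
    return (1 - r**start) / (1 - r**goal)

print(f"Fair game (p = 1/2):   {success_probability(60, 100, 0.5):.3f}")    # 0.600
print(f"Roulette (p = 18/38):  {success_probability(60, 100, 18/38):.3f}")  # 0.015
```

That tiny tilt in the odds drops the chance of success from 60% to under 2%.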
