# 14 Gambling Probability Examples

Published on December 09, 2016

By Randy Ray

I’m writing a post with 14 gambling probability examples because I think that examples are one of the easiest ways to teach something. Probability is a branch of mathematics, and a lot of people have trouble with math. But calculating the odds that something will or won’t happen is a lot easier than you think.

Here are the 14 examples and how they relate to probability.

The first concept to understand is that probability is something that applies to random events. It’s a way of measuring mathematically how likely it is that you’ll see a certain event happen or not happen.

And the first thing to know is that a probability is always a number between 0 and 1. I’ll go into that a little bit more in the next bullet point, but for now, you need to understand that an event that will never occur has a probability of 0. An event that will always occur has a probability of 1.

Anything that might or might not happen can be measured with a number between 0 and 1. And that number is easily calculated.

You take the number of ways an event can happen and divide it by the total number of events possible.

Here’s an easy example:

You flip a coin. There are 2 possible outcomes, both of which are equally likely. You want to know the probability of the coin landing on heads. There’s only one way for it to land on heads, so the probability is ½. That can also be expressed as 0.5, 50%, or 1 to 1.
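If you want to check that ratio yourself, here's a tiny Python sketch (the helper function name is mine, just for illustration):

```python
from fractions import Fraction

def probability(ways_to_win, total_outcomes):
    """Probability = ways an event can happen / total equally likely outcomes."""
    return Fraction(ways_to_win, total_outcomes)

# One way to land heads, out of 2 equally likely outcomes:
p_heads = probability(1, 2)
print(p_heads)         # 1/2
print(float(p_heads))  # 0.5
```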

More about that in the next bullet point.

All probabilities are, by their nature, fractions. But there are multiple ways of expressing a fraction. I gave examples in #1 related to the toss of a coin.

Let’s look at a different example. Let’s say that you have a 2-headed coin.

What is the probability that the coin will land on heads?

And what is the probability that it will land on tails?

Since there are 2 heads, you have a 100% chance of getting a result of heads, and a 0% chance of getting a result of tails.

You can also express that as 1/1 (which is just 1). Or you could express it as 1.0. But most people are comfortable expressing it as a percentage.

In *Dungeons & Dragons*, you use different shaped dice with more or fewer sides than just six. Let’s use a 4-sided die for this example.

What are the odds of rolling a 4 on a 4-sided die that’s numbered 1-4?

There’s only 1 way to get that result, but there are 4 possible results.

Expressed as a fraction, that’s ¼. Expressed as a decimal, it’s 0.25. Expressed as a percentage, it’s 25%. And expressed as odds, it’s 3 to 1. (Odds are expressed by the number of ways it can’t happen versus the number of ways it can.)
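Those four ways of writing the same number are easy to convert between. Here's a quick Python sketch (the `odds_against` helper is my own naming, not a standard function):

```python
from fractions import Fraction

def odds_against(p):
    """Odds are the ways it can't happen versus the ways it can."""
    q = (1 - p) / p          # e.g. (3/4) / (1/4) = 3
    return f"{q.numerator} to {q.denominator}"

p = Fraction(1, 4)           # rolling a 4 on a 4-sided die
print(float(p))              # 0.25
print(f"{float(p):.0%}")     # 25%
print(odds_against(p))       # 3 to 1
```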

Let’s go back to the coin toss example. I’m going to use it as an example for how to calculate the probability of multiple events.

Suppose you want to know the probability of getting heads twice in a row. When you’re calculating the probability of one event happening AND another event happening, you multiply the two probabilities together.

In this case, we have a 0.5 probability of getting heads. 0.5 X 0.5 = 0.25, which is the probability of getting heads twice in a row.

If you look at the total number of possibilities, it’s easy to see why this is the case:

- You could get tails twice.
- You could get heads twice.
- You could get tails on the first toss and heads on the second toss.
- You could get heads on the first toss and tails on the second toss.

Those are the 4 options, and only one of them is the desired result.
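You can have Python list those 4 outcomes and count the desired one—a quick sanity check, nothing more:

```python
from itertools import product

# Every equally likely result of two coin tosses:
outcomes = list(product(["H", "T"], repeat=2))
print(outcomes)  # [('H', 'H'), ('H', 'T'), ('T', 'H'), ('T', 'T')]

# Only one of the four outcomes is heads twice:
p = sum(1 for o in outcomes if o == ("H", "H")) / len(outcomes)
print(p)  # 0.25
```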

Let’s look at another probability example using a traditional six-sided die. The die has 6 possible results, 1, 2, 3, 4, 5, or 6.

The probability of any specific one of those numbers coming up is 1/6.

But what if you want to calculate the probability of getting a 1 OR a 2?

When you’re calculating probabilities that use “AND” in the problem, you multiply.

But if you’re using “OR”, you add.

The probability of getting a 1 is 1/6. The probability of getting a 2 is also 1/6.

1/6 + 1/6 = 2/6, which can be reduced to 1/3.

You can also express that as 33.33% or 0.33 or as 2 to 1 odds.
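The addition rule is just as easy to verify with exact fractions (again, an illustrative sketch):

```python
from fractions import Fraction

p_one = Fraction(1, 6)   # probability of rolling a 1
p_two = Fraction(1, 6)   # probability of rolling a 2

# The outcomes are mutually exclusive, so "OR" means add:
p_one_or_two = p_one + p_two
print(p_one_or_two)                   # 1/3
print(f"{float(p_one_or_two):.2%}")   # 33.33%
```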

All kinds of probabilities come into play in card games, but they’re all based on a handful of qualities that a deck of cards has.

A standard deck of cards consists of 52 cards. The probability of getting any specific card is 1/52.

The cards are divided into 4 different suits—clubs, diamonds, hearts, and spades. If you want to know the probability of getting a card of a specific suit, it’s ¼.

The cards are also divided into 13 different ranks: 2, 3, 4, 5, 6, 7, 8, 9, 10, jack, queen, king, and ace.

The probability of getting a card of a specific rank is 1/13.

But one of the interesting things about card games is that the dealing of cards changes the probability of getting subsequent cards.

You’re playing blackjack with a single deck, and you’ve been dealt an ace as your first card. What is the probability that the next card will also be an ace?

There are only 3 aces left in the deck. And the deck only has 51 cards left, too.

So the probability is 3/51, or 1/17, or 16 to 1.
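That without-replacement logic is easy to mirror in code—both the ace count and the deck size shrink by one (a minimal sketch):

```python
from fractions import Fraction

cards_in_deck = 52
aces = 4

# You were dealt an ace, so one ace and one card are gone:
p_second_ace = Fraction(aces - 1, cards_in_deck - 1)
print(p_second_ace)  # 1/17 (i.e., 3/51 reduced)
```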

Here’s another example:

You’re playing 5 card stud. You have 3 aces and a deuce. You only have one opponent, and he has 4 cards, none of which is an ace.

There is only one ace left in the deck. And there are 44 cards left in the deck.

So the probability of getting the last ace on your next card is 1/44, or 43 to 1.

An American roulette wheel has 38 pockets in which the ball can land. The probability of the ball landing in a specific pocket is 1/38. This can also be expressed as 37 to 1.

This is a great example, because it demonstrates how the casino gets its edge over the player. This bet pays off at 35 to 1.

Since the odds of winning are 37 to 1 and the payoff is only 35 to 1, over a long enough period of time, the casino will almost surely win a lot of money on this bet.

And every bet on the roulette wheel offers a payoff that isn’t commensurate with the odds of winning.

I’ll discuss the house edge more later, but let’s look at some of the other probabilities offered on a roulette wheel.

The pockets aren’t just numbered. They’re also colored. 2 of them are green, 18 of them are black, and 18 of them are red.

If you want to know the probability of the ball landing in a green pocket, it’s 2/38, or 1/19, or 18 to 1.

If you want to know the probability of the ball landing in a black pocket, it’s 18/38, or 9/19, or 10 to 9 against.

The probability of the ball landing in a red pocket is the same.

You can also place bets on whether the ball lands on an odd or even number. The 0 and the 00 (the green pockets) qualify as neither. So the odds of winning odd or even are the same as winning a bet on black or red.

Those are just some examples. There are a dizzying number of roulette bets you can place.

The house edge is the percentage of each bet that the casino expects to win over the long run. The easiest way to calculate this number is to assume that someone is betting $100 per bet, determine how much he’ll lose on average over the long run, and then divide that average loss by the size of the bet.

I’ll use roulette as an example again, because it’s illustrative.

You place 38 bets on a single number at the roulette table. If you had a mathematically perfect set of spins (which is what you want to assume when you’re calculating the house edge), you’ll win once and lose 37 times.

The winning bet pays off at 35 to 1, so you’ll win $3500 on it. But you’ll lose $3700 on the 37 losing spins. Your net loss is $3700 – $3500, or $200.

If you divide $200 by 38, you get $5.26 per spin, which is your average loss per spin.

That is the house edge for the game—5.26%.
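Here's that "mathematically perfect" session worked out in Python (variable names are mine):

```python
spins = 38          # one bet on a single number per spin
bet = 100           # $100 per bet
payoff = 35         # single numbers pay 35 to 1

winnings = 1 * payoff * bet          # one winning spin: $3,500
losses = (spins - 1) * bet           # 37 losing spins: $3,700
net_loss = losses - winnings         # $200

house_edge = net_loss / (spins * bet)
print(f"{house_edge:.2%}")           # 5.26%
```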

It’s important to remember that the house edge is a long term mathematical expectation. In the short term, anything can (and often will) happen.

But once you get into tens of thousands of results, you’ll start to see the actual results trending toward the expected results.

The casino relies on the law of large numbers for their business model. Players are hoping to have the occasional short-term run of luck. The casino knows that these short-term runs of luck are more than compensated for by the actual expected results on all the other bets being placed constantly throughout the casino.

There’s a big difference between a betting system and a gambling strategy. A betting system is a method of increasing and reducing your bets based on your previous results. These betting systems don’t work because they rely on the gambler’s fallacy.

The gambler’s fallacy is the belief that previous events somehow affect the probability of future events. In the case of truly independent trials, previous results have no effect.

You’re playing roulette, and you’ve noticed that the ball has landed in a red pocket 8 times in a row. What is the probability that it will land in a red pocket on the next spin?

Since there are still 38 pockets, and 18 of them are still red, the probability is the same—18/38.

This is not to be confused with the question of what the probability is that the ball will land on red 9 times in a row. That’s a different question. You’re not placing a bet on whether or not the ball is going to land on red 9 times in a row. You’re placing a bet on the next spin, which is an individual event.

A lot of gamblers have trouble with the concept of independent events.

So they create betting systems which assume that these previous results affect future results.

The classic example is the Martingale System.

The Martingale System is the classic example of a betting system based on the gambler’s fallacy.

The idea is simple enough. You place a single bet on red or black at the roulette table. If you win, you pocket your winnings. If you lose, you double your bet on the next spin. If you win that time, you’ve profited a single unit.

If you lose again, you double your bet again.

This is called a progressive betting system because your bets grow progressively larger.

But in the long run, it doesn’t work.

One of two things will happen if you use this system long enough:

- You’ll run into a losing streak long enough that you won’t be able to afford the next bet in the progression.
- You’ll run into a losing streak long enough that the casino won’t let you bet enough to cover your losses.

You bet $5 on black. You lose.

You then bet $10 on black. You lose again.

You then bet $20 on black. You lose again.

You then bet $40 on black. This time you win.

You’ve lost $35 up to this point, so your net win is $5 (your original betting unit).

What most players seem to lose sight of is how easy it is to run into a streak—it happens more often than they think.

Here’s what the progression looks like:

- $5
- $10
- $20
- $40
- $80
- $160
- $320
- $640
- $1280

Let’s assume you’re playing at a casino with a maximum bet of $500. This means that if you lose 7 times in a row, the casino won’t let you place a large enough bet to continue your system.

How often does this happen?

More often than most gamblers think.

The odds of getting the same color 7 times in a row are easily calculated.

The probability of a red result on any single spin is 18/38, or 47.37%.

47.37% X 47.37% X 47.37% X 47.37% X 47.37% X 47.37% X 47.37% = 0.535%.

That’s about half a percent, or roughly once every 187 times the progression is played.
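A couple of lines of Python confirm the streak math:

```python
p_red = 18 / 38             # probability of red on a single spin
p_streak = p_red ** 7       # 7 independent reds in a row
print(f"{p_streak:.3%}")    # 0.535%
print(round(1 / p_streak))  # once every 187 progressions, roughly
```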

In the short run, a gambler using the Martingale is guaranteed nothing better than a steady trickle of small wins.

In the long run, a gambler using the Martingale will eventually run into a streak of bad luck which will wipe out all his winnings.

Blackjack is one of my favorite examples of probability in action. That’s because, unlike roulette, the deck of cards has a memory. Every time a card is dealt in a game of blackjack, the composition of the deck changes. This means the odds change.

This is why card counters are able to get an edge. They have a means of tracking these changes in the composition of the deck.

You’re playing in a single deck blackjack game, and you’ve noticed that all 4 aces have already been dealt. What is the probability that you’ll get a blackjack?

Since a blackjack is made up of an ace and a card valued at 10, and since there are no aces left in the deck, the probability of getting a blackjack is 0%.

But you can calculate the probability of getting a blackjack from a fresh deck of cards, too.

You’re calculating the probability that you’ll get an ace on the first or the second card. There are 4 aces in the deck, so your probability is 4/52. That becomes 1/13 when you reduce it. Since this is an “or” question, we add those together:

1/13 + 1/13 = 2/13

But now you get to the “and” part of the problem. You need an ace AND a 10. There are 16 cards in the deck worth 10: the king, the queen, the jack, and the 10.

Your odds of getting a 10 for your other card are 16/51.

16/51 X 2/13 = 32/663. That equates to 4.83% of the time, which is about 5%.

5% is the same thing as once every 20 hands, so you have an approximate idea of how often you’ll get a blackjack.
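You can double-check that 32/663 figure by brute force—enumerate every ordered pair of cards and count the blackjacks (suits don't matter here, so the deck below only tracks ranks):

```python
from itertools import permutations

# 4 aces, 16 ten-valued cards, 32 others:
ranks = ["A"] * 4 + ["T"] * 16 + ["x"] * 32

blackjacks = 0
total = 0
for i, j in permutations(range(52), 2):   # every ordered 2-card deal
    total += 1
    if {ranks[i], ranks[j]} == {"A", "T"}:
        blackjacks += 1

print(blackjacks, "of", total)      # 128 of 2652
print(f"{blackjacks / total:.2%}")  # 4.83%
```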

One of the great things about video poker is that even though it looks like a slot machine, it’s NOT a slot machine.

Here’s why that’s great:

Slot machines are the only games in the casino with opaque math behind them. In other words, you have no way of knowing what the probabilities behind a slots game are.

Yes, slots have pay tables. So you know what the payoffs are for various combinations of symbols on the reels.

But you have no way of knowing what the probability of getting a particular symbol on a reel is.

If you DID know this, you could calculate the probability, the house edge, and the payback percentage.

Here’s an example:

You’re playing a game with lemons as one of the primary symbols. The probability of getting a lemon on each reel is 1/10. If you get 3 lemons, you win 900 coins.

The probability of getting 3 lemons is 1/10 X 1/10 X 1/10, or 1/1000. In odds terms, that’s 999 to 1.

(Remember, to calculate probability when the question includes the word “and”, you multiply. Here, you want to know the probability of getting a lemon on reel 1 AND on reel 2 AND on reel 3.)

Let’s assume this is a high roller game and that you’re betting $100 per spin.

You make 1000 spins. On 999 of those spins, you lose $100, for a total loss of $99,900. On 1 of those spins, you win 900 coins, or $90,000. Your net loss is $99,900 – $90,000, or $9900.

Average that out over 1000 spins, and you’ve lost an average of $9.90 per spin.

So the house edge on this game is 9.9%. The payback percentage is what casino people look at when dealing with gambling machines, though. That’s just 100% less the house edge—in this case, 90.1%. It represents the amount of each bet that the casino gives back to the player, rather than the amount it gets to keep. It’s also sometimes referred to as return to player.
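Here's that hypothetical lemon game in code (remember, the 1/10 probability is made up for the example—real slots don't publish their reel weights):

```python
p_lemon = 1 / 10
p_win = p_lemon ** 3           # three lemons, one per reel
print(round(p_win, 6))         # 0.001, i.e., 999 to 1 against

spins = 1000
bet = 100                      # $100 per spin
jackpot = 900 * bet            # 900 coins at $100 each: $90,000

net_loss = (spins - 1) * bet - jackpot    # $99,900 lost, $90,000 won
house_edge = net_loss / (spins * bet)
print(f"house edge: {house_edge:.1%}")      # 9.9%
print(f"payback:    {1 - house_edge:.1%}")  # 90.1%
```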

The casinos and slot machine designers know these odds. They designed the game. But since the games are powered by a random number generator, they’re the only people who know these odds. If the game were mechanical, you’d be able to calculate the odds just by knowing the number of symbols on each reel, because each symbol would have an equal probability of turning up.

But a lemon might be programmed to come up once every 12 spins, or once every 15 spins, or once every 20 spins. There’s just no way to know.

Video poker, on the other hand, duplicates the probabilities you’d find in a deck of cards. Not only do you know what each combination of cards pays off at. You also know the probability of getting each combination.

You can use that information to calculate the house edge and the payback percentage for the game.

We know that in a standard Jacks or Better game with a 9/6 pay table, a pair of jacks or higher pays off at even money. You’ll see that hand about 21.5% of the time. So the expected value (the payback percentage) for that particular hand is 1 X 21.5%.

You can do that same calculation for every possible hand. You multiply the amount you stand to win by the probability of getting the hand. Then you add up all the possibilities to get the overall payback percentage for the game.

And the great thing about video poker is that the payback percentages are almost always higher than for slot machines.

The other great thing about video poker is that you get a certain amount of intellectual stimulation that isn’t there when you’re playing a slot machine. You have to decide how to play each hand, and playing your hands correctly increases your chances of winning.

The expected return of a bet is how much you expect that bet to be worth. If it’s worth more than you’re risking, then it’s a positive expectation bet. Otherwise, it’s a negative expectation bet. Gamblers sometimes use +EV or -EV as a shorthand for this.

Here’s how you calculate what a bet is worth:

You take the probability of losing and multiply it by the amount you’ll lose. Then you take the probability of winning and multiply it by the amount you’ll win. You subtract one from the other, and you have your expected return.

It’s that simple.

Let’s use roulette as an example (again).

You place a $100 bet on a single number. If you win, you get $3500. If you lose, you lose $100.

Sounds like a pretty good deal at first glance, doesn’t it?

But let’s do the math.

There are 38 numbers on the roulette wheel, so the probability of winning your bet is 1/38. The probability of losing your bet is 37/38.

1/38 X $3500 = $92.11

37/38 X -$100 = -$97.37

The difference is $5.26. And since the negative number is greater, that’s the amount you expect to lose. The expected value of that bet is -$5.26.
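That calculation generalizes to any bet. Here's a small helper (my own, not a standard function):

```python
def expected_value(p_win, amount_won, amount_lost):
    """EV = P(win) x amount won - P(lose) x amount lost."""
    return p_win * amount_won - (1 - p_win) * amount_lost

# $100 on a single roulette number: 1/38 to win $3,500, 37/38 to lose $100.
ev = expected_value(1 / 38, 3500, 100)
print(round(ev, 2))  # -5.26
```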

You can use this kind of calculation in life, too. I’ll talk about that more later.

The trick to inventing a casino game is to come up with something that looks like a fair bet but isn’t. If you’re too obvious about putting the odds in favor of the casino, no one will want to play.

Let’s use a bet on a coin toss as an example.

Suppose you run a casino, and you offer to pay 25 cents to anyone who correctly guesses the correct result on a coin toss. Here’s how you get your edge—you make the player put up 50 cents in order to win that quarter.

The casino has an obvious large edge in that situation, even if you don’t know how to do the math. But we do know how to do the math, so let’s look at it:

You have a 50% chance of winning 25 cents.

You also have a 50% chance of losing 50 cents.

50% of 25 cents = 12.5 cents

50% of -50 cents = -25 cents

Your expected value is -12.5 cents. Your expected loss on every bet is 12.5 cents. Since you’re betting 50 cents on every bet, the house has an edge of 25%.

That’s huge.

But let’s see if we can make the game a little more interesting.

Let’s say you have to bet a dollar, minimum.

This time, you toss a coin and so does the casino. If one of you gets heads and the other one gets tails, the party with heads is the winner. The party with tails is the loser.

And let’s say this game offers an even money payout.

Looks like a break even game, doesn’t it?

It is, but what do we do in the event of a tie?

What if you both get heads?

What if you both get tails?

If you want to give the casino an edge, you just add a rule that if you both get heads, the game is a push. Your bet is returned, but you don’t win anything.

But if you both get tails, the casino still wins.

What kind of house edge would this game have?

Let’s look at the probabilities involved:

You have the following potential outcomes, all of which are equally likely:

- You both get heads.
- You both get tails.
- You get heads, and the house gets tails.
- The house gets heads, and you get tails.

So you have a 25% chance of a push, which has a net effect of 0. (You don’t win anything, but you don’t lose anything, either.)

You have a 25% chance of winning a dollar, which is an expected value of 25 cents.

You have a 25% chance of losing a dollar, which is an expected value of -25 cents.

And you have a 25% chance of losing a dollar, which is an expected value of -25 cents.

The house still has a 25% edge, but it isn’t nearly as obvious now, is it?
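You can tally the four equally likely outcomes of this made-up coin game and watch that 25% edge fall out:

```python
from itertools import product

bet = 1.0
ev = 0.0
for player, house in product("HT", repeat=2):
    p = 0.25                    # each joint outcome is equally likely
    if (player, house) == ("H", "T"):
        ev += p * bet           # you win even money
    elif (player, house) in [("T", "H"), ("T", "T")]:
        ev -= p * bet           # house wins (including the double-tails rule)
    # ("H", "H") is a push: contributes nothing

print(ev)  # -0.25, i.e., a 25% house edge on a $1 bet
```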

The house could make the game even more fair by returning half your bet in the case of both of you getting tails. That would reduce the house edge to 12.5%, which is better but not great.

Another option would be to take any tie result and have 2 options:

- You could put up another dollar in order to get another coin toss.
- You could just surrender your bet.

The catch is if you put up the other dollar and win, you only get $1 in winnings. But if the casino wins, you lose both dollars.

And in fact, this is almost exactly how casino war works.

That’s just a simplified version of how to create a new casino game. You could have multiple bets, multiple payouts, and multiple probabilities for each outcome. You just have to look at the expected value of all of them to determine how much of an edge the house has.

Then you have to test the game in front of a live gambling audience to see how they respond. If the gamblers don’t like the game and refuse to play, you don’t have a potential casino game at all.

The house wins nothing if no one likes the game.

This is the most important section of the post, actually. You can use your knowledge of probability to make decisions in real life that are better than the decisions of most people.

You work at a job making $50,000 a year. You decide to take a job making $200,000 a year, even though you estimate that there’s a 25% chance of the company paying you $200,000 a year going bankrupt after your first year of employment.

The expected value of that decision is simple enough: even if you weight the new salary by the 75% chance the company survives, 75% X $200,000 = $150,000, which is still three times your current salary.

But let’s look at a more day-to-day practical application. Suppose you’re considering parking in a handicapped spot.

The probability of getting a ticket for this might be 20%, and the parking ticket might be $200. That’s an expected loss of $40.

But parking in that spot will save you half an hour of walking, and you’re a highly paid consultant earning $250 per hour. That means your time is worth $125 per half hour.

Since your time is worth well over $40, it makes financial sense to take the risk.
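The same EV arithmetic, in a few lines (the 20% ticket probability is just the guess from the example):

```python
p_ticket = 0.20
fine = 200
time_saved_value = 125        # half an hour at $250/hour

expected_cost = p_ticket * fine
print(expected_cost)                     # 40.0
print(time_saved_value > expected_cost)  # True
```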

Of course, that example ignores any moral implications involved. You might have a moral problem with taking up a parking spot intended for a handicapped person.

Here’s another practical example:

You’re obese, and you’re 45 years old. According to the doctor, your life expectancy if you don’t lose weight is 60 years old. But if you have weight loss surgery, your life expectancy becomes 68 years old.

But you also have a 1 in 800 chance of dying during the weight loss surgery.

What’s the correct decision, mathematically?

Let’s assume that the life expectancy numbers are absolute certainties, because that makes the math easier to do.

Without the surgery, you die at 60. With the surgery, you have a 799/800 chance of living to 68 and a 1/800 chance of dying at 45, on the operating table.

Your expected age at death with the surgery is (799/800) X 68 + (1/800) X 45, which works out to 67.97 years.

67.97 – 60 = 7.97 years of expected additional life.
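Treating those life-expectancy figures as certainties, the expected-age arithmetic checks out like this:

```python
p_survive = 799 / 800

# Expected age at death with the surgery:
with_surgery = p_survive * 68 + (1 - p_survive) * 45
without_surgery = 60

print(round(with_surgery, 2))                    # 67.97
print(round(with_surgery - without_surgery, 2))  # 7.97 expected extra years
```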

And chances are, you’ll enjoy life more during that time. You’ll be able to engage in more activities, find more people to date, and require less medical attention.

The net positive easily outweighs the negative, even though a 1 in 800 chance of dying on the table sounds frightening.

You can make the math even more complicated by factoring in the possibility that you could lose the weight without surgery. But we don’t need to get into that much detail; the point is just to show how this kind of probability-based thinking can be applied to real life issues to make better decisions.

You can find an almost unlimited number of gambling probability examples to discuss. But all of them start with the notion that probability is always a number between 0 and 1. The other thing to remember in probability problems is the difference between “or” and “and”. If you want to know the probability of this happening OR that happening, you add the probabilities of each together. If you want to know the probability of this happening AND that happening, you multiply the probabilities by each other.

You can multiply the probability of winning by the amount you win and compare it with the probability of losing and the amount you’ll lose to get the expected value of any bet. In casino games, the edge is always with the house, although the way they present the games is subtle.

It’s not always obvious how the casino gets its mathematical edge over the player. But it’s always there, unless you’re a card counter or an expert video poker player.
