Every day thousands of people travel to resort destinations like Las Vegas with dreams of coming away richer. Closer to home, every day you’ll see people standing in line at the local convenience store to play the lottery. They play the stock market, kick a couple of bucks in the office football pool, play poker online and meet with friends on the weekend for that “friendly” game of gin rummy. Why do people invest in these “chance” opportunities? Because they believe they can beat the odds. They believe in the possibility of winning.
A little basic math can tell you whether or not you’re likely to win. It can show you how often you’re likely to win and, if applied over the long run, can even do a fairly accurate job of predicting how much you’ll win . . . or lose. Those who can determine the probability of a certain event occurring – such as winning the lottery – can make better choices about whether to risk the odds.
How do you determine probability? Let’s say there are 12 socks in your dresser drawer. Five are red and seven are blue. If you were to close your eyes, reach into the drawer, and pull out one sock, what is the probability that it would be a red sock? Five of the 12 socks are red, so your chances of picking a red sock are 5 out of 12. You can set this up as a fraction or convert it to a percentage that expresses the probability. The fraction is 5/12. Your chances of picking a red sock are 5 divided by 12, which is about 42%. Not bad, as odds go.
Imagine you’re choosing between two gaming destinations. One is in Nevada and the other is in Mississippi. You decide to flip a coin. Heads and you’ll go to Reno, tails and you’ll go to Biloxi. When you toss the coin, what is the probability that the head side of the coin will be facing up once the coin hits the floor? This ain’t rocket science. There are two sides to a coin, and one of them is heads, so your odds are one out of two. There’s a 50% chance you’ll go to Reno, and a 50% chance you’ll go to Biloxi. It’s really not that complicated.
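If you’d rather let a computer do the arithmetic, here’s a quick sketch (Python is simply my choice for illustration) that confirms both of those numbers using exact fractions:

```python
from fractions import Fraction

# Probability of pulling a red sock: 5 red socks out of 12 total
red_sock = Fraction(5, 12)
print(red_sock, float(red_sock))   # 5/12 -> about 0.417, roughly 42%

# Probability of the coin landing heads: 1 favorable side out of 2
heads = Fraction(1, 2)
print(heads, float(heads))         # 1/2 -> 0.5, a 50% chance
```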
Let’s start with some basic rules about probability and we’ll see if we can make a bit more sense of all this.
Rule One: The probability of an event happening is always somewhere between 0 and 1. A probability of 0 means it can’t happen; a probability of 1 means it’s certain. But that doesn’t mean there’s always a 50/50 chance of a given result.
Rule Two: The probability of an event occurring plus the probability of that event NOT occurring always equals one. Let’s say there’s a 68% chance of an event happening. Conversely, there’s a 32% chance of it not happening. Again, not rocket science.
Rule Three: For mutually exclusive events, the probability of at least one of them occurring equals the sum of their individual probabilities. Okay, let’s think about the come out roll. You’ve all heard that there are eight ways to win on the come out versus four ways to lose. What does that mean? On the Do side you add the six ways to roll the seven to the two ways to roll the eleven to come up with eight ways to win. On the Don’t side you add the one way each to make the two and the twelve to the two ways to make the three and end up with four. Eight to four odds in favor of a come out win. Or you could reduce that down and call it two to one.
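You can check that eight-to-four count by brute force. Here’s a short Python sketch that simply walks all 36 possible dice combinations and tallies the come out winners and losers:

```python
from itertools import product

# Every one of the 36 equally likely two-dice combinations
totals = [d1 + d2 for d1, d2 in product(range(1, 7), repeat=2)]

ways_to_win = sum(1 for t in totals if t in (7, 11))     # naturals
ways_to_lose = sum(1 for t in totals if t in (2, 3, 12)) # craps numbers

print(ways_to_win, ways_to_lose)  # 8 ways to win, 4 ways to lose
```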
Rule Four: The Multiplication Rule as applied to independent events. While Rule Three gives us the probability of one of several possible events occurring, the multiplication rule is used to find the probability of all of them occurring. Let’s go back to the old standard coin flip example to work through this one. The odds of a coin flip resulting in “heads” are 50/50, or one in two. To calculate the odds of heads appearing five tosses in a row you must multiply ½ x ½ x ½ x ½ x ½. The answer is 1/32, or around .031. Now let’s translate that to a dice toss. I’ve tossed five consecutive twelves a couple of times in the past. What are the odds? Let’s change the appearance of the formula up a bit for this one. It’s (1/36)^5, which is 1/36 raised to the fifth power. If your pocket calculator has enough digits it will yield a probability of around 1 in 60,466,176. Hey – that’s a big number.
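Here’s that same multiplication rule as a quick Python sketch, first for five heads in a row and then for five consecutive twelves:

```python
# Five heads in a row: 1/2 raised to the fifth power
p_five_heads = (1 / 2) ** 5
print(p_five_heads)               # 0.03125, or 1 in 32

# Five consecutive twelves: each toss is an independent 1-in-36 shot
p_five_twelves = (1 / 36) ** 5
print(round(1 / p_five_twelves))  # 60466176, about 1 in 60.5 million
```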
Rule Five: The Multiplication Rule as applied to dependent events. This one is similar to Rule Four – however in this case we’re using events that are dependent on one another. To stick with a gaming theme, let’s just take a deck of cards and attempt to draw three consecutive aces. The probability that you’ll draw an ace on the first pull is 4 in 52. There are 4 aces and 52 cards. But after you draw that first card everything changes. The odds of the second card being an ace may be 4 in 51 – or 3 in 51 – depending on the results of your first draw. If you wanted to calculate the odds of drawing three consecutive aces you’d express the formula like this: 4/52 x 3/51 x 2/50. Do the math and you’ll come up with a probability of around 1 in 5,525.
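And here’s the dependent-event version as a sketch. Notice how both the numerator and the denominator shrink once a card leaves the deck:

```python
from fractions import Fraction

# Three consecutive aces: the deck changes after every draw
p_three_aces = Fraction(4, 52) * Fraction(3, 51) * Fraction(2, 50)
print(p_three_aces)          # 1/5525
print(float(p_three_aces))   # about 0.000181
```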
Of course, if you are like most of us, when you have a probability question about craps you simply noodle around on the Wizard of Odds website and look for the odds charts. Why run through all of this math if someone else has already done it? Why? Because you need to know a little Statistics 101 if you’re going to be able to calculate your EV based on the distribution of numbers you throw. Is that a difficult calculation? No, not really. Here’s the formula. Hopefully the function symbols will print in Bulletin Board format:
EV = ∑(Net Pay_i x P_i)
Okay, don’t let that ∑ thing scare you. That’s simply a mathematical symbol that stands for the “sum of.” It means Expected Value is equal to the sum, over every possible outcome, of the net payoff on that outcome multiplied by the probability that it will occur. Let’s step up to a roulette table and play with the math a bit.
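Before we head for the table, here’s one way to wrap that formula up in a few lines of Python. It’s just a sketch, and the name expected_value is my own label, nothing official. It takes a list of (net payoff, probability) pairs and adds them up exactly the way the formula says:

```python
def expected_value(outcomes):
    """Sum of (net payoff x probability) over every possible outcome."""
    return sum(net_pay * prob for net_pay, prob in outcomes)

# Sanity check: a fair even-money coin flip should come out to exactly zero
print(expected_value([(+1, 1 / 2), (-1, 1 / 2)]))  # 0.0
```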
We’ll assume a winning number on a double zero roulette wheel. A $5 bet on a single number pays 35 to 1, or $175. There are 38 slots on the wheel and only one of them can win. So the odds of a winner are 1/38 while the odds of a loser are 37/38. Now let’s figure out what our EV is.
EV = (+$175)(1/38) + (-$5)(37/38) = (+$4.60) + (-$4.86) = -$0.26.
These numbers mean that you can expect to lose a little over a quarter on every spin of the wheel over the long run. And, of course, the inverse is true. The casino can expect to win about a quarter off every $5 bet when you play the wheel.
Now let’s assume we have a biased wheel and the number you favor shows up 3 times in every 38 spins instead of just once. How does that impact the EV?
EV = (+$175)(3/38) + (-$5)(35/38) = (+$13.82) + (-$4.61) = +$9.21.
And now you can see why so many system players spend endless hours tracking spins at the roulette wheel – a game with one of the higher vigs in the house. A relatively small bias in the results can translate into a huge advantage for the player.
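Both of those roulette numbers are easy to double-check. Here’s the same little helper again (repeated so the snippet runs on its own), assuming the standard 38-slot double zero wheel:

```python
def expected_value(outcomes):
    return sum(net_pay * prob for net_pay, prob in outcomes)

# Fair wheel: a $5 straight-up bet pays $175 and hits 1 time in 38
fair_wheel = expected_value([(+175, 1 / 38), (-5, 37 / 38)])

# Biased wheel: the favored number shows 3 times in 38 instead of once
biased_wheel = expected_value([(+175, 3 / 38), (-5, 35 / 38)])

print(round(fair_wheel, 2), round(biased_wheel, 2))  # about -0.26 and +9.21
```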
Now let’s take that same formula and apply it to a $12 place bet on the six or eight in craps:
EV = (+$14)(5/36) + (-$12)(6/36) = (+$1.94) + (-$2.00) = -$0.06.
In other words, that bet costs you roughly six cents per roll while it’s working, and the casino wins that same six cents from you on every roll.
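Here’s the same arithmetic as a one-liner. Keep in mind this treats the 25 rolls out of 36 that are neither a six nor a seven as pushes, so it’s a per-roll figure:

```python
# $12 place bet on the six: wins $14 on the five ways to roll a six,
# loses $12 on the six ways to roll a seven, pushes on the other 25 rolls
per_roll_ev = 14 * (5 / 36) - 12 * (6 / 36)
print(round(per_roll_ev, 3))  # about -0.056, roughly six cents a roll
```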
Now let’s consider a shooter who can routinely toss a hand like the one I threw on Saturday morning at Binions during GAC. The hand stretched to the mid-30’s and included ten eights. Let’s assume we had exactly 36 rolls on that hand. We’ll also assume the shooter averages five sevens per 36 rolls instead of six. Here’s how the numbers spill out:
EV = (+$14)(10/36) + (-$12)(5/36) = (+$3.89) + (-$1.67) = +$2.22
Under these conditions you would make about $2.22 per roll on this bet – versus losing about six cents per roll in a random game. And that, my friends, is important information to have.
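If you want to see both figures side by side, here’s a quick sketch using the assumed distribution of ten eights and five sevens per 36 rolls:

```python
# Skilled shooter (assumed): ten eights and five sevens per 36 rolls
skilled_ev = 14 * (10 / 36) - 12 * (5 / 36)

# Random shooter for comparison: five eights and six sevens per 36 rolls
random_ev = 14 * (5 / 36) - 12 * (6 / 36)

print(round(skilled_ev, 2), round(random_ev, 2))  # about +2.22 versus -0.06
```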
How should you use this calculation? Here’s an idea. How about looking at your long-run BoneTracker results, plugging in some values and determining just what your EV is on the bets you normally make? Do that and you’ll be well on your way to becoming a true advantage player.