Since I last chimed in with some observations about expected value in simple betting situations, and how our human expectations can differ from it, I have spent a bit more time reading on the subject, to my great amusement.

Today, I'm going to share some more coin-toss tomfoolery I have come across, so that we may better understand a few simple concepts that anyone who likes to enter betting situations must grasp if they are not to lose their shirts, their valuable time, or their lives chasing unrealistic gain.

The game will again be the standard coin toss with a fair coin. This time the payout works as follows: flip heads and you win $2. Flip heads again and you win $4. A third consecutive heads nets you $8, and so on, doubling your winnings with every additional lucky head. But flip a tail and the game ends; you keep what you have won up to that point. What should you be willing to pay to play this game?

If you remember my previous blog entry, which mentioned that many people will pass up a sure thing because they do not fully understand the concept of long-run expected value, you will likely assume that just as many would not know what they should be willing to pay to enter the game of chance we now propose. Since 5:1 payout odds were needed in the earlier examples just to get most people to consider betting at all, we would likely not see many interested for more than 40 or 50 cents were we to survey average Joes and Janes. However, those of us who have learned what to look for can calculate the long-term expected value of this game, which offers 50/50 chances at every turn. It will be:

Ev = Sum over all events [probability of event × payout]

Ev = 0.50 × $2 + 0.25 × $4 + 0.125 × $8 + 0.0625 × $16 + ... or, more simply:

Ev = $1 + $1 + $1 + $1 + ...
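If you want to see those terms with your own eyes, a couple of lines will do it (every value here is a power of two, so the floating-point arithmetic is exact):

```python
# Each term of the Ev series: probability 0.5^k times payout $2^k.
terms = [(0.5 ** k) * (2 ** k) for k in range(1, 11)]
# Every term works out to exactly $1, so the partial sums just keep climbing:
# the series diverges as you extend it.
print(terms)
print(sum(terms))
```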

This is an awesome result, because the expected payout goes to infinity as the number of games goes to infinity. On the basis of Ev alone, the proper conclusion would be that there is no bad price for entry into this game. It's a sure thing if you just keep buying in. Name the price and you're in, right? Not so fast. While it is true that at some point on your journey of re-entering an infinite number of times you are assured of pocketing a serious chunk of all the money in the Universe, what does that mean in the practical sense? All you really need to do is hit one of those really long streaks of heads that virtually never ends, while surviving the many, much shorter, polar opposites. Oh, is that all? You'd likely need quite a big bankroll and a few lifetimes to spare (ok, maybe a really fast supercomputer instead). Even though it is a sure thing that can't fail in theory, in practice you will run out of money and/or time, and that makes it a sure loss if you are chasing too big a gain.
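You don't have to take my word for the gap between theory and practice; a quick simulation makes it vivid. A sketch of the game as described above (the specific seed and round count are arbitrary choices of mine):

```python
import random
import statistics

rng = random.Random(42)  # seeded so the experiment is repeatable

def play_round():
    """One round: each heads doubles the pot ($2, $4, $8, ...);
    the first tails ends the round and you keep what you have."""
    winnings = 0
    while rng.random() < 0.5:              # call this outcome heads
        winnings = 2 if winnings == 0 else winnings * 2
    return winnings

payouts = [play_round() for _ in range(100_000)]
typical = statistics.median(payouts)       # half of all rounds pay $0
average = statistics.mean(payouts)         # dragged upward by rare long streaks
print(typical, average)
```

The median round pays next to nothing, while the sample mean is propped up by a handful of long streaks, which is exactly why "infinite Ev" is cold comfort to anyone with finite money and time.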

This type of conundrum goes by the name of the St. Petersburg Paradox, and it was the basis for what has since been modeled to arrive at a more realistic interpretation of the problem. Obviously, we don't have infinite cash for unlimited buy-ins or an infinity of time on hand, so how do we figure out what we should be willing to pay to play? It turns out it has a lot to do with how much we value what we could possibly win in relation to what we risk or have to play with.

Some pretty smart mathematicians figured out that if you defined something called Eu, which they called expected utility (as opposed to expected value), it would yield something that speaks to the appeal the bet's result might have for you in relation to where you already stand financially. In other words, if you are a millionaire and want to go to Vegas to play this game, you can take your starting wealth into account and use it to come up with a decent cost to pay for the chance at a win that would be deemed useful, or of meaningful scale, to a millionaire (as defined by you) if you were to play this game a lot. The model is useful for approximating the behavior of everyone, not just millionaires.

I won't bother to write out the elegant formula because it might overwhelm you with all its nice Greek letters, but it is easily understood by describing what it does in the most basic sense (you can google it and look at it for a while if you wish). It assigns a logarithmic value to wealth, called a utility function (capturing the basic observation that each additional dollar means less the more you already have), and it weighs the change in utility before and after the game against the probabilities. The long-term yield of this as the number of games grows tends not to infinity, but to a number that has more real meaning. That number is the expected utility payout.
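To see why taking logarithms tames the paradox, here is the simplest possible version of the calculation: the probability-weighted log of each payout. I'm deliberately ignoring starting wealth here (the blog's own numbers account for it; this stripped-down sketch does not), just to show that the series now converges instead of adding $1 forever:

```python
import math

# Probability-weighted log-payout: sum over k of (0.5^k) * ln($2^k).
# Unlike the dollar expectation, this series converges, to 2*ln(2),
# which corresponds to a certainty-equivalent payout of e^(2 ln 2) = $4.
eu = sum(0.5 ** k * math.log(2 ** k) for k in range(1, 60))
print(eu, math.exp(eu))
```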

Eu, when all the numbers are plugged in, yields conclusions like this: a millionaire should be willing to pay no more than $10.94 a shot at this game, someone with $1000 should pay no more than $5.94, someone with $2 should pay up to $2, and someone with $0.80 should look to borrow $0.87 and pay up to $1.67. From a risk-weighed-to-result perspective, each would be equally well served. The millionaire will lose often chasing meaningful gain, but his bankroll will still allow him to hit the long streaks that might deliver it. The guy going in once has one shot (hello, high variance!), but he is just as likely to get a meaningful win out of it. How about letting a millionaire play for $2.50? He'll do no better, assuming he's still aiming for the same "useful" payout. He might get ahead, but he'd likely not be satisfied and would keep playing until he's broke or has reached a meaningful (for him) amount of winnings. He should never enter anyone's game for more than $10.94, though (based on this model); beyond that point it is statistically unlikely to produce any useful gain in the long run. It is in your interest to charge him as much as possible. It's why we often hear that markets are always gauging our ability to pay more. If you ran a game (or anything approximated as one) and it was profitable, and the clientele were still willing to pay more, then it's useful for you to know this.
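If you want to produce cut-offs like these yourself, the recipe is a root-finding exercise: find the largest entry price at which the expected log of your wealth after playing is no worse than the log of your wealth if you sit out. A sketch of that idea follows; be warned that the exact dollar figures depend on conventions (whether the first head pays $1 or $2, how many flips you truncate at, and so on), so this may not reproduce the figures above to the cent:

```python
import math

def max_fair_price(wealth, max_heads=100, tol=1e-6):
    """Largest entry price c a log-utility player should pay for the game
    paying $2^k with probability 0.5^k, found by bisection on the sign of
    E[ln(wealth - c + payout)] - ln(wealth). Assumes wealth is well above $4."""
    def edge(c):
        return sum(0.5 ** k * math.log(wealth - c + 2 ** k)
                   for k in range(1, max_heads)) - math.log(wealth)
    lo_c, hi_c = 0.0, float(wealth)   # edge(0) > 0, edge(wealth) < 0
    while hi_c - lo_c > tol:
        mid = (lo_c + hi_c) / 2
        if edge(mid) > 0:
            lo_c = mid                # still worth playing at this price
        else:
            hi_c = mid                # too expensive
    return lo_c

print(max_fair_price(1_000_000), max_fair_price(1_000))
```

The qualitative behavior is the point: the richer the player, the more it is rational to pay per shot, and the ceiling grows painfully slowly compared to wealth.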

The very interesting thing for me is that this "utility" approach to the practical dilemma of what to pay (or bet) in this paradox yields something exactly akin to a bankroll management protocol in poker. In the same way that a $1000 poker bankroll holder does not push his luck by playing too many $50 MTTs, a coin tosser playing this game is wise to know what is too much to pay for his stash size. If you are wondering how bankroll management figures are arrived at for poker, this is an example of how expected utility can produce the numbers we are often just shown in tables.
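The same bisection trick carries straight over to the poker case. Here is a toy version in which I've simply invented the tournament numbers (an 85% bust rate with an 8x return the rest of the time; real MTT payout structures are far lumpier), just to show the mechanics of how a table entry could be derived:

```python
import math

def max_buyin(bankroll, p_win=0.15, mult=8.0, tol=1e-6):
    """Toy model with invented parameters: a tournament returns mult times
    the buy-in with probability p_win and nothing otherwise. Returns the
    largest buy-in at which expected log-bankroll from playing is still no
    worse than sitting out."""
    def edge(b):
        return (p_win * math.log(bankroll - b + mult * b)
                + (1 - p_win) * math.log(bankroll - b)
                - math.log(bankroll))
    lo, hi = 1.0, bankroll - 1.0   # edge(1) > 0 here; edge near bankroll < 0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if edge(mid) > 0:
            lo = mid
        else:
            hi = mid
    return lo

print(max_buyin(1000))
```

Two things fall out of even this crude model: the ceiling is a small slice of the bankroll despite the bet having positive Ev, and it scales in proportion to the bankroll, which is exactly the shape of the buy-in charts poker players are handed.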

I have ended up with a better foundational grasp of what has always been said about poker and bankrolls: it is bankroll management that yields the opportunity, provided you are playing the right stakes. If you don't respect it, chance and its lovely black swans are likely, though not assured, to rob you blind in the long run. You're about as likely to turn a long-term meaningful profit as the millionaires who risked more than $10.94 a pop playing our game. The long term would yield winnings only to the house... and collateral losses mount along the way (time, outlook on life, hope...).

So, not only do we need to know our own risk aversion, our opponents' risk aversion, and the Ev of what we do (my previous blog's lessons), we also need to be realistic about our financial standing (not act like millionaires when we are paupers) and about what type of gain we should expect from our bets in relation to our bankroll requirements. I can see how very soon I might be able to figure out exactly where I might be going and at what speed. Sharpen your pencils, folks; we've got some calculating to do.

To those who feel they are best served by playing from the seat of their pants and with gut instincts, good luck.

Cleobuddy