I picked up a new book on my trip to Reno: *Extra Stuff: Gambling Ramblings* by Peter Griffin. Griffin is the author of one of my favorite books in my collection of books on gambling topics: *The Theory of Blackjack*. *Extra Stuff* includes all sorts of interesting tidbits of gambling theory.

The book had a particularly interesting and surprising discussion of the Kelly criterion: a method of wagering that maximizes the long-run growth rate of your bankroll when you have a positive expectation in a game. Basically, for an even-money bet that you win with probability p > 0.5, you maximize growth by wagering a fraction of your bankroll equal to 2p − 1.
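For concreteness, here is a tiny sketch of the even-money Kelly rule (the helper names are mine, not Griffin's): the fraction 2p − 1 is exactly the f that maximizes the expected log-growth per wager.

```python
import math

def kelly_fraction(p):
    """Kelly fraction for an even-money bet won with probability p > 0.5."""
    return 2 * p - 1

def log_growth(p, f):
    """Expected log-growth per wager when betting fraction f of the bankroll."""
    return p * math.log(1 + f) + (1 - p) * math.log(1 - f)

p = 0.55
f = kelly_fraction(p)   # bet 10% of the bankroll
# growth is maximized at f = 2p - 1; nearby fractions do strictly worse
print(f, log_growth(p, f))
```

Betting more or less than the Kelly fraction gives a strictly smaller expected log-growth, which is the sense in which Kelly betting is optimal.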

Griffin asked an interesting question: at any given step, what is the probability that your bankroll is at the highest level you have ever seen? When the bets are unit sized, you can derive rather simply (and confirm via simulation) that the probability is 2p − 1 (interestingly, the same fraction used by the Kelly criterion). But if you graph the corresponding curve when you use proportional Kelly-style bets, you get a function which is not only fairly complicated, but in fact *discontinuous*. This seemed very unintuitive to me, so I wrote a simple program to duplicate the result and plotted it with gnuplot. For each probability p, I simulated one million wagers and counted the number of times I reached a new maximum.
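Here is a minimal reconstruction of that experiment (my own sketch, with my own function names, not the original program): count the fraction of wagers that leave the bankroll at a new all-time high, once with unit bets and once with proportional Kelly bets.

```python
import random

def new_max_fraction_unit(p, n, seed=1):
    """Fraction of n unit wagers that set a new all-time bankroll high."""
    rng = random.Random(seed)
    bankroll, peak, hits = 0, 0, 0
    for _ in range(n):
        bankroll += 1 if rng.random() < p else -1
        if bankroll > peak:
            peak, hits = bankroll, hits + 1
    return hits / n

def new_max_fraction_kelly(p, n, seed=1):
    """Same count, but wagering the Kelly fraction f = 2p - 1 of the bankroll."""
    rng = random.Random(seed)
    f = 2 * p - 1
    bankroll, peak, hits = 1.0, 1.0, 0
    for _ in range(n):
        bankroll *= (1 + f) if rng.random() < p else (1 - f)
        if bankroll > peak:
            peak, hits = bankroll, hits + 1
    return hits / n

print(new_max_fraction_unit(0.6, 1_000_000))   # ≈ 2p - 1 = 0.2
```

Sweeping p over a grid with the Kelly version and plotting the results reproduces the complicated, discontinuous curve the post describes.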

Check out the graph:

The discontinuities are real, and the discussion is quite illuminating.

Addendum: The discontinuities occur for the following reason. Imagine that you are at an all-time high, then suffer a loss followed by a win. When you lose, your bankroll is multiplied by 1 − f, and when you win it is multiplied by 1 + f. Taken together, you get (1 − f)(1 + f) = 1 − f², which is always less than one, so you know that no sequence of length two that ends in a win (you must minimally end with a win to reach a peak) can reach a new peak.

How about length three? Well, let’s try a loss followed by two wins. You have (1 − f)(1 + f)², which you want to be one (or higher). Expanding, we get 1 + f − f² − f³ = 1, which means f − f² − f³ = 0, or, dividing by f, 1 − f − f² = 0. Solving with the quadratic formula, we find the product equals one precisely at f = (√5 − 1)/2 ≈ 0.618, the reciprocal of the golden ratio φ. Sure enough, our graph displays a discontinuity there. Just below this value, a loss followed by two wins is enough to generate a new high; just above it, it is not. Since the probability of these particular sequences varies only infinitesimally, we see a sharp discontinuity in the chances of reaching a new high as f varies through this neighborhood.
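That threshold is easy to verify numerically (a quick check, nothing more):

```python
import math

# Break-even fraction for "one loss, then two wins": solve (1 - f)(1 + f)^2 = 1,
# which reduces to 1 - f - f^2 = 0, i.e. f = (sqrt(5) - 1) / 2.
f_star = (math.sqrt(5) - 1) / 2
print(f_star)   # ≈ 0.618...

# Below the threshold the sequence overshoots the old peak; above it, it falls short.
assert (1 - 0.61) * (1 + 0.61) ** 2 > 1
assert (1 - 0.63) * (1 + 0.63) ** 2 < 1
assert abs((1 - f_star) * (1 + f_star) ** 2 - 1) < 1e-12
```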

Other possible sequences (two losses followed by three wins, for example) also generate similar but smaller discontinuities.
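A hedged sketch for the general case (the function name is mine): bisect for the fraction at which j losses followed by k wins, with k > j, exactly recovers the old peak.

```python
def breakeven_fraction(losses, wins, iters=200):
    """Bisect for the f in (0, 1) where `losses` losses followed by `wins`
    wins exactly returns the bankroll to its previous peak, i.e. where
    (1 - f)**losses * (1 + f)**wins == 1.  Requires wins > losses."""
    def g(f):
        return (1 - f) ** losses * (1 + f) ** wins - 1
    lo, hi = 1e-9, 1 - 1e-9   # g > 0 near 0 (since wins > losses), g < 0 near 1
    for _ in range(iters):
        mid = (lo + hi) / 2
        if g(mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

print(breakeven_fraction(1, 2))   # ≈ 0.618, the golden-ratio threshold
print(breakeven_fraction(2, 3))   # ≈ 0.39, a smaller discontinuity
```

Each (losses, wins) pair contributes its own break-even fraction, and hence its own (progressively smaller) jump in the curve.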

Very interesting.

At least to me.

But I’m a geek.

Addendum^{2}: For fun, try reading Kelly’s Original Paper and figure out what it says about gambling.

Looking at the Kelly paper, I noticed that it was published in the Bell System Technical Journal. I don’t remember ever hearing about him when I was at The Labs, but a little googling found some other John L. Kelly, Jr. references from which I can piece together some history. He looks to have been a really interesting person. He died in 1965 and apparently worked at Bell Labs until the end; there’s a posthumous echo-canceller patent assigned to AT&T. (Kelly’s co-inventor was Ben Logan, who in addition to being the world’s leading expert on high-pass functions was, as Tex Logan, a world-class fiddler, having played in Bill Monroe’s Bluegrass Boys in the 1950s.)

He wrote a bunch of signal processing and computer science papers, including one (about a programming language for signal processing called BLODI, for BLOck DIagrams) that I have a copy of in my office. He supervised Elwyn Berlekamp’s 1963 master’s thesis about a program for optimally playing bridge hands double-dummy (i.e., with all 52 cards visible). William Poundstone’s most recent book is about the history of mathematicians going after the stock market; his point of departure is Kelly and Claude Shannon’s work in the 1950s on gambling and signal processing. These are all guys (Berlekamp, Shannon, Logan, Poundstone) I know about for other reasons, and Kelly is connected to all of them. I suspect there’s an interesting mathematician biography to be written here…

If you haven’t already, read Ben Mezrich’s book Bringing Down the House, a fascinating tale of what happened when a group of mathematically sophisticated kids applied techniques like these, with some intriguing modifications and perhaps not-so-surprising results, at real casinos. I was at MIT during this time, and sort of vaguely aware of it, perhaps one degree of Kevin Bacon from the people involved, but thought they were just hyping themselves. The book changed my take on it.

p > 0.5… Where do you get a game like that?

Assuming you could find a game with a p of 0.5, like flipping a coin.

2 * .5 -1 = ZERO

Zero, a good wager.

😀

Editor’s note: Well, while casino games aren’t particularly good at giving you such opportunities, others such as sports betting, horse racing, or (with a lot of complication) the stock market can and often do have positive expectation. As was pointed out to me by Tom Duff, Ed Thorp wrote the original blackjack card counting book, Beat the Dealer, and then went off and made gajillions in the stock market, essentially inventing hedge funds. But even if this didn’t have any practical application, it’s still interesting stuff. 🙂

Many stock trades (especially in hedge funds) will look at the collective time series in a way that is essentially the same as analyzing Brownian motion and estimating the probability of the next move being upward or downward. Taking that probability and then applying Kelly’s (2P − 1)B, where B is your bankroll, then gives maximum gains.

The problem is that since so many large funds use that (combined with the theory of the Gambler’s Ruin), it loses its effectiveness, and you therefore need to add extra layers or tweaks to your analysis. That is where the big money is made these days, if you are doing time-series analysis: tweaks on that basic concept.

Two interesting things to note with that:

1) If you had just a P value of .60 and could sustain it for a few years (I haven’t run the equation in a while, so I don’t recall the exact time period, but I believe it is either 3 or 7 years), then you would have literally all of the money in the world.

So needless to say, it is rare to see values as high as .60 in the stock market. You do in the short term, but they are corrected away again as people act on that information and exploit it.

Normally though you see values in the .51-.56 range.

2) This type of analysis is technically fractal: you can apply the concept to tick data, or every hour, day, month, or year, however you like (although once you look at a period longer than a day, you don’t really have enough data for it to work).

It also applies to many other areas and is generally tied in to any system with human involvement – many argue largely based on the way they move in crowds (look into the El Farol Problem).
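As a rough sanity check on the P = .60 claim in point 1 (a sketch of my own, assuming one even-money, full-Kelly bet per period):

```python
import math

p = 0.60
f = 2 * p - 1                  # full Kelly: bet 20% of the bankroll
g = p * math.log(1 + f) + (1 - p) * math.log(1 - f)   # expected log-growth per bet
doubling = math.log(2) / g     # average number of bets to double the bankroll
print(g, doubling)             # ≈ 0.0201 per bet, ≈ 34 bets per doubling
```

At one bet per trading day that is roughly seven doublings a year, i.e. about a 160× annual multiple, so a few years of sustained p = 0.60 really would be astronomical, even if "all of the money in the world" is hyperbole.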

I am a programmer for a hedge fund administration company and am trying to start my own hedge fund, so this area is of particular interest to me.

My code shows excellent profits if you have $50K, and then as with any large system, will collapse (in terms of gains) as you reach larger bankrolls (around $100-500M). Part of that is due to a Heisenberg sort of idea where the more money you are throwing at it, the more you rstar

American taxes negate a lot of the profit at the lower level ($50K), as do trading fees, which is why it is an attractive method for hedge funds (offshore, so the tax issue is not in play).

Although to be honest, I’d rather in the end just be a pro poker player 🙂

Love the site!

Oops – I trailed off there on the Heisenberg point. I was essentially going to note that the more money you throw at the trades, the more you start directly affecting the price, and therefore altering the outcome your analysis predicts.

With smaller dollar amounts you can move in and out with less effect and therefore retain success.

So happy to digest such an insightful post that does not depend on base posturing to get its point across. Thanks for an entertaining read.