Even bigger primes…

Last year, I wrote a post about a program I wrote to compute a really large prime number. At the time, 2^32582657-1 was “king” of the primes, having over 9.8 million digits. Today, one year later, it turns out that the Great Internet Mersenne Prime Search has discovered two even larger primes. I dusted off my code, and was pleased to find that it still worked, generating the nearly 13 million digits of the new king of primes in roughly two minutes…

Here are the digits of 2^43112609-1:

432597 lines deleted...

Addendum: Okay, it dawns on me that I never explained why this is difficult. Here’s the thing: Mersenne primes are really easy to write in binary: they are just a long string of ones. But writing them out in base 10 like we all know and love is more difficult. To do it, I actually do all my arithmetic in base 1000 (a convenient choice, because we want to print out the number with commas in place). Computing the prime isn’t very hard, really: a^b is 1 if b equals 0, is a * a^(b-1) if b is odd, and is (a^(b/2))^2 if b is even. So you only need about log2(b) multiplies to get your (in this case very big) number.

The real trick comes in doing the multiply: the obvious algorithm takes O(n^2) time to multiply two n-digit numbers. But a faster way is to use the FFT. The trick is realizing that multiplying numbers the grade-school way is basically a convolution, and convolution turns into simple pointwise multiplication under the Fast Fourier Transform. So, here’s the basic idea. Suppose you have a number represented in base 1000. Pad it with an equal number of zeros. Then take the Fast Fourier Transform. Square each resulting bin. Then inverse FFT it. You’ll be left with some bins with values > 1000, but you can normalize those by propagating the carries. And you are done! The FFT takes only O(n log n) time, the normalization is linear, so you are much, much faster in the limit doing multiplies this way.
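The exponentiation recursion above is just exponentiation by squaring. Here’s a minimal Python sketch (not my original program; plain integers stand in for the base-1000 bignums, and the function name is mine):

```python
def pow_by_squaring(a, b):
    """Compute a^b using O(log b) multiplies via the recursion above."""
    if b == 0:
        return 1                              # a^0 = 1
    if b % 2 == 1:
        return a * pow_by_squaring(a, b - 1)  # odd b: a * a^(b-1)
    half = pow_by_squaring(a, b // 2)         # even b: (a^(b/2))^2
    return half * half
```

For 2^p - 1 in base 1000, each of those multiplies operates on long arrays of base-1000 limbs, which is exactly where the fast FFT multiply earns its keep.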

There are a few issues with precision, but overall, it works quite well.
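To make the squaring step concrete, here’s a hedged Python sketch using numpy’s FFT (the function name and limb ordering are my choices, not the original program’s). Rounding each bin back to an integer after the inverse FFT is exactly where the precision issues mentioned above live:

```python
import numpy as np

def square_base1000(digits):
    """Square a number given as little-endian base-1000 limbs,
    using the FFT convolution scheme described above."""
    n = len(digits)
    size = 2 * n                         # pad with zeros so the cyclic
                                         # convolution equals the linear one
    f = np.fft.rfft(digits, size)        # forward FFT
    prod = np.fft.irfft(f * f, size)     # square each bin, inverse FFT
    limbs = [int(round(x)) for x in prod]  # bins may now exceed 1000
    carry, out = 0, []                   # normalize: propagate the carries
    for v in limbs:
        v += carry
        out.append(v % 1000)
        carry = v // 1000
    while carry:
        out.append(carry % 1000)
        carry //= 1000
    while len(out) > 1 and out[-1] == 0:  # trim high-order zero limbs
        out.pop()
    return out
```

For example, 123456 is [456, 123] in little-endian base 1000, and squaring it this way yields [936, 383, 241, 15], i.e. 15,241,383,936.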