Addendum: I got the number from Prime Numbers by David Wells. I wrote a little Python script that did the Rabin-Miller primality test, and first verified that the number specified above was (at least very likely to be) prime, and then generated all similar numbers up to 500, and found they were all composite.
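The script itself isn't reproduced here, but a minimal Rabin-Miller sketch in Python looks something like this (function name and round count are my own choices, not necessarily what I used):

```python
import random

def is_probable_prime(n, rounds=20):
    """Rabin-Miller probabilistic primality test."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    # write n - 1 as 2**s * d with d odd
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a witnesses that n is composite
    return True
```

Twenty rounds gives an error probability below 4**-20, which is plenty for screening candidates like this.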
Addendum2: In 1964, Gillies discovered the 21st, 22nd and 23rd Mersenne primes using the Illiac II supercomputer. It took the machine 2 hours and 15 minutes to verify that 2**11213-1 is prime. I implemented the Lucas-Lehmer test in Python, and verified the result in 40 seconds on my AMD64. I love Moore’s Law.
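The Lucas-Lehmer test is almost absurdly short in Python, since big integers are built in. A sketch (not necessarily the exact code I ran):

```python
def lucas_lehmer(p):
    """True if the Mersenne number 2**p - 1 is prime (p = 2 or an odd prime)."""
    if p == 2:
        return True  # M2 = 3 is prime
    m = (1 << p) - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m  # the Lucas-Lehmer recurrence, reduced mod 2**p - 1
    return s == 0
```

`lucas_lehmer(11213)` is exactly the computation described above; small exponents like 13 or 31 come back instantly.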
My daily sail upon the seas of the Internet blew me ashore at An Introduction to Scheme and its Implementation, a book by Paul Wilson on implementing Scheme. Scheme holds a bit of nostalgia for me: I took a class in Scheme from Will Clinger back in my graduate days at the University of Oregon.
C’mon folks. It’s 2007 for pete’s sake. Why are there still programs which can’t be built on 64 bit machines? Yes, it can be a pain. When we ported RenderMan, with all its mumble mumble lines of code, to the DEC Alpha, it was tiring, gritty, annoying, but absolutely necessary. And in the grand scheme of things, it’s not really that hard.
The reason that this is difficult (or can be difficult) is that a glass of liquid is often modelled as two objects (the glass and the liquid) with different refractive indices, and with surfaces that are largely coincident. It avoids the coincident-surface problem with a technique reminiscent of a scheme I once used to implement constructive solid geometry efficiently in a raytracer.
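As a concrete illustration of that CSG scheme (a sketch, with my own function names and made-up numbers): each solid reduces to the sorted, disjoint list of (t_in, t_out) spans where a ray is inside it, and the CSG booleans become simple interval operations:

```python
def intersect_spans(a, b):
    """t-ranges where the ray is inside both solids (CSG intersection)."""
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        lo = max(a[i][0], b[j][0])
        hi = min(a[i][1], b[j][1])
        if lo < hi:
            out.append((lo, hi))
        # advance whichever span ends first
        if a[i][1] < b[j][1]:
            i += 1
        else:
            j += 1
    return out

def subtract_spans(a, b):
    """t-ranges inside a but not b (CSG difference)."""
    out = []
    for lo, hi in a:
        for blo, bhi in b:
            if blo > hi or bhi < lo:
                continue  # no overlap with this hole
            if blo > lo:
                out.append((lo, blo))
            lo = max(lo, bhi)
            if lo >= hi:
                break
        if lo < hi:
            out.append((lo, hi))
    return out

# Hypothetical example: carve the liquid out of the glass, so the
# surfaces the ray actually sees are no longer coincident.
glass = [(1.0, 4.0)]    # ray enters the glass at t=1, exits at t=4
liquid = [(1.8, 3.5)]   # liquid occupies the middle of the glass
print(subtract_spans(glass, liquid))  # [(1.0, 1.8), (3.5, 4.0)]
```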
Anyway, some good ideas, I thought worth mentioning.
Over twenty years ago, I first read Structure and Interpretation of Computer Programs, which I still consider to be the most amazing computer science textbook ever written. I’ve known for a while that one of the authors, Gerald Sussman, has written a book on classical mechanics called (appropriately enough) Structure and Interpretation of Classical Mechanics. What’s cool is that the entire text is online. I was reminded of this while watching this lecture by Sussman on the occasion of Dan Friedman’s 60th birthday.
I must admit that most of the physics (and a great deal of the math) is over my head, but basically he’s talking about the systematic representation of knowledge (in this case, physics knowledge) as programs, and how math becomes more tractable and interesting when programming is made part of the process.
I had the vaguest recollection of a SIGGRAPH sketch on creating an armature wired with potentiometers as an inexpensive motion capture setup. A bit of digging revealed that this was done by Stefan Gustavson in 2002, and Tom was kind enough to run down this link to his page, which had somehow eluded my Google-fu.
I’ve worked on computer graphics for over twenty years now. Through some quirk of fate, I have spent nearly all of my time programming batch rendering algorithms that take minutes or even hours (days?) to render pictures. The last time I really did interactive work was before OpenGL even had the “Open” prefix, when I was working in the Princeton Applied Math department back in 1990. So, I’m not exactly the most experienced OpenGL programmer. I haven’t written anything more complicated than simple 2D plotting in years.
So, I was working on my program that reads the accelerometers from the Wii Remote tonight, and tried to set it up to draw an airplane that I could tip and tilt using the Wii remote. Unfortunately, I couldn’t get the lighting to look right. I tried all sorts of different things, but finally decided to do a bit of research and found this page:
What bit me? Why, the very first thing: glEnable(GL_NORMALIZE);. It never dawned on me that OpenGL might fail to compensate for scaling before computing lighting. Oh well. Live and learn.
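To see why it matters: fixed-function lighting computes the diffuse term from N·L, and after a glScale the transformed normal is no longer unit length (GL transforms normals by the inverse transpose of the modelview matrix), so the whole scene brightens or dims. A toy illustration in plain Python, with a normal that comes out at half length:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def length(v):
    return dot(v, v) ** 0.5

def lambert(normal, light_dir):
    """Diffuse term the way fixed-function GL computes it: N . L, clamped."""
    return max(0.0, dot(normal, light_dir))

n = (0.0, 0.0, 1.0)      # unit normal on the model
light = (0.0, 0.0, 1.0)  # unit vector toward the light

# Suppose scaling leaves the transformed normal at half length:
scaled_n = tuple(0.5 * c for c in n)
print(lambert(n, light))         # 1.0 -- correct shading
print(lambert(scaled_n, light))  # 0.5 -- everything is half as bright

# GL_NORMALIZE effectively does this before lighting:
renorm = tuple(c / length(scaled_n) for c in scaled_n)
print(lambert(renorm, light))    # 1.0 again
```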
I’ll have some example code and video of this little hack this weekend. I want to clean it up a bit more, and I still need to shoot some video of it.
An interesting bit of personal trivia: the model that I am using is a model of NASA’s X-29 experimental aircraft. It’s the same polygonal model that I first rendered with my old raytracer over twenty years ago. I feel like I’ve come full circle.
It’s actually not just hard in the sense you normally think of as hard. You might think that if you worked hard enough, you could figure out what programs actually do. Ed Felten shows that this simply isn’t true. In a simple Python script, he asks a question: “is there any combination of inputs to make the program print the word ‘Drat’?” Here’s the program:
import sys, sha  # Python 2's old sha module
# sha.new() wants a string, not the argv list:
h = sha.new(sha.new(" ".join(sys.argv[1:])).digest()[:9]).digest()
# (the rest of the program compares h against a fixed value and prints "Drat" on a match)
It usually surprises people who don’t have computer science degrees that there is no practical way to tell. Alan Turing formalized this idea: it’s basically impossible to provide an algorithm to determine any non-trivial property of programs (such as whether they halt). Any legislation which requires somebody to evaluate the complete behavior of any program is not just absurd and impractical, but simply not possible.
Addendum: The statement above is actually subtly incorrect (I should have referred back to my old theory of computation textbooks). I was trying for a description of Rice’s theorem. The idea is that there is no algorithm (equivalently, you can’t build a Turing machine) that decides in general whether another Turing machine (for example) accepts a particular string. Or halts. Or halts on an empty tape. Or, in fact, any other non-trivial property.
Using Rogers’ characterization of acceptable programming systems, this result generalizes to essentially any real programming language: no automatic method can decide, in general, non-trivial questions about the black-box behavior of computer programs. This is one explanation of the difficulty of debugging.
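The halting-problem half of this rests on the classic diagonal argument, which fits in a few lines of Python (a sketch; the stub decider is mine): hand any claimed halting decider a program built from the decider itself, and it must answer wrong.

```python
def make_diagonal(halts):
    """Given any claimed total halting decider, build a program it gets wrong."""
    def g():
        if halts(g):
            while True:  # the decider says g halts, so loop forever
                pass
        # the decider says g loops forever, so halt immediately
    return g

# A (wrong) decider that claims every program runs forever:
claims_loops = lambda prog: False
g = make_diagonal(claims_loops)
g()  # returns immediately, contradicting claims_loops(g)
```

Of course we can only actually run the branch that terminates; a decider that answered True would be refuted by the mirror-image branch, which loops.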
Years ago, I read Gleick’s Chaos, and my recent diversion into calculating various numbers to high degrees of precision made me think about Feigenbaum’s constant. While I didn’t write a program to compute it, I did recall how it is defined, in terms of the logistic map x → r·x·(1 − x). Feigenbaum’s constant is the limiting ratio between successive bifurcation intervals of the logistic map (as well as of all other one-dimensional maps with a single quadratic hump). You can read more here.
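Out of curiosity, here’s a quick period-doubling sketch (the transient length and rounding are my own choices): iterate the map past its transient, then count distinct orbit points. The r-gaps between successive doublings shrink by the Feigenbaum ratio ≈ 4.669.

```python
def attractor_period(r, transient=100_000, max_period=16):
    """Iterate x -> r*x*(1-x) past the transient, then count distinct
    orbit points (rounded) to estimate the attractor's period."""
    x = 0.5
    for _ in range(transient):
        x = r * x * (1 - x)
    seen = set()
    for _ in range(max_period):
        x = r * x * (1 - x)
        seen.add(round(x, 4))
    return len(seen)

for r in (2.9, 3.2, 3.5, 3.55):
    print(r, attractor_period(r))  # periods 1, 2, 4, 8
```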
I’m no expert on this stuff, it was just a momentary coffee break diversion.
In between sessions of Zelda (currently in the Lake Temple, total elapsed time just over fourteen hours), I’ve been considering just how cool the Wii Remote controllers are. For $40 (admittedly, quite a bit for a controller) you get a controller with the following nifty features:
It communicates wirelessly via Bluetooth, which opens up a whole bunch of possibilities for using them on other machines.
They include a fair supply of buttons, and a “rumble pack” force feedback device.
You can play sound back through the remote.
It’s got four LEDs that you have individual control over.
It has a three-axis, 3.6g accelerometer, so you can read acceleration and orientation from the remote.
It has an imaging infrared sensor that can be used to determine your orientation compared to the Wii sensor bar.
In short, it’s just a damned cool little device all to itself. I was surfing and discovered wiili.org, a site which is all about wii hacking, and they include links to drivers and libraries for reading the Wii controller on different devices. A download of libcwiimote later, and a bit of programming, and I was reading acceleration data from the remote. Behold the following graph plotted with gnuplot:
Near the beginning, I am holding the remote fairly steady. The red, green and blue lines measure the acceleration in x, y, and z, respectively. Notice that the blue line is offset. This is because of the acceleration due to gravity. Later in the test, I whirled the remote around in circles, which you can see by the rather chaotic motion.
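That gravity offset is also what makes the remote usable as a tilt sensor: when it’s held still, the acceleration vector is just gravity, so pitch and roll fall out of an atan2. A sketch (the axis and sign conventions here are my guesses, not anything defined by libcwiimote):

```python
from math import atan2, degrees

def tilt_from_accel(ax, ay, az):
    """Recover pitch and roll (degrees) from a *stationary* accelerometer
    reading: the measured vector is just gravity, so its direction gives
    the remote's orientation. Meaningless while the remote is waved around."""
    pitch = degrees(atan2(ay, (ax * ax + az * az) ** 0.5))
    roll = degrees(atan2(ax, az))
    return pitch, roll

# Remote flat on the table: gravity is entirely on the z axis.
print(tilt_from_accel(0.0, 0.0, 1.0))  # (0.0, 0.0)
# Rolled 90 degrees onto its side:
print(tilt_from_accel(1.0, 0.0, 0.0))  # (0.0, 90.0)
```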
I’m currently working on a little interactive display using OpenGL. More later.
Macrovision’s letter contains a lot of howlers, but I thought I’d bring your attention to a few of them.
Macrovision admonishes Steve by saying that DRM “is broader than just music”. Indeed? Do you think that Jobs, the single largest stockholder in Disney, doesn’t realize that?
Macrovision claims that DRM increases consumer value. Oh really? If I buy a song on iTunes, I can transfer it to only a limited number of devices. If I pirate it, I can copy it to any device I own. How is my consumer value enhanced? Macrovision says “consumers who want to consume content on only a single device can pay less than those who want to use it across all of their entertainment areas – vacation homes, cars, different devices and remotely”. What they are arguing is that they want you to pay for the same content multiple times. Since providing those bits to you is very cheap for them, I can understand why they might like this business model, but it’s simply astounding to say that it adds consumer value.
There is also a certain amount of irony in the claim that DRM needs to be interoperable and open. Macrovision has, of course, made millions by licensing their exclusive technology. I’m sure they’d be happy to help Steve out by licensing it to him, whether it is actually effective or not.
The short summary: the regular expression matching in common scripting languages like Perl and Python is many orders of magnitude slower than it needs to be for fairly large classes of regular expressions. I find this a little startling: the topic is well understood, and has been for roughly forty years. Ken Thompson wrote a paper on it in 1968, and a straightforward implementation of the ideas it contains runs a heck of a lot faster.
Interestingly enough, I learned how this stuff worked back in my third year of college. I took a compiler course, where we learned all about Thompson’s algorithm: how to convert regular expressions into non-deterministic finite automata, and then how to compile those into deterministic automata. This algorithm formed the basis of the lexer generator we wrote for that course, and it was not difficult to write (or even understand). I’m left wondering: what are they teaching kids in Computer Science these days? Oy.
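For flavor, here’s a toy Thompson-style matcher (literals, ‘.’, and postfix ‘*’ only; all names mine). Instead of backtracking, it advances the whole set of reachable NFA states one input character at a time, so it runs in O(pattern × text) even on patterns that make backtracking matchers explode:

```python
def match(pattern, text):
    """Anchored match over literals, '.', and postfix '*'."""
    tokens = []  # (char, starred) pairs; each is one NFA state
    i = 0
    while i < len(pattern):
        starred = i + 1 < len(pattern) and pattern[i + 1] == '*'
        tokens.append((pattern[i], starred))
        i += 2 if starred else 1

    def add(states, s):
        # add state s plus everything reachable by skipping starred tokens
        while s not in states:
            states.add(s)
            if s < len(tokens) and tokens[s][1]:
                s += 1
            else:
                break

    current = set()
    add(current, 0)
    for ch in text:
        nxt = set()
        for s in current:
            if s < len(tokens):
                c, starred = tokens[s]
                if c == ch or c == '.':
                    # a starred token can keep consuming; others advance
                    add(nxt, s if starred else s + 1)
        current = nxt
    return len(tokens) in current  # did any path reach the accept state?
```

Try it on the kind of pattern that hurts a backtracker: `match("a*" * 20 + "a" * 20, "a" * 20)` answers True without breaking a sweat, because the state set never exceeds the pattern length.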
Addendum: During the same time period, I had a personal epiphany. At the time we were using a VAX 11/750, and a fast version of fgrep, called match, written by Peter Bain, came across the old mod.sources newsgroup. It used the VAX’s dedicated MATCHC instruction to search for matching strings. The funny thing was, soon afterward a whole bunch of even faster implementations came down the pike. It taught me that a smarter algorithm often beats better hardware.