As I was driving in this morning, I entertained a train of thought that led me back to thinking about peer-to-peer networks. I recalled that I had seen a posting by Ed Felten some years ago about implementing a peer-to-peer networking scheme in a very few lines of Python. A few minutes of searching once I reached my desk reminded me that it was called TinyP2P, and that it was 15 lines of dense Python. A search of Ed's Freedom to Tinker website revealed nothing but dead links, but a few minutes with the Internet Archive's Wayback Machine resurrected it for our consumption:
I recall that I tried to run this before and had some difficulty, but I haven't tried again. Still, the basic idea is pretty interesting: you create a network of XML-RPC servers whose basic job is to ensure that every node ends up with a copy of every file. It uses the SimpleXMLRPCServer classes and some overly terse idioms to accomplish its task.
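To make the idea concrete, here's a minimal sketch in the same spirit (this is not Felten's actual code; the class and function names are my own invention), written for readability rather than line count:

```python
# A toy file-syncing node in the TinyP2P spirit: each node is an
# XML-RPC server that can list its files, serve their contents, and
# pull anything it is missing from a peer.
import os
import xmlrpc.client
from xmlrpc.server import SimpleXMLRPCServer


class FileSyncNode:
    def __init__(self, directory):
        self.directory = directory

    def list_files(self):
        """Return the names of the files this node holds."""
        return sorted(os.listdir(self.directory))

    def get_file(self, name):
        """Return the contents of one file (text, for simplicity)."""
        with open(os.path.join(self.directory, name)) as f:
            return f.read()

    def sync_from(self, peer_url):
        """Fetch any files a peer has that we don't; return their names."""
        peer = xmlrpc.client.ServerProxy(peer_url)
        fetched = []
        for name in peer.list_files():
            if name not in self.list_files():
                with open(os.path.join(self.directory, name), "w") as f:
                    f.write(peer.get_file(name))
                fetched.append(name)
        return fetched


def serve(directory, port):
    """Expose a directory as a sync node on the given port."""
    server = SimpleXMLRPCServer(("localhost", port), logRequests=False)
    server.register_instance(FileSyncNode(directory))
    server.serve_forever()
```

A network of these nodes, each periodically calling sync_from() on its known peers, would eventually converge on everyone holding every file, which is essentially the behavior TinyP2P squeezes into 15 lines.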
Here's the thinking that leads me to here: I've been listening to a lot of stuff about the scalability of Facebook lately. Ironically, the scalability doesn't contribute to user experience directly: no user uses very much of any of the Facebook resources. Facebook's 60,000+ servers serve something like 200M+ users every day, so obviously each user is getting a very tiny fraction of any single server machine, a fraction so small that it could be easily subsumed by even the most tiny computing appliance. It is only the centralization which causes the need for scalability.
And, of course, a distributed architecture has the possibility of fixing a few of Facebook's other ills, such as allowing more direct control over privacy.
So, here's the idea: use ideas similar to TinyP2P to implement a distributed social network like Facebook. Implement it in a simple language like Python, and in a straightforward way (don't be overly terse like TinyP2P). Pay attention to security, but also make it simple enough that people can adopt it by downloading a single Python file and running it in a directory of their choosing on their machine at home.
It's just a thought experiment at the moment. We'll see if it gains traction in my head.
Jared Newman of PC World thinks that Apple blew it with multitasking on the iPhone:
Here's the funny thing though: all these industry pundits keep ranting about how features of the iPhone or iPad are really terrible, and yet when polled, people who buy iPhones are overwhelmingly satisfied with their phone choice. Perhaps pundits should reconsider that they don't understand the phone market as well as Apple does.
Newman states (correctly) that multitasking is something that has to be explicitly coded into applications. And here's the odd thing: it would be better for the phone as a whole if most developers simply chose not to.
It's all about battery life. If your program really doesn't have anything to do when you aren't staring at the screen, the iPhone will nicely not let it do anything when you shift to another process. It basically shuts down your application, perhaps storing some information so that when you re-run the program, you can go right back to where you left off. Many apps are kind of lazy, and just restart from scratch (often displaying time-wasting banners) each time, which is one of the reasons people claim they want multitasking. For 90% of these cases, just writing the applications better would be sufficient.
But I'll be the first to admit that there are processes which don't fit this model, and actually do require some CPU time while the user isn't staring at them. For instance, Google Latitude wasn't really useful on the iPhone, because it was only a web application, and therefore only ran while you had it open in Safari: it couldn't report your position while you were doing pretty much anything else. You'd like to keep your music playing. Perhaps you'd like some long downloads to complete. All good stuff.
With iOS 4, Apple has given you the capability to make that happen. And yes, as a programmer you have to be explicit about it. It's a teensy bit tedious, but frankly, if you are going to make an effective app, you should really be on top of the resources your application needs. In particular, you shouldn't view your application as so important that cutting your client's battery life unnecessarily is a virtue.
Just a quick aside about the iPhone task manager: in most cases, you don't need to access it at all. Hitting an app's button in the task manager is just like hitting its icon on the main screen. If it is more convenient to find the application on any of your home screens, just do it that way. You need the task manager only to delete processes, and properly coded apps don't really need to be explicitly deleted anyway: if the iPhone needs to shut down an application because it runs out of memory, it will do so in a well-defined way that allows you to restart the application in a consistent state.
Is iPhone multitasking amazing? No, not really. But neither is it horrible. It's a reasonable set of design choices to ensure good user experience.
32-bit processors now cost $1. That means that you can build a fairly competent video game for the price of a coffee. Check it out:
Addendum: The prototype is actually built on a small board that can be ordered from Digikey for about $30. The software is a teensy bit clever (maybe more than a teensy bit). It generates video entirely via interrupts, but the clever part is that it uses the chip's SPI interface in a way that lets you generate the necessary chroma signals without overburdening the main CPU, which generates the luma part of the signal. Quite clever.
It's good to see that 16 year old Abby Sunderland has been rescued by a French fishing vessel. She was attempting to set a world record by sailing around the world solo.
A lot of people have been commenting about how reckless, negligent, etc. her parents were for letting someone so young attempt such a dangerous voyage. The claim is that a sixteen-year-old is simply incapable of comprehending the risks and the dangers involved.
I understand this concern, but I don't share it.
Our society has become incapable of understanding risk. We tolerate the thousands who die in automobile accidents, yet when one person tries (and fails) to set off a bomb in his shoes, forever after we all are forced to take off our shoes at airports. We let kids drive at 16, but they can't vote until they're 18, and we mostly hold off drinking until age 21. And at each age, there are kids who take these new possibilities and handle them responsibly, and there are those who never become responsible at any age.
Abby comes from a family full of experienced sailors. Her brother made a similar journey at age 17. While I do not fully comprehend the compulsion that would cause someone to embark upon such an adventure, neither is it completely mystifying to me. Such an endeavor cannot help but be profoundly empowering, pushing oneself to the limits of endurance, resourcefulness and skill.
I'm perfectly willing to accept that Abby and her parents are better judges of what she is capable of doing than we are. By all accounts, she was well equipped, well trained, and if you take the time to read her blog, emotionally mature enough to handle a journey which would certainly cause me to question my own training and maturity at age 46.
To Abby: condolences on the loss of Wild Eyes, but it is good to hear that you are safe. Even though your record setting attempt has ended, I am sure that further adventures await you, no matter what directions you choose for yourself.
Bon voyage, and Godspeed.
Lots of amateur astronomers use Registax to do what is known as "lucky imaging". The idea is that you record a bunch of video frames and hope that you capture some moments of good seeing, which you then carefully align and average to remove noise, and then enhance. I was playing around with a sequence of 1200 frames that I found online, and in my first attempt at using this program, came up with this image of Saturn. Not bad. I keep thinking I should do some video astronomy, but never quite get around to it. Perhaps that should change.
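For the curious, the core of the stacking step is simple enough to sketch. This toy version is my own illustration, not Registax's algorithm: it aligns each frame on its brightest pixel and averages, where real tools use sub-pixel cross-correlation over thousands of frames.

```python
# Toy "lucky imaging" stack: align frames on their brightest pixel,
# then average them per-pixel to beat down the noise. Frames are
# plain lists of lists of brightness values.

def brightest(frame):
    """(row, col) of the brightest pixel, used as the alignment anchor."""
    return max(((v, r, c) for r, row in enumerate(frame)
                for c, v in enumerate(row)))[1:]

def stack(frames):
    """Shift every frame so its brightest pixel lines up with the
    first frame's, then return the per-pixel average."""
    rows, cols = len(frames[0]), len(frames[0][0])
    r0, c0 = brightest(frames[0])
    acc = [[0.0] * cols for _ in range(rows)]
    for frame in frames:
        r, c = brightest(frame)
        dr, dc = r0 - r, c0 - c            # shift needed to align this frame
        for i in range(rows):
            for j in range(cols):
                si, sj = i - dr, j - dc     # source pixel in this frame
                if 0 <= si < rows and 0 <= sj < cols:
                    acc[i][j] += frame[si][sj]
    n = len(frames)
    return [[v / n for v in row] for row in acc]
```

Averaging N aligned frames cuts random noise by roughly the square root of N, which is why 1200 frames of a jittery planet can yield a surprisingly clean image.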
I am always on the lookout for people who build interesting computers from scratch. Here's another nifty one: a 4-bit CPU called the Duo 128 Elite by Jack Eisenmann. It's a pretty obscure little architecture, but it's cool enough to play Pong on an 8x8 LED display.
I'm back again fighting the battle against my weight, cholesterol and blood pressure. As a guy who tries to be very rational and very scientific, I'm constantly looking for, and constantly frustrated by the lack of, rigorous and useful scientific data to guide me in changing my behavior toward a better health outcome.
For instance, when I was a kid, we were warned about eating too much sugar. Sugar, it was said, was the major cause of hyperactivity and tooth decay. In fact, both of those appear to be myths. Parents have been told that their kids will be more hyper when they've had sugar, and often apply a purely subjective standard to their children's behavior depending upon whether they know they've had sugar or not. And pure sugar is actually less of a problem than starchy carbohydrates: tooth decay is primarily caused by bacteria which feed on sticky starch stuck to the teeth, emitting acids that attack tooth enamel.
By the late seventies, the blame had shifted to dietary fat. It seems logical: if you are getting fat, it's probably, well, because you ate too much fat. So a huge industry arose to feed America based upon this wisdom: we saw the creation of larger and larger amounts of "low-fat" and "non-fat" foods, which we as Americans bought in vast quantities. And the net result of all this was that obesity rates skyrocketed. In other words, when millions of Americans did what the experts told them to do and lowered their fat intake, they became fatter and fatter, and dramatically so.
The Atkins "revolution" told us that much of what we learned was a lie: that in fact it wasn't fat which made us fat, it was carbohydrates. Our new demon was bread: you could eat all the steak you wanted, as long as you didn't eat any of that bread.
The fact of all of this is that food metabolism is very complex, and much of what people say about diet is not only incorrect, but in fact can't possibly be correct. For instance, the conventional wisdom is that "a calorie is a calorie": it doesn't matter whether you eat fats or carbs or proteins; if you run an excess of around 3500 calories, you will put on a pound of fat.
But this can't possibly be true. Let's say that it were, and you were one of those lucky people who "hasn't gained a pound since college" (let's say that's 10 years ago). That means that over the span of 3653 days (we'll round), your total caloric intake as compared with your energy expended has to be balanced to within 3500 calories. That requires a precision of about one calorie per day. I burn a calorie in a few seconds of walking on a treadmill. Does that make any sense to anyone?
There obviously has to be more at work here.
I got on this track courtesy of Jay Parkinson's blog. He's an MD, and was writing about a retailer who pulled a T-shirt from their stores which carried the message "Eat Less". He said that he thought it was a bad thing, because anorexia accounted for a mere 1% of the population, while 35% of us were obese. According to figures provided by UN FAO (and linked by Parkinson) Americans consume on average 3790 calories per day. The American Heart Association suggests that I eat between 2000 and 2400 calories as a moderately active adult male in my age group. Let's say that my activity level burns 2200 calories per day, but that I am one of these "average" Americans. This means that I am overeating to the tune of 1590 calories per day. I should put on more than a pound every three days. If weight gain (and loss) were really as simple as Parkinson would have you believe, the real question isn't "why can't I seem to get thin" but rather "why aren't we all bursting our skins?"
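The arithmetic behind both of these back-of-envelope claims is easy to check, using the usual rule of thumb of roughly 3500 excess calories per pound of fat:

```python
# Back-of-envelope check of the two claims above.
CAL_PER_POUND = 3500  # rough rule of thumb for a pound of fat

# Claim 1: staying the same weight for 10 years (~3653 days) requires
# balancing intake against expenditure to about one calorie per day.
days = 3653
print(CAL_PER_POUND / days)      # ~0.96 calories per day of precision

# Claim 2: the "average American" intake of 3790 calories against a
# 2200-calorie burn is a 1590-calorie daily surplus, which the simple
# model says is a pound of fat roughly every 2.2 days.
surplus = 3790 - 2200            # 1590 calories per day
print(CAL_PER_POUND / surplus)   # ~2.2 days per pound gained
```

Either the 3500-calorie rule isn't the whole story, or something else regulates the balance far more tightly than conscious eating ever could.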
Someone left this link to a talk by Gary Taubes on Jay's discussion board. I think it's pretty interesting, and points out some of these fallacies and the likely causes. I think that if I could internalize some of the lessons, it might prove to be helpful.
Can someone (preferably somebody who's very keen on baseball, especially sabermetrics) answer me a question?
Tonight I was at the game between the Athletics and Twins in Oakland. After trailing 1-3, the A's score twice in the bottom of the eighth inning to tie the game up going into the ninth. Danny Valencia strikes out. With nobody out, Justin Morneau replaces Brendan Harris.
Okay, here's the question: you are in a tie game, with nobody on base. Justin Morneau is batting .376, to be followed by Nick Punto, who is batting around .200. What do you do?
The Athletics chose to intentionally walk Morneau. Sadly, they also ended up walking Punto, after they pulled Morneau for a pinch runner. Span then hits a grounder and Punto is out at second, but Span beats the throw, and we have runners at the corners with two out. Tolbert (batting about .167) pokes a shot out to center field, and the Twins score.
I can't understand the utility of walking Morneau. Yes, he's batting .375 or whatever, but that means that over 60% of the time he doesn't reach base, leaving you with two outs and Punto, Span, and Tolbert to face. If he singles, you are in exactly the same place you would be if you intentionally walked him. So you are betting that a roughly 62% chance of making an out is less desirable than a 34/191 (34 extra-base hits in 191 at-bats) chance of giving up an extra-base hit. Sure, I haven't quite factored in the chance that you accidentally walk Morneau anyway, but I can't help but think that the intentional walk is the wrong play.
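For the record, here are the raw numbers quoted above (just the quoted figures, not a full run-expectancy analysis):

```python
# The figures behind the intentional-walk question.
avg = 0.376              # Morneau's quoted batting average
p_out = 1 - avg          # chance he makes an out if you pitch to him
xbh_rate = 34 / 191      # extra-base hits per at-bat

print(round(p_out, 3))     # 0.624
print(round(xbh_rate, 3))  # 0.178
```

So pitching to him ends the threat about 62% of the time, while the outcome the walk is guarding against (an extra-base hit) happens only about 18% of the time. A proper answer would weigh the run expectancies of each outcome, but the raw odds make the walk look dubious.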
What do others think?
Two different amateur astronomers detected an object impacting Jupiter on June 3. Catch the video: it's pretty impressive, and shows that amateurs can make interesting observations of our universe. If you haven't looked through a telescope lately (or a good one) this video shows the role that atmospheric conditions play. As you watch carefully, fine details come and go in the span of just a few frames. One of the more interesting "revolutions" in amateur astronomy is to use video cameras to capture these moments of good seeing, and "stack" the resulting images into high quality composites. Anywho.... thought it was brilliant. Congratulations to Anthony Wesley and Christopher Go for these truly rare images.
I was listening to Leo Laporte and Steve Gibson's Security Now podcast as I was commuting this morning, and found that Steve Gibson said something which clarified how I feel about Facebook and Twitter.
Lots of people are upset about Facebook privacy concerns. I'm not really among them. If I post something on Facebook, I pretty much understand that I'm publishing it and won't have any control over where the information goes. And really, how could I expect any different? Facebook's entire business model is to aggregate information about you and share it with others. They don't want your information to be private, because they can't do anything with your private information. Facebook entices you to register by not showing you what your friends are doing unless you do. And then, it entices you to add everyone in your email contact lists. It encourages you to type in information about who you are, when you were born, where you live, and what you are doing. By doing so, it can figure out all sorts of good stuff about you, and sell that information to others.
Facebook has the power to make us all celebrities. But that means that while we might get fans, we might also get paparazzi. Fame has a cost, and we should perhaps come to grips with it ourselves, rather than asking Facebook to do it for us.
Twitter is almost the anti-Facebook. You can view anyone's Twitter feed without joining. You can see who any Twitter user follows and who follows them without becoming a Twitter user yourself. To post, you need to register, but the only things it asks for are a username and a "Full Name", which can be a complete pseudonym. Everything about Twitter takes place in public, so there is never any concern about privacy: you have none. They aren't selling your information, because any advertiser can already get access to anything you post on Twitter. Anyone can.
And when Gibson put it this simply, it made me realize that I'm actually more interested in Twitter as a result. If I wanted to share private information, I already have the means to do so, and probably should do so with more thought than I really give Facebook. But if I want to share information publicly, having a bunch of privacy protections in place is unnecessary.
Yesterday was an interesting day in baseball. In the last month, we've seen two perfect games pitched: the first by Dallas Braden, and the second by Roy Halladay. For those of you who aren't big baseball fans, those were only the 19th and 20th perfect games recorded in Major League Baseball history. The last time two had occurred in the same year was 1880.
Which brings us to yesterday. June 2, 2010, in a matchup between the Detroit Tigers and the Cleveland Indians. The pitcher for the Tigers was Armando Galarraga, who had recently been called up from the Detroit Triple-A affiliate and placed in the starting rotation. His ERA going into the game was an unremarkable 4.50.
He pitched eight and two-thirds innings, with no hits and no walks. Another perfect game in the making? The 27th batter was Jason Donald, who hit a grounder to the right side which was fielded by first baseman Miguel Cabrera, who tossed to Galarraga covering first base. A perfect game!
But wait... the umpire Jim Joyce called Donald safe!
Wow. If there is one thing that is even rarer than perfect games, it's perfect games that are spoiled by the 27th batter. There were nine prior to last night. I actually was lucky enough to see one (on TV, not live) when Mike Mussina of the Yankees gave up a hit to Carl Everett of the Red Sox in September, 2001 (the last time it happened).
But here's the tragic thing: the umpire completely blew the call. Donald was out by a step. A long step. Joyce just flat out blew the call. Upon seeing the replay, he admitted he blew it. But baseball doesn't have instant replay, so the ruling stands, and Galarraga misses out on being the 21st pitcher to throw a perfect game.
Okay, that's the background: here's my take.
Give the kid the perfect game. Donald was clearly out. As far as I can tell, everyone involved, from teams on both sides to the umpire agree that he should have been called out. It would have been the end of the game, so there is no needless speculation of how it would have changed the game: the game would have been over, except that Donald has one less hit in his batting average, and Galarraga would be properly recorded as the 21st pitcher to throw a perfect game. Any other outcome is a travesty of rules over substance. The rules should enable us to get the call right, not require that a wrong call be made official.
And cut Jim Joyce some slack. He blew a call. Yes, it was a bad call, but he freely admits it and would absolutely reverse his call if it were in his power to do so. You don't make mistakes at your job? Get over it.
It seems like the last month has been rife with stories of corporations doing things that annoy and irritate their customers. Facebook privacy concerns. Google sniffing Wi-Fi. Apple rejecting apps for inscrutable reasons. Heck, BP not checking their blowout preventer.
Each of these has caused me a bit of annoyance: some perhaps more than it should, some less. But the subject of today's rant is AT&T's New Lower-Priced Wireless Data Plans. Their own press release says that customers can choose between two new more affordable plans: "either a $15 per month entry plan or a $25 per month plan with 10 times more data."
Wow, sounds great huh?
Well, for some people it probably is. Maybe even for me. Let's look at my data usage over the last few months:
As you can see, I'm probably safe in the 2GB usage pattern, so I could in theory sign up for the 2GB data plan for $25 and save myself $5 a month. So why am I unhappy?
First of all, there are overages. Cell phone companies love to charge you overages. Let's say that you sign up for the DataPlus plan, and use 201 megabytes in a month instead of 199. AT&T will nicely charge you an extra $15 for the next 200 megabytes. And another $15 for the block after that. Let's say you have a bad month, like I did, and you use 1GB of transfer: AT&T will charge you $75 for that 1GB. If you had just signed up for their DataPro plan, you would be charged only $25, and you'd get twice the data. It's not like they have to pay some human overtime to come in and move your data around: the data is already delivered. They could charge you less, but they choose to charge you more, to entice you to do what I do, which is to sign up for a more expensive plan as a hedge against large overages. My current unlimited data plan is the best kind of hedge: a fixed-rate plan. I typically use about 1/4 of what the 2GB limit would give me, but I don't have to worry: if I need the bandwidth, it's there. With the new plans, I have no such guarantee, meaning I have to watch my usage more carefully, which is an added mental annoyance that I didn't have before.
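The overage math above is simple enough to sketch, using AT&T's announced 2010 prices ($15 for the first 200MB on DataPlus, plus $15 per additional 200MB block):

```python
import math

# DataPlus pricing: $15 buys the first 200MB, and every additional
# 200MB block (or fraction thereof) costs another $15.
def dataplus_cost(mb_used):
    blocks = max(1, math.ceil(mb_used / 200))  # first block is the base fee
    return 15 * blocks

print(dataplus_cost(199))   # $15: just under the cap, base price only
print(dataplus_cost(201))   # $30: two megabytes over costs another $15
print(dataplus_cost(1000))  # $75: the bad 1GB month described above
```

That last figure is the sting: the same 1GB month on DataPro is a flat $25, with another gigabyte to spare.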
While I'm on this kick, here's another pet peeve about overages. The phone company knows how many minutes you've used. They know how much data that you've used. They could just give you the option of having your service stop when you reach your limit. Previously, this was declared "impossible" by AT&T, with the net result that in a fit of teen... shall we say... indiscretion my son managed to run up a $700 cell phone bill by exceeding his minutes. Now, AT&T has the mechanism in place, but will charge you $4.99 for that privilege. That's just extortion.
The second thing that annoys me is that AT&T is finally offering cell phone tethering. "What's annoying about that?" you ask. Well, previously you couldn't get tethering on the iPhone from AT&T, but today they announced that you can get it for $20. And for that... you get... well, pretty much nothing. Yes, you can hook your laptop to the network via your iPhone, but you don't get any extra bandwidth. AT&T is charging you more for the bits you send from your laptop, based solely on their point of origin. Sure, they might reasonably expect that users who make use of tethering will use their data connections more, but they are already going to shaft you when you hit your overages anyway. That's what those overages are meant to deter. To spend an extra $20 on top of that seems absurd.
Lastly, of course, the sizing of these plans may be adequate today, but as network speeds improve (well, on OTHER cellular networks anyway) and as the demand for bandwidth from applications like video grows, these plans will become increasingly burdensome for more and more consumers. Which, of course, AT&T will be happy to charge you for, as you rack up more overages.
I get the motivation: there are people out there who use many times even my usage, and pay no more than I do. I've heard that 3% of cell phone users account for 40% of all data transmitted in the Bay Area (read it somewhere today, didn't save the link, but even if the number is wrong, it's probably not very wrong). Obviously, AT&T would love to get those people off the network, or at least lower their usage, because then the network behaves as if they upgraded it: they have more available bandwidth that they can sell to more customers. Heck, I'm not even really objecting to the pricing: it's a powerful incentive to lower usage of the 3%, while actually lowering my bill MOST of the time by $5 a month. But cell phone companies already have a lot of hostile practices in place that are bad for consumers. They charge a fortune for text messages, which is idiotic. They limit voice minutes, and place no real limit on overages. They charge you for early termination. Activation. Enough. We love our cell phones, we want to use your product, but you guys have to toss us a bone once in a while.
I suspect that it might be better to pick up an iPod touch and run Skype most of the time, and get a Pay As You Go cell phone to keep in my car for emergencies. I'd probably save $400 a year on cell phone bills, and it would piss me off less.
This concludes the rant of the day.