Okay, so here’s the deal: I’ve been working on my weather satellite decoder off and on for a few weeks, cleaning up the code and trying to make it simpler and better explained in preparation for eventual publication. I’m now at the point where I’m trying to do the whole “sync detection” part, so that each scanline is placed directly below the previous one. But if you look at the image on the right, you’ll see that the problem isn’t just finding the one correct line length: the lines actually vary in length, becoming longer as the picture moves down, which causes the image to drift to the right.
I immediately thought, “this has something to do with Doppler shift”, which is correct, but I didn’t have any real justification or mathematical basis for my understanding. It was, of course, a guess. I was walking back from lunch, explaining this to Tom, and I said, “the odd thing is that I don’t understand how the lines could be getting longer.” After all, the satellite emits 2 scanlines per second from its point of view; that doesn’t change when it’s received, right?
Tom looked at me like I was stupid.
Which was of course enough to spur me to rethink, and realize that yes, I was being stupid. Let’s say that the satellite emits a 100hz tone for one second. The received signal will be Doppler shifted up or down based upon the relative velocity of the satellite and the ground station. If the satellite is moving towards the ground station, the Doppler shift will move the frequency up. The received signal will complete the same number of cycles, but at a higher frequency, so it has to take less time. Similarly, for a satellite moving away, the received frequency will be lower, and each scanline will take longer than the nominal line time.
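To convince myself of the magnitude, here’s a quick back-of-the-envelope sketch. The numbers and helper names are mine, not anything from the decoder; it just applies the classical Doppler relation to a one-second tone:

```python
# Back-of-the-envelope: how much does a one-second tone stretch or shrink
# under Doppler shift?  (My numbers, not from the decoder.)
C = 299_792_458.0        # speed of light, m/s

def received_duration(emitted_duration_s, radial_velocity_ms):
    """Duration of the signal as heard at the ground station.

    radial_velocity_ms > 0 means the satellite is approaching.
    Classical Doppler: f_rx = f_tx * (1 + v/c), so the same number of
    cycles arrives in less time when the satellite approaches.
    """
    return emitted_duration_s / (1.0 + radial_velocity_ms / C)

# A low-earth-orbit satellite moves at roughly 7.8 km/s; near AOS much of
# that velocity can be along the line of sight.
approaching = received_duration(1.0, 7_800.0)    # slightly less than 1 s
receding    = received_duration(1.0, -7_800.0)   # slightly more than 1 s
print(approaching, receding)
```

The fractional change is only about v/c ≈ 2.6×10⁻⁵, but over thousands of scanlines those fractions accumulate into a visible drift.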
In my case, at AOS the satellite is approaching, and the line rate will be greater than the nominal 2hz. At max elevation, the satellite is moving perpendicular to the observation vector, and the line rate will be precisely (well, not precisely, as pointed out by Steve below, but close enough for my purposes…) 2hz. At LOS, the line rate will be less than 2hz. Throughout the pass, each successive line is slightly longer than the one before, which causes the steady drift to the right we see in the images as decoded by my simple decoder.
Duh Einstein. I find again that when you think about things the right way, the answer seems obvious.
Addendum: if we could trust the sampling rate of the sound card, we could determine the exact point where the satellite passed max elevation by finding the place where the line rate was precisely 2hz. I’m not sure how accurate your average sound card is. I’ll think about it some more.
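The idea might look something like the sketch below. The function name and the toy data are hypothetical (the real decoder would measure line durations from detected sync pulses), but it shows the basic zero-crossing search:

```python
# Hypothetical sketch: if the sound card's sample rate is trustworthy, the
# satellite passes max elevation roughly where the measured line duration
# crosses the nominal 0.5 s (2hz).  Names and toy data are mine.

def crossing_time(times, line_durations, nominal=0.5):
    """Linearly interpolate where line duration crosses `nominal`.

    Assumes durations increase monotonically through the pass (shorter
    than nominal while approaching, longer while receding).
    """
    for (t0, d0), (t1, d1) in zip(zip(times, line_durations),
                                  zip(times[1:], line_durations[1:])):
        if d0 <= nominal <= d1:
            # linear interpolation between the two bracketing lines
            return t0 + (nominal - d0) / (d1 - d0) * (t1 - t0)
    return None

# Toy pass: durations drift from just under to just over 0.5 s.
times = [0, 100, 200, 300, 400]
durs  = [0.49998, 0.49999, 0.50000, 0.50001, 0.50002]
print(crossing_time(times, durs))  # crosses right at t = 200
```

Of course, this inherits any error in the sound card’s clock: if the card runs fast or slow, the apparent 2hz point moves accordingly.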
Actually, the point where the Doppler shift is 0 is not when the satellite is directly overhead: when the satellite’s velocity vector is exactly perpendicular to your line of sight to it, it is still experiencing a bit of time dilation due to its relative velocity, so its apparent frequency will still be shifted down a bit. The point at which the satellite’s Doppler shift is 0 will actually be a bit before it passes directly overhead, when the Doppler shift up from its relative approach velocity cancels its time dilation.
If it wasn’t so late at night I’d do the math to get the angle based on the velocity.
I know that musing about relativistic physics is one of your passions Steve :-), but my intuition tells me that at the velocities and line rates I’m dealing with, any time dilation will result in errors which are much smaller than a pixel. You are correct though: I was not taking relativity into account, and it was wrong of me to invoke the name Einstein without doing so.
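To back that intuition up with numbers (my own back-of-the-envelope arithmetic, not anything from Steve’s comment), the relativistic correction for a LEO satellite works out like this:

```python
# How big is the relativistic correction for a LEO satellite?
# (My arithmetic; the 7.8 km/s figure is a typical LEO orbital speed.)
import math

C = 299_792_458.0   # speed of light, m/s
V = 7_800.0         # typical LEO orbital speed, m/s
beta = V / C

# Fractional frequency shift from time dilation alone: 1 - 1/gamma ~ beta^2/2
gamma = 1.0 / math.sqrt(1.0 - beta * beta)
dilation_shift = 1.0 - 1.0 / gamma
print(dilation_shift)                # a few parts in 10^10

# First-order Doppler shift is beta * cos(theta), so the zero crossing is
# where beta * cos(theta) = dilation_shift, i.e. cos(theta) ~ beta / 2:
# a hair before the velocity vector is perpendicular to the line of sight.
offset = math.asin(dilation_shift / beta)   # radians before perpendicular
print(math.degrees(offset))
```

A fractional rate error of ~3×10⁻¹⁰ accumulates to well under a microsecond over a whole pass, while a single pixel of an APT line spans a couple hundred microseconds, so Steve’s correction is real but invisible at image resolution.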
For now, I’m trying to get a good curve fitter working, then I’ll apply RANSAC, then hopefully I’ll be able to resample the image properly.
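The fit-then-RANSAC plan might be sketched like this. Everything here is hypothetical toy data with made-up names (the real decoder will fit detected sync-pulse positions, and the drift needn’t be linear), but it shows how RANSAC shrugs off bad sync detections that would wreck a plain least-squares fit:

```python
# Sketch of a line fit made robust with RANSAC.  Toy data and hypothetical
# names; the real decoder fits sync-pulse positions against line number.
import random

def fit_line(points):
    """Ordinary least-squares fit y = a*x + b."""
    n = len(points)
    sx = sum(x for x, _ in points); sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points); sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def ransac_line(points, iterations=200, tolerance=1.5, rng=None):
    """Fit a line while ignoring outliers (e.g. bad sync detections)."""
    rng = rng or random.Random(0)
    best_inliers = []
    for _ in range(iterations):
        a, b = fit_line(rng.sample(points, 2))   # minimal sample: 2 points
        inliers = [(x, y) for x, y in points if abs(a * x + b - y) < tolerance]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    return fit_line(best_inliers)                # refit on all inliers

# Toy data: sync position drifts ~0.05 px per line, plus two bogus detections.
points = [(i, 0.05 * i + 10.0) for i in range(100)]
points[17] = (17, 400.0)    # outliers from missed syncs
points[63] = (63, -250.0)
slope, intercept = ransac_line(points)
print(slope, intercept)     # close to 0.05 and 10
```

Once the drift model is in hand, resampling each line to a common length is just an interpolation over the fitted line boundaries.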