Dusted off my Softrock Lite 2 for 40m

I haven’t blogged much about my Softrock 40 since I finished its construction. I knew that it basically “worked”, but I was pretty sure that the opposite sideband cancellation was quite poor, and I wasn’t thrilled with the various software options available, so it sat on my dining room table for a while. But this weekend was the CQ World Wide WPX CW contest, so I figured the bands would be crowded with CW signals and it was a good time to dust it off. I also have a new netbook that my wife got me for my birthday, so I downloaded Rocky, a recommended application for software defined radio, and gave it a whirl. I was using an EMU 0202 USB sound card, which I couldn’t get to work at 96 kHz, but at 48 kHz all seemed well. I recorded 15 minutes of audio to play with and generally had some fun (I might do a screencast of it sometime in the future), but I also dusted off the decoding program I wrote, and decoded 15 seconds of the audio, yielding the following:

Centered around 7.056 MHz, decoded with my own software

One of the diagnostics suggests that I am not getting one of the channels at all, which certainly would explain why I am getting no opposite sideband cancellation. I’ll try hooking it up to the oscilloscope to see what the problem is.

If anyone wants to play with the audio I used to generate the stuff above, here’s a link to the 4.1Mbyte file. Let me know what you discover.

Addendum: Yep, one channel is completely dead. Not sure if it is a circuit problem or a cabling problem.

Audio File, normalized, viewed in Audacity
Same file, with Spectrum

Electronic Publications of Richard Pask

I’ve run across this page a couple of times, but keep forgetting to bookmark it. Pask has written a great deal about checkers, and these are very nice free references for the aspiring checkers player to have on hand. His Key Openings is of particular interest to me at the moment.

Electronic Publications of Richard Pask

CQ World Wide WPX Contest Coming This Weekend

I was reminded that the CQ WW WPX contest is coming up this weekend. I’m not much of a contester, but perhaps it would be a good opportunity to get my Softrock receiver for 40m online and to record a few hours of activity as the basis for future DSP experiments. Activity is always high for this contest, and it will provide an interesting test.

CQ World Wide WPX Contest – Home

Attempt at Lightning Photography

Some people have been using the CHDK firmware to do motion detection and capture pictures of lightning strikes. Despite the fact that lightning has been hitting Tampa all week, I didn’t get around to trying this until last night. I wasn’t entirely successful, but I did capture these two exposures:

This lightning strike hit almost directly overhead, and was really bright and loud.

Milhouse needs an infusion of actual checkers knowledge

This morning, I woke up and was trying to resolve a problem: why is Milhouse (my checkers program) so slow? The answer didn’t seem entirely obvious, so I pondered it some more. I refined it to “why is Milhouse so slow when it accesses the endgame database files?” I think I know why, but unfortunately, it will require some additional thought and work to remedy.

To explain, let me recap what happens. Put in simple terms, Milhouse does an alpha-beta search to some depth. As it traverses the tree in depth-first order, it may realize that certain paths are futile (it can prove that no leaf in the subtree will be the right move regardless of what value it has, even without searching). Eventually you get down to a leaf node and “evaluate” it (run a simple function which gives a figure of merit). These values get “backed up” through the tree, and one of them ultimately wins as the value of the search.

In other words, every time you search, the score that you get is the evaluation of some node on the search frontier.

Let’s imagine for a moment that you have a simple evaluation function. Say you count men (unpromoted checkers) as 100 points and kings as 130 points, and the evaluation is the sum of your scores minus the sum of your opponent’s scores. That’s a reasonable, if relatively un-knowledgeable, evaluation function, and it would “work”. The problem is this: it doesn’t differentiate very finely. Many, many positions on the search horizon will have identical scores. This significantly reduces the efficiency of the search, because the “backed up values” can’t do their job efficiently: there are many subtrees with nodes on the search frontier that look just as good as the nodes you’ve already looked at. Hence, you end up searching and ultimately evaluating many more nodes than you would if you took more information into account.

Ultimately then, “knowledge is power”. Improving the evaluation function to include more information has an effect on overall search efficiency. The more knowledge it has in being able to differentiate between the merits of two positions, the fewer overall nodes it will have to evaluate in search (the fewer subtrees it will have to resolve all the way to the leaves).

My evaluation function doesn’t understand anything about runaway checkers or trapped kings, and therefore it has to use deeper searches to resolve situations that include these features. When I’ve played against Cake, I’ve noticed that not only does Cake search deeper faster, but Cake seems to recognize that it’s got me in trouble in searches of much shallower depth (sometimes as many as five moves earlier).

So, I’ve got to help Milhouse learn more about checkers. I can do this in two ways: first, by learning more about checkers myself and putting some of that knowledge into Milhouse; second, by learning more about machine learning, and then allowing Milhouse to learn more itself, either through self-play or training.

I suspect that machine learning might be more productive, but I’m not entirely convinced. I’ll have to give it a bit more thought.

Addendum: Okay, that was highly abstract, let’s work through an example. I’ve been trying to absorb some knowledge from Checkers legend Derek Oldbury, whose Move Over is available online. Here’s Diagram 17, a fairly simple position:

From Derek Oldbury, Diagram 17. White to move, and is lost...

If it were Black’s turn to move, it would be a draw (both men promote, and a boring 2×2 King draw ensues), but if White is to move, clearly he can’t move his king (backing away pins him against the wall, advancing is suicide), so he must move his man. Putting him against the wall is hopeless: the black man takes the opposing position and waits to jump him. Therefore, the line goes 24-19 8-11 19-15 18-22 15×8 22×13. There is an exchange of material, but while white will king his man, he’s still doomed, as he can’t escape to the safety of the double corner. Here’s how the search looks using milhouse without its endgame database:

milhouse: restore diagram.17
board restore from diagram.17
	| - - - -|
	|- - - - |
	| w - - -|
	|- - R W |
	| - - - -|
	|- - - - |
	| r - - -|
	|- - - - |
... White to move
... FEN: W:WK17,24:B8,K18

milhouse: id
      +1 : [-11] 0.00s
         : 24-19
      +3 : [-15] 0.00s
         : 24-19  8-12 17-13
      +5 : [-12] 0.00s
         : 24-19  8-11 19-15 11-16 15-10
      +7 : [-19] 0.00s
         : 24-19 18-23 19-16 23-18 16-12  8-11 12-8 
      +9 : [-16] 0.01s
         : 24-19  8-12 17-21 18-22 19-15 12-16 15-10 16-19 10-6 
     +11 : [-12] 0.02s
         : 24-19  8-11 19-15 18-22 15x8  22x13  8-3  13-9   3-8 
         : 9-14  8-11
     +13 : [-14] 0.03s
         : 24-19  8-11 19-15 18-22 15x8  22x13  8-3  13-9   3-8 
         : 9-14  8-11 14-18 11-16
     +15 : [-17] 0.10s
         : 24-19  8-11 19-15 18-22 15x8  22x13  8-3  13-9   3-8 
         : 9-14  8-11 14-18 11-16 18-15 16-12
... researching after failing low (-42)
     +17 : [-9983] 0.30s
         : 24-19  8-11 19-15 18-22 15x8  22x13  8-3  13-9   3-8 
         : 9-6   8-11  6-10 11-16 10-15 16-12 15-11 12-16
final score = -9983
0.327 seconds elapsed.
... 231735 nodes searched, 65777 nodes evaluated
... 30493 alpha nodes, 5483 pv nodes, 65965 beta cutoffs
... avg branching factor 1.12
... 11096 nodes ordered
... 19060 pv hits, 16916 pv misses (52.98%)
transposition tables
... 166190 gets (85501 tier0 / 1072 tier1 / 79617 failed)
... 101941 puts (85701 tier0 / 16240 tier1) (30493 alpha, 5483, 65965 beta)
... 0 chinook database accesses.

The output bears some explanation. It uses iterative deepening to search, adding two ply to each search as it deepens. So it starts out with a one ply depth, then three, then five, and so on, up until it discovers a winning line with a 17 ply search (it’s actually a 19 ply winning line; quiescence search makes it resolve the final jump that wins the game). You’ll notice that at each search depth it believes 24-19 to be the best move (it is), but that right up until the realization that the game is lost (a 15 ply search), it believed the game to be essentially even. Had I encountered the starting position at the leaf node of a search, it might have believed it could salvage a draw from this position, and selected it over a position which actually provided a draw.

Now, let’s look at the same search, but using the endgame database.

board restore from diagram.17
	| - - - -|
	|- - - - |
	| w - - -|
	|- - R W |
	| - - - -|
	|- - - - |
	| r - - -|
	|- - - - |
... White to move
... FEN: W:WK17,24:B8,K18

(DB W LOSES) milhouse: depth 99
search depth is 99
(DB W LOSES) milhouse: id
      +1 : [-7614] 0.00s
         : 17-13
      +3 : [-7614] 0.00s
         : 17-13 18-14 24-19
... researching after failing low (-7712)
      +5 : [-7712] 0.01s
         : 17-13 18-14 13-9  14x5  24-19
... researching after failing low (-7808)
      +7 : [-7808] 0.01s
         : 24-19  8-11 19-15 18-22 17x26 11x18 26-30
      +9 : [-7810] 0.01s
         : 24-19  8-11 19-15 18-22 15x8  22x13  8-3  13-9   3-7 
     +11 : [-7812] 0.02s
         : 24-19  8-11 19-15 18-22 15x8  22x13  8-3  13-9   3-7 
         : 9-14  7-2 
     +13 : [-7812] 0.02s
         : 24-19  8-11 19-15 18-22 15x8  22x13  8-3  13-9   3-7 
         : 9-14  7-3  14-10  3-8 
     +15 : [-7812] 0.03s
         : 24-19  8-11 19-15 18-22 15x8  22x13  8-3  13-9   3-7 
         : 9-14  7-3  14-17  3-7  17-14  7-3 
... researching after failing low (-9983)

     +17 : [-9983] 0.04s
         : 24-19  8-11 19-15 18-22 15x8  22x13  8-3  13-9   3-7 
         : 9-14  7-3  14-10  3-8  10-15  8-12 15-11 12-16
final score = -9983
0.075 seconds elapsed.
... 27737 nodes searched, 4147 nodes evaluated
... 4573 alpha nodes, 124 pv nodes, 9520 beta cutoffs
... avg branching factor 1.09
... 55 nodes ordered
... 3388 pv hits, 1309 pv misses (72.13%)
transposition tables
... 23599 gets (9081 tier0 / 15 tier1 / 14503 failed)
... 14217 puts (14125 tier0 / 92 tier1) (4573 alpha, 124, 9520 beta)
... 3852 chinook database accesses.
(DB W LOSES) milhouse: 

In spite of the fact that it loaded 3852 positions from the database cache, it actually ran 7x faster. You can see that it searched and evaluated many fewer nodes. The highly negative score even at the initial search depth of one means that it knows immediately that this position is lost, so it effectively gives you the benefit of an additional 17 ply of search, should this position show up as a leaf in the search tree. What is needed is an evaluation function which more closely predicts the value of deeper searches, without the expense.


Here’s an example position that causes difficulty without the endgame database:

What should the score be, with red to move?

Without the endgame database, Milhouse scores this position as +6 for red, but of course it’s actually won (it takes about 7 more ply of search depth to discover this). If you look at how it scores the possible moves, you get:

analyzing board position...
	    -6 :  6-1  11-15 ...
	    -4 :  6-2  11-15 ...
	    +5 :  6-9  11-15 ...
	   +12 :  6-10 11-8  ...

You’ll see it selects the right move, but the reality is that the first three moves are all draws, while the last move allows the win.

The Three Projections of Doctor Futamura

No, this isn’t the sequel to the movie “The Castle of Fu Manchu”. It’s a very interesting bit of computer science that deals with the construction of interpreters, compilers, and specializers. I’m dashing out the door with the family this morning, but check it out, courtesy of the “A Neighborhood of Infinity” blog. Unlike most descriptions, this one is fairly easy to understand, although it gives very few hints on how to actually construct the various bits needed. But hey, that’s what a computer science degree is for.

A Neighborhood of Infinity: The Three Projections of Doctor Futamura

Addendum: I was dashing out the door when I read this, but I should expand. Neighborhood of Infinity is the blog of Dan Piponi, a very, very, very clever guy. I don’t consider myself a slouch, but I doubt I’ll live long enough to understand all the stuff he’s written about on his webpage and blog. Great, great stuff. He’s one of the primary reasons that I keep trying to move Haskell to the top of my list of things to learn, as his writing has given me glimpses into a type of programming beyond what I can normally achieve. Inspiring stuff.

Fast Shutter with Canon SD1100 + CHDK

I have a little Canon SD1100 that I picked up a while ago. One of its cool features is its ability to run the alternative CHDK firmware, which includes many interesting things, such as scripting, improved time lapse photography, and the ability to use really fast shutter speeds. I was dinking around with it for a few minutes yesterday, and got this picture, taken with a 1/16000th second shutter speed. It’s not great as a photograph, but it hints at what is possible.

One of my first attempts at macro/high speed photos using a Canon SD1100 and the CHDK alternative firmware. Shutter speed was 1/16000th of a second.

Addendum: Here is some more information, and more impressive photographs.

Addendum2: Here’s another try, just trying to catch water drops hitting the bottom of the sink.

Water drops from the faucet hitting the bottom of the sink.

Happy Birthday Sir Arthur Conan Doyle!

Last year, I noticed that Google had changed their search page to a Sherlock Holmes inspired theme in honor of Sir Arthur Conan Doyle’s birthday. This year, I see that they have chosen to honor Mary Cassatt instead. Well, fine, happy birthday to all, even to Naomi Campbell. But I’ll be listening to some of my favorite Sherlock Holmes stories on my iPod (if you don’t have ’em, try clicking on the first link above, and it will tell you how to get ’em).

Flaws in Milhouse’s Play

While debugging some more challenging positions from The Checker Maven, I encountered the following position which seemed to be giving Milhouse fits.

White to move, and should be doomed, but isn't against Milhouse.

In debugging the preferred line, it seems to be stuck in a short repeating cycle that it views as best. In reality, the repeated positions should be scored as draws, which should be enough for Milhouse to reject the line, since the database knows that the line is a win. But my program does nothing to detect repeated positions (I briefly flirted with it, but didn’t like the implementation, so I discarded it). I’ll have to figure this out to solve positions like this.

Gould Puzzle #516

I started working through my analysis of Gould Puzzle #516, which my previous scan using the Cake 8 piece database revealed to be a loss for white, instead of the draw that Gould listed. Here’s the position:

Gould Position #516

The database reports this position as lost, but Gould thinks it is a draw. Let’s see where the published play goes wrong. Gould thinks that the play should proceed 6-2, 26-23, 19-16, 12-19, 2-7, 11-15, and 1-6, leading to the following:

Gould suggests that red should play 23-27, but that move tosses away the win. 14-18 enables you to retain the extra man.

Gould listed play as proceeding with 23-27 (which is a drawing move), 6-10, 14-18, 20-16, 18-22, and 10-14, leading to a position where each side has three kings. But Red can maintain his one piece lead by playing 14-18 instead. You can then walk a narrow line, a pretty nice little ballet that advances all the red men, leading to a 4 on 3 endgame which is tricky and lengthy, but very winnable.

Addendum: There is a similar flaw in the other variation presented.

Gutenberg Gems: Philosophical Transactions Vol. I.

I was scanning through the recent additions to Project Gutenberg, and found that Volume I of the Philosophical Transactions of the Royal Society, dating back to 1666, had been converted and uploaded. It’s got a lot of interesting stuff in it (as well as some uninteresting), but the thing which caught my eye was this description of some observations of Jupiter made by Hook using a sixty foot(!) telescope. In 1666 the achromatic telescope had not been invented, and telescopes had very small apertures and very long focal lengths to avoid chromatic aberration. Yet using these telescopes, Hook wrote very clearly about shadow transits crossing the surface of Jupiter. Worth checking out.

Philosophical Transactions Vol. I.

Some Observations lately made at London concerning the Planet Jupiter.

These, as they were made, so they were imparted, by Mr. Hook, as follows:
Sch. III.

A. 1666, June 26. between 3. and 4. of the Clock in the morning, I observed the Body of Jupiter through a 60 foot-glass, and found the apparent Diameter of it through the Tube, to be somewhat more than 2. degrees, that is, about four {246}times as big, as the Diameter of the Moon appears to the naked Eye. I saw the Limb pretty round, and very well defin’d without radiation. The parts of the Phasis of it had various degrees of Light. About a and f, the North and South poles of it (in the Fig Q.) ’twas somewhat darker, and by degrees it grew brighter towards b. and e, two Belts or Zones; the one of which (b) was a small dark Belt crossing the Body Southward; Adjoyning to which was a smal Line of a somewhat lighter part; and below that again, Southwards, was the great black Belt c. Between that, and e, the other smaller black Belt, was a pretty large and bright Zone; but the middle d, was somewhat darker than the edges. I perceiv’d about 3h. 15m. near the middle of this, a very dark round Spot, like that represented at g, which was not to be perceiv’d about half an hour before: And I observed it, in about 10. minutes time to be gotten almost to d, keeping equal distance from the Satelles h, which moved also Westwardly, and was joyn’d to the Disk at i, at 3h. 25m. After which, the Air growing very hazy, and (as appeared by the Baroscope) very light also (in weight) I could not observe it: So that it was sufficiently evident, that this black Spot was nothing else, save the shadow of the Satelles h, Eclipsing a part of the Face of Jupiter. About two hours before, I had observed a large darker spot in the bigger Belt about k, which in about an hour or little more (for I did not exactly observe the time, nor draw the Figure of it) moving Westwards, disappear’d. About a week before, I discover’d also, together with a Spot in the Belt c, another Spot in the Belt e, which kept the same way and velocity with that of the Belt c. 
The other three Satellites in the time of this Eclipse, made by the Satelles, were Westwards of the Body of Jupiter; appearing as bright through the Tube, as the Body of Jupiter did to the naked Eye, and I was able to see them longer through the Tube, after the day-light came on, than I was able to see the Body of Jupiter with my naked eye.

Some 28? mistakes in Gould’s The Game of Draughts

Earlier, I blogged about the historic book available from Google Books called Gould’s The Game of Draughts. While surveying a few problems, I noticed that my play was significantly different, and arrived at different outcomes for a couple of tested positions. So I decided to use the newly added 8 piece Cake endgame database (thanks, Martin, for releasing the code and the databases) to see if there were any mistakes I could find. Mind you: these are mistakes which the database points out immediately; it doesn’t actually point out mistakes in play that require search. I found 47 errors, which, for fun, I’ll list below.

I think I am going to (for fun) systematically work through these, and document the improved lines.

Addendum: I seem to have a bug in my code. The list wasn’t correct. I’ll replace it when I verify the results.

Addendum2: Here’s my list of 28 mistakes. I think it’s right, but it might not be, I’m still confused by some of the database issues.

916 : Gould’s Problems #30 is actually a loss for white
986 : Gould’s Problems #97 is actually a draw
1019 : Gould’s Problems #130 is actually a loss for red
1023 : Gould’s Problems #134 is actually a loss for red
1025 : Gould’s Problems #136 is actually a draw
1033 : Gould’s Problems #144 is actually a draw
1086 : Gould’s Problems #197 is actually a loss for red
1094 : Gould’s Problems #205 is actually a draw
1107 : Gould’s Problems #218 is actually a draw
1108 : Gould’s Problems #219 is actually a draw
1171 : Gould’s Problems #281 is actually a draw
1404 : Gould’s Problems #516 is actually a loss for white
1658 : Gould’s Problems #767 is actually a loss for red
1660 : Gould’s Problems #769 is actually a draw
1668 : Gould’s Problems #777 is actually a win for red
1677 : Gould’s Problems #786 is actually a loss for white
1679 : Gould’s Problems #788 is actually a draw
1694 : Gould’s Problems #803 is actually a loss for red
1770 : Gould’s Problems #879 is actually a draw
1774 : Gould’s Problems #883 is actually a draw
1775 : Gould’s Problems #884 is actually a loss for white
1817 : Gould’s Problems #924 is actually a loss for red
1821 : Gould’s Problems #928 is actually a win for red
1902 : Gould’s Problems #1009 is actually a loss for white
1923 : Gould’s Problems #1030 is actually a win for red
1930 : Gould’s Problems #1037 is actually a draw
1958 : Gould’s Problems #1065 is actually a loss for white
1972 : Gould’s Problems #1079 is actually a loss for white

Addendum3: Check out this position from the list above with nine pieces:

The database code sees this as a draw, in spite of the fact that it only covers positions with up to eight pieces, because the capture resolution code removes one capture from the game before it does any database lookups.

From Gould's The Art of Draughts, problem #281

Learning something about printf, of all things…

I’ve been C programming for… (quick arithmetic) roughly 25 years now, and yet there are still things to learn. For instance, I decided to move my code for Milhouse back from my AMD64 Linux box to my MacBook for a little “mobile hacking” over the next week. I quickly found that unlike gcc on the Linux box, gcc on this Mac still thought that “long” variables were 32 bits. Various counters in Milhouse are 64 bit values, as are the hash values that are used in the transposition table, and I quickly found that all the places where I previously used “%ld” as a format string had to be changed to “%lld”. Grumble!

You see, here’s the annoying thing about C: you know that shorts can hold char values, and ints can hold shorts, and longs can hold ints, but you actually don’t know how many bits any of these have without peeking using sizeof. Luckily, the C99 standard requires an include file stdint.h which has typedefs for types of various exact sizes, so if you really want a 32 bit unsigned int, you can use the type uint32_t and be reasonably sure that it will work. Such was the state of my knowledge a couple of days ago.

But here’s the thing: I didn’t know any way to generate the right format string for a particular size of data value. On my AMD box, %ld is used for 64 bit values. On my mac, I need to use %lld. Grumble!

But apparently this was all thought of by the C99 standards committee. They created an include file called inttypes.h which contains defines that tell you what format is needed. For example, PRIu64 is the code for a 64 bit unsigned integer value. On my Mac, it expands to "ll" "u", which the C preprocessor is nice enough to cat together with the surrounding string literals. Therefore, to print such a value, you need a line like:

printf("%" PRIu64 "\n", sixtyfourbitvalue);

Sure, it’s ugly. You’d think they would at least include the % in the macro. But it does work. I’m tidying up all my code to play by these nice rules.

Gould’s The Game of Draughts is on Google Books, and archive.org

I was goofing around with my program, and had loaded a problem that was supposedly a win for Red, and set it searching. After searching to thirty or so ply, Milhouse was still convinced the position was drawn. I then tried examining it in Cake to 10 ply deeper, and it still looked like a draw. I wondered what the analysis of the position was, so I tried to see if I could find a copy of Gould’s book online. I found it first on archive.org, but the quality of the scan is pretty uneven: some of the pages are misaligned and cut off. But luckily, I found that Google Books had digitized it. For some odd reason, downloading the PDF seems to result in a file which has all the diagrams stripped out, but if you page through it using their online viewer, it works just fine. The PDF might include some elements which require real Adobe Acrobat to render properly, but the overall quality is quite good. Here’s the particular puzzle I was looking at:

The game of draughts. Problems, critical positions, and games By Joseph Gould

Here’s a link to the page that contains the analysis. I’ll try to present my “refutation” after I work through it a bit more.