Trying to understand the drama around WordPress…

December 23, 2024 | My Projects | By: Mark VandeWettering

I’ve used the open source version of WordPress for some twenty years. In general, I’ve been pretty happy, although not without some misgivings, mostly technological but increasingly ideological as well. Over the last few years, the conflict between open source software and commercial entities has become increasingly problematic. For the modest purposes of my blog, I’m merely considering finding a less problematic bit of software to use, but I found this video on YouTube that attempts to explain the controversy. In 2025, I think moving to a less popular, less problematic platform will be something that I try to do. What do you all think?

Another chapter in the “I’m dimwitted” theme from Advent of Code, 2024…

December 19, 2024 | My Projects | By: Mark VandeWettering

Warning, spoilers ahead for those who are still interested in doing the problems themselves.

Part 1 of Day 19 was pretty simple, really. You could go ahead and read the specification yourself, but basically you are given a relatively large number of text patterns which consist of jumbles of a relatively small number of characters (examples include “gwugr” and “rgw”), and you need to find which of a set of longer sequences (like “guuggwbugbrrwgwgrwuburuggwwguwbgrrbbguugrbgwugu”) can be constructed out of repetitions of some combination of these patterns. In the context of the problem, the long sequences are “towels” (read the problem description) and consist of any number of “stripe patterns” (hereafter referred to as patterns). Some towels will not be constructable out of the given stripe patterns. Part 1 was to simply determine which could, and which could not.

It’s the kind of problem that is not particularly well suited for humans, but which has been studied extensively for decades. The theoretical framework is that of “regular expressions”, or deterministic finite automata. When I took a compiler class back in the mid 1980s, we wrote our own lexical analyzer generator using techniques which were (from memory) well described in the classic text Compilers: Principles, Techniques and Tools by Aho, Sethi and Ullman, aka “the Dragon Book”. I still have my copy.

Anyway, regular expressions. Python (my chosen dagger, which I use to approach all the Advent of Code problems) of course has a well-developed regular expression library, which I thought might offer a clever way to solve Part 1. In less than 12 minutes, I had this code:

#!/usr/bin/env python

import re

# data was in the format of a big chunk of small patterns,
# followed by list of towels we need to construct.

data = open("input.txt").read()

# find the patterns, and build a regular expression

patterns, towels = data.split("\n\n")
patterns = patterns.split(", ")

# "any number of any of the patterns, consuming the entire string."
regex = '^(' + '|'.join(patterns) + ')+$'

print(regex)

h = re.compile(regex)

c = 0
for design in towels.rstrip().split("\n"):
    if h.match(design):
        c += 1

print(c, "designs are possible")

It’s simple, and it works. Because it leverages the “re” standard library, it might be expected to run quickly, and it does. On my cheap HP desktop I got from Costco, 18ms is enough to process all the input data and give the correct answer. I made one minor mistake, forgetting to add the caret at the beginning of the regex and the dollar sign at the end to make sure the entire string was consumed, but twelve minutes in I had completed Part 1. Not the fastest (I was the 1600th to complete this phase of the puzzle, which isn’t bad by my standards), but I thought it credible.

And here’s where I think I went wrong. In Part 2, we are asked to determine the number of possible ways that each towel could be constructed, and add them up. My code from Part 1 was pretty useless for figuring that out. Because I am a smart guy with tons of experience, because I’ve read the Dragon Book and the like as well as Russ Cox’s excellent summary of the theory of fast pattern matching, and perhaps because of the late hour (back in graduate school I made a rule not to write code after 10 pm, because my code would always seem stupid the next morning when I had coffee), I embarked upon a complex, pointless, meandering exploration in which I tried to recall how I might use the same sort of techniques that I used oh-so-many years ago to solve this efficiently. I’ll outline the basic notion, to tell you just how deep in the weeds I was. I didn’t get this to work, but I do think it is “sound” and could have been made to complete the second part.

Basically, the notion is to use a deterministic finite automaton to do the heavy lifting. (I’m doing this from memory, so please be patient if I don’t use the right or most precise terminology.) We are going to generate a state machine that does all the work of matching. To make it somewhat simpler, let’s look at the basic example from the original problem description:

r, wr, b, g, bwu, rb, gb, br

brwrr
bggr
gbbr
rrbgbr
ubwu
bwurrg
brgr
bbrgwb

When we begin, we are in “state 0”, or the start state. We transition to other states (as yet undetermined) by reading characters from the input. If we end up in an accepting state when there is no more input to be processed, then we match; otherwise we don’t, and return failure.

We can build our transition table by looking at the “first” and “follow” sets (I didn’t have the Dragon book in front of me, but here is the idea):

At the start of input we are in State 0. If we look at the first character of any of the patterns, we see that they can be ‘r’, ‘w’, ‘g’, ‘b’. If we see a ‘u’, then we have no match, and would go to a “fail state”, and return zero.

But let’s say we are processing the input “brgr”. We are in State 0, or S0, looking at the character “b”. Let’s create the state S1. S1 would be “transitioned from the start state, having seen a b”. What are the possible patterns that we could still be in? We could have matched the “b” pattern, and be back in a state where all of the patterns are again possible. Or we could be in the “bwu” pattern, having read the b. Or we could be in the “br” pattern. Let’s say that S1 means “we could have just completed the ‘b’ pattern, or we could be in the ‘bwu’ pattern, or we could be in the ‘br’ pattern”.

Now, what transitions happen after S1? Well, we create a new state for each of the possibilities. If we completed the “b” pattern, we are back in a state which looks identical to S0, so we may as well reuse it. If we read a “w”, we know we can only be in the “bwu” pattern. Let’s ignore that for now. If we read an “r”, we could have matched the “r” pattern, or we could have completed the “br” pattern. So, we create another state for that…

This is the rabbit hole I fell down. I am pretty sure it could be made to work, but it was complicated, and I wasted way too much time (see “sunk cost fallacy” for why that might happen) trying to figure out how to make it work, including how to track the number of different ways each pattern was matched.
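I never got working code for this that night, but for the curious, here is a minimal after-the-fact sketch of the state-set flavor of the idea, covering only the yes/no matching half of the problem (tracking the counts is where things really got hairy). It simulates the set of active positions within the patterns one character at a time, rather than compiling them into an explicit DFA:

def nfa_match(towel, patterns):
    # a state is (pattern, number of its characters matched so far);
    # a state with offset 0 means "ready to start that pattern"
    active = {(p, 0) for p in patterns}
    for ch in towel:
        nxt = set()
        for p, i in active:
            if p[i] == ch:
                if i + 1 == len(p):
                    # completed a pattern, so any pattern may start next
                    nxt |= {(q, 0) for q in patterns}
                else:
                    nxt.add((p, i + 1))
        active = nxt
    # we matched if the final character completed some pattern
    # (or the towel was empty to begin with)
    return any(i == 0 for p, i in active)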

It was dumb.

In hour three, I lost that train of thought, and wrote the following simple code to test my understanding of the problem and to see how it behaved on the simple test data.

def count(t):
    cnt = 0
    if t == '':
        return 1
    for p in patterns:
        if t.startswith(p):
            cnt += count(t[len(p):])
    return cnt

It burns through the simple test data in about 20ms. Of course I tried it on the real data and…

Well, it runs for a long time. In fact, it’s pretty much exponential in the length of the towels, and the large number of patterns doesn’t help either.

But suddenly I heard harps, and angels, and light shone down on me from above, and the seraphim all proclaimed how stupid I was. Ten seconds and a two line change later, I had the answer.

#!/usr/bin/env python

# okay, we can't leverage the re library this time..

data = open("input.txt").read()

patterns, towels = data.split("\n\n")

patterns = patterns.split(", ")

towels = towels.rstrip().split("\n")

from functools import cache

@cache
def count(t):
    # number of ways t can be assembled out of the patterns
    if t == '':
        return 1
    cnt = 0
    for p in patterns:
        if t.startswith(p):
            cnt += count(t[len(p):])
    return cnt

total = 0
for t in towels:
    cnt = count(t)
    print(cnt, t)
    total += cnt

print(total)

I think if I had half the knowledge I’ve accumulated over the years, I would have solved this immediately. But instead I solved it 2h46m in, netting me a ranking of just 6847. An opportunity squandered.

Had I stepped back and actually looked at the simplicity of the problem, and remembered the use of caching/memoization (which I considered earlier, but without clarity of thought) I would have seen it for the simple problem it was.
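For what it’s worth, the same counting can also be done bottom-up, as a little dynamic program over positions in the towel, with no recursion and no cache decorator. Here’s a sketch which should be equivalent in spirit to the memoized version above:

def count_dp(towel, patterns):
    # ways[i] = number of ways to build towel[:i] out of the patterns
    ways = [0] * (len(towel) + 1)
    ways[0] = 1
    for i in range(1, len(towel) + 1):
        for p in patterns:
            if i >= len(p) and towel[i - len(p):i] == p:
                ways[i] += ways[i - len(p)]
    return ways[len(towel)]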

Four decades of programming in, and I’m still learning. And re-learning. And un-learning.

Happy holidays all.

I’m dimwitted, or Day 13 of the Advent of Code challenge…

December 13, 2024 | My Projects | By: Mark VandeWettering

As part of my daily puzzling in December, I’ve been engaged in the Advent of Code 2024 challenge. This is the kind of thing that sane people only do when prepping for job interviews (which I suppose I could be doing), but I do it more for fun, in some hope that I’ll buoy up my ego a bit as well by proving that “I still got it.”

For the first few days, I didn’t even realize that there was a competitive element to this challenge, but it turns out that they keep a kind of leaderboard, and award points to the first 100 people who solve a particular two-part problem. Given that a few tens of thousands of people seem to be engaged in this activity, I suspect that my chance of scoring even a single point is vanishingly small, but for the last couple of days, I decided to give it a whirl with my best efforts right at 9:00PM when the next day’s puzzle is released.

You can read the puzzle description for day 13 on their website.

Here is where I went awry, right from the very start. I read this:

The cheapest way to win the prize is by pushing the A button 80 times and the B button 40 times. This would line up the claw along the X axis (because 80*94 + 40*22 = 8400) and along the Y axis (because 80*34 + 40*67 = 5400). Doing this would cost 80*3 tokens for the A presses and 40*1 for the B presses, a total of 280 tokens.

This sent me down a complete rabbit hole, which took me the better part of an embarrassing four hours (and a night’s sleep) to rectify. I had convinced myself that there were potentially multiple solutions to this, and therefore I needed to treat it as a Diophantine equation. And, because I’m that kind of guy, I resorted to playing around with it on that basis using Python’s sympy.solvers.diophantine module.

I eventually solved it using that library, but because it is kind of a complicated module, it took me a lot of time, and had a lot of false starts and rabbit holes.

It’s simpler, way simpler than that. And I’m positively idiotic for considering otherwise.

Each potential move of the crane game arm moves the arm to the right and up. If we call the amounts that Button A moves the arm a_x and a_y, and for Button B b_x and b_y, then we can think of each button push as a vector. If you think of these as independent vectors, you will realize that it doesn’t matter what order you press the buttons. Let’s say we do all the A presses first, and then all the B presses. If we end up at the target t_x, t_y after all that, then we win.

All the A steps occur along a line starting at the origin. We can reverse the direction of all the B steps and begin at the target location instead. These are two lines: the “A” line has slope a_y / a_x and passes through the origin, and the “B” line has slope b_y / b_x and passes through the target.

Clearly, unless they are coincident, they can only cross at a single point. There is no “optimization” because there can only be a single solution. If they intersect at a point on the integer lattice, then the game has a solution. Doh.

It’s still convenient to use sympy to avoid having to do the algebra by hand, cancel out stuff on paper and transcribe it into code, but it’s not rocket science even if you had to do it by hand.
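In fact, the whole thing boils down to solving a*a_x + b*b_x = t_x and a*a_y + b*b_y = t_y for non-negative integers a and b, which Cramer’s rule handles directly. Here’s a sketch of the no-sympy version (using the same variable names as the parsing code below):

def solve_machine(ax, ay, bx, by, tx, ty):
    # solve a*ax + b*bx = tx, a*ay + b*by = ty for integers a, b >= 0
    det = ax * by - ay * bx
    if det == 0:
        return None          # parallel (or coincident) lines; doesn't occur in my input
    a, ra = divmod(tx * by - ty * bx, det)
    b, rb = divmod(ax * ty - ay * tx, det)
    if ra or rb or a < 0 or b < 0:
        return None          # the lines don't cross on the integer lattice
    return 3 * a + b         # token cost for this machine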

Spoiler: here’s my revised code to do it.

#!/usr/bin/env python

import re

data = """Button A: X+94, Y+34
Button B: X+22, Y+67
Prize: X=8400, Y=5400

Button A: X+26, Y+66
Button B: X+67, Y+21
Prize: X=12748, Y=12176

Button A: X+17, Y+86
Button B: X+84, Y+37
Prize: X=7870, Y=6450

Button A: X+69, Y+23
Button B: X+27, Y+71
Prize: X=18641, Y=10279""".split("\n\n")

data = open("input.txt").read().rstrip().split("\n\n")

def parse_data(d):
    d = d.split("\n")
    m = re.match(r"Button A: X\+(\d+), Y\+(\d+)", d[0])
    ax, ay = m.group(1), m.group(2)
    m = re.match(r"Button B: X\+(\d+), Y\+(\d+)", d[1])
    bx, by = m.group(1), m.group(2)
    m = re.match(r"Prize: X\=(\d+), Y\=(\d+)", d[2])
    tx, ty = m.group(1), m.group(2)
    return int(ax), int(ay), int(bx), int(by), int(tx), int(ty)

coins = 0

from sympy import solve, symbols, Eq

def mysolve(d):
    global coins
    ax, ay, bx, by, tx, ty = parse_data(d)
    tx += 10000000000000
    ty += 10000000000000

    # let sympy do all the heavy lifting

    a, b = symbols("a, b", integer=True)
    eq1 = Eq(a * ax + b * bx, tx)
    eq2 = Eq(a * ay + b * by, ty)
    sol = solve((eq1, eq2))

    if sol != []:
        coins += 3 * sol[a] + sol[b]

for d in data:
    mysolve(d)

print(coins)

If nothing else, it did make me dust off my knowledge of the simple bits of sympy, but I feel like an idiot. Note to self: don’t read too much into the wording of puzzles like this, they may be designed to mislead as much as to illuminate.

Happy Holidays to all.

An hour of Meshtastic traffic on the Bay Area Mesh…

December 9, 2024 | My Projects | By: Mark VandeWettering

This will be a bit of a rambling technical ride on a particularly nerdy topic, so buckle up (or bail out now while you still can.)

I’ve been interested in Meshtastic for quite some time. It promises a decentralized mesh network, independent of any kind of internet/cell service, over which users can exchange short text messages using inexpensive radios that they access via WiFi or Bluetooth, usually from an application which runs on their cell phone (but which does not use the cell network).

The underlying technology is based upon a radio technology called LoRa, which uses spread-spectrum techniques to permit low-power, long-range radio communications. Nodes are connected in a mesh network, so that intermediate nodes can forward messages across multiple hops. It uses license-free radio spectrum in the 900 MHz band in the United States.

The hardware that can connect you to this network is pretty cheap, ranging from the low end of just under $10 to about $30 or a bit more for fully integrated systems like the Lilygo T-Deck, which includes a keyboard and means you don’t even need to access it via a cell phone app.

I’ve got a variety of different options, including the Xiao S3 above, Heltec V3 modules and a T1000-E from Seeedstudio.

I live in the SF Bay Area, where a local regional mesh is organized via bayme.sh. It’s hard to judge how many nodes are active (more on this later), but there are live maps available on the Internet which show that it probably numbers around 250 nodes or so at any given moment. This makes it seem like a pretty active group, and I’ve attended a few live presentations at the Maker Faire and at some of the hacker groups in the area.

But here enters my problem: if you examine the map on the right, you’ll see that there are a lot of nodes down in the flats surrounding Emeryville and Oakland, and even a couple in Concord. But I live in the area labeled “Pinole Valley Watershed”. One thing that I haven’t mentioned is that the frequencies used by the Meshtastic network are pretty much strictly line of sight. If there is a clear path (no trees, buildings, or mountains), then LoRa signals can easily travel for kilometers at very low power. But if you have any obstructions, then it becomes impossible to establish any kind of link. And of course that presents a problem for me.

I live in one of the many little basins that dot the Pinole Valley Watershed, rather near the northern end of the San Pablo Dam reservoir. Thus establishing a line of sight to any of the existing sites is pretty much impossible.

I had a similar problem when I became interested in another wireless network technology, the AREDN network which uses more conventional WiFi hardware on licensed ham radio bands (perhaps I’ll write up more about my experiences with that at some point). But AREDN allows you to “tunnel” traffic across a normal Internet link. This is viewed as suboptimal: clearly one of the purposes of AREDN (and Meshtastic) is to create networks which are independent of any commercial infrastructure. But the AREDN community generally views such things rather pragmatically: it’s hard to get people to invest time, money and effort into creating nodes which extend the range and availability of the network without getting them linked in, and for people like myself, that’s hard to do over RF links. Internet linking allows people in remote “islands” to participate in the network, and can give some incentives to building out the network while growing the expertise and enthusiasm of individuals.

Meshtastic (or perhaps more properly, the Bay Area group) takes a dimmer view of internet linking, even though it is possible. The Meshtastic firmware isn’t a full TCP/IP stack (as AREDN is), but it allows messages to be transmitted to an MQTT server, and nodes can even accept messages from such servers and broadcast them over RF. In general, though, the latter is discouraged, for a variety of reasons:

  1. It is viewed as “less pure”, and generally just thought of rather dimly.
  2. The Internet clearly has a much higher capacity than the underlying RF links, and so it is possible for an Internet feed to flood messages onto the RF network, making it impossible to exchange messages on the RF local nets.
  3. This can be exacerbated by having lots of poorly configured nodes. Often people configure nodes as ROUTER (gateway) nodes which would be much better configured as CLIENT (edge) nodes.
  4. There is simply a lack of really practical information to help people. While documentation exists, it’s not written in a way which really helps people understand the functioning of the network.

Whew, this is getting long.

On to my current tinkering.

While I am still working on getting some RF nodes working, I thought I might look at statistics by tracking the traffic on the bayme.sh MQTT server. This is the node which feeds the various mapping efforts and the like, and in theory should give me a good view of the traffic that is occurring across the peninsula. To do this, I wanted to write some Python code to subscribe to that server and decode/aggregate information about the traffic it hears. It was a bit more complicated than I had hoped, but Meshtastic does have a Python module which was helpful, and by extracting basic information from that, I tinkered the following together:

#!/usr/bin/env python

import sys
import re
from inspect import getmembers
import paho.mqtt.client as mqtt
import json
import base64
import textwrap
from meshtastic.protobuf import mqtt_pb2, mesh_pb2, portnums_pb2, telemetry_pb2
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
from cryptography.hazmat.backends import default_backend


app_dict = {
        portnums_pb2.ADMIN_APP : 'ADMIN_APP',
        portnums_pb2.AUDIO_APP : 'AUDIO_APP',
        portnums_pb2.DETECTION_SENSOR_APP : 'DETECTION_SENSOR_APP',
        portnums_pb2.IP_TUNNEL_APP : 'IP_TUNNEL_APP',
        portnums_pb2.MAP_REPORT_APP : 'MAP_REPORT_APP',
        portnums_pb2.NEIGHBORINFO_APP : 'NEIGHBORINFO_APP',
        portnums_pb2.NODEINFO_APP : 'NODEINFO_APP',
        portnums_pb2.PAXCOUNTER_APP : 'PAXCOUNTER_APP',
        portnums_pb2.POSITION_APP : 'POSITION_APP',
        portnums_pb2.POWERSTRESS_APP : 'POWERSTRESS_APP',
        portnums_pb2.PRIVATE_APP : 'PRIVATE_APP',
        portnums_pb2.RANGE_TEST_APP : 'RANGE_TEST_APP',
        portnums_pb2.REMOTE_HARDWARE_APP : 'REMOTE_HARDWARE_APP',
        portnums_pb2.REPLY_APP : 'REPLY_APP',
        portnums_pb2.ROUTING_APP : 'ROUTING_APP',
        portnums_pb2.SERIAL_APP : 'SERIAL_APP',
        portnums_pb2.SIMULATOR_APP : 'SIMULATOR_APP',
        portnums_pb2.STORE_FORWARD_APP : 'STORE_FORWARD_APP',
        portnums_pb2.TELEMETRY_APP : 'TELEMETRY_APP',
        portnums_pb2.TEXT_MESSAGE_APP : 'TEXT_MESSAGE_APP',
        portnums_pb2.TEXT_MESSAGE_COMPRESSED_APP : 'TEXT_MESSAGE_COMPRESSED_APP',
        portnums_pb2.TRACEROUTE_APP : 'TRACEROUTE_APP',
        portnums_pb2.UNKNOWN_APP : 'UNKNOWN_APP',
        portnums_pb2.WAYPOINT_APP : 'WAYPOINT_APP',
        portnums_pb2.ZPS_APP : 'ZPS_APP'
    }

# Replace with your actual MQTT broker address, port, and credentials
MQTT_BROKER = "mqtt.bayme.sh"
MQTT_PORT = 1883
MQTT_USERNAME = "meshdev"
MQTT_PASSWORD = "large4cats"
MQTT_TOPIC = "msh/US/bayarea/#"  # Subscribe to all Meshtastic topics

default_key = "1PG7OiApB1nwvP+rz05pAQ==" # AKA AQ==

# Replace with your encryption key (if using encryption)
ENCRYPTION_KEY = b'your_encryption_key'  # 32-byte key (e.g., generated with Fernet.generate_key())


def on_connect(client, userdata, flags, rc, properties=None):
    if rc == 0:
        print(f"CONNECTED")
        client.subscribe(MQTT_TOPIC)

def decode_encrypted(mp):
    try:
        kb = base64.b64decode(default_key.encode("ascii"))
        nonce_packet_id = getattr(mp, "id").to_bytes(8, "little")
        nonce_from_node = getattr(mp, "from").to_bytes(8, "little")
        nonce = nonce_packet_id + nonce_from_node

        cipher = Cipher(algorithms.AES(kb), modes.CTR(nonce), backend=default_backend())
        decryptor = cipher.decryptor()
        db = decryptor.update(getattr(mp, "encrypted")) + decryptor.finalize()
        data = mesh_pb2.Data()
        data.ParseFromString(db)
        mp.decoded.CopyFrom(data)
    except Exception as e:
        print(f"DECRYPT FAILURE: {e}")


port_dict = dict()

def on_message(client, userdata, msg, properties=None):
    global port_dict
    try:
        se = mqtt_pb2.ServiceEnvelope()
        se.ParseFromString(msg.payload)
        mp = se.packet
        if mp.encrypted:
            decode_encrypted(mp)

        if not mp.HasField("decoded"):
            return

        print(mp)

        port_dict[mp.decoded.portnum] = port_dict.get(mp.decoded.portnum, 0) + 1

        if mp.decoded.portnum == portnums_pb2.TEXT_MESSAGE_APP:
            print("TEXT_MESSAGE_APP")
            try:
                tp = mp.decoded.payload.decode("utf-8")
                for l in str(tp).split("\n"):
                    print(f"    {l}")
            except Exception as e:
                print(f"PROBLEM DECODING TEXT_MESSAGE_APP {e}")
        elif mp.decoded.portnum == portnums_pb2.NODEINFO_APP:
            print("NODEINFO_APP")
            info = mesh_pb2.User()
            try:
                info.ParseFromString(mp.decoded.payload)
                for l in str(info).split("\n"):
                    print(f"    {l}")
            except Exception as e:
                print(f"PROBLEM DECODING NODEINFO_APP {e}")
        elif mp.decoded.portnum == portnums_pb2.POSITION_APP:
            print("POSITION_APP")
            pos = mesh_pb2.Position()
            try:
                pos.ParseFromString(mp.decoded.payload)
                for l in str(pos).split("\n"):
                    print(f"    {l}")
            except Exception as e:
                print(f"PROBLEM DECODING POSITION_APP {e}")
        elif mp.decoded.portnum == portnums_pb2.ROUTING_APP:
            print("ROUTING_APP")
            route = mesh_pb2.Routing()
            try:
                route.ParseFromString(mp.decoded.payload)
                for l in str(route).split("\n"):
                    print(f"    {l}")
            except Exception as e:
                print(f"PROBLEM DECODING ROUTING_APP {e}")
            sys.exit(0)
        elif mp.decoded.portnum == portnums_pb2.TELEMETRY_APP:
            print("TELEMETRY_APP")
            telemetry = telemetry_pb2.Telemetry()
            try:
                telemetry.ParseFromString(mp.decoded.payload)
                for l in str(telemetry).split("\n"):
                    print(f"    {l}")
            except Exception as e:
                print(f"PROBLEM DECODING TELEMETRY_APP {e}")
        # we could do more
        # but for now...
    except json.JSONDecodeError:
        print(f"Error decoding JSON: {msg.payload}")
    except Exception as e:
        print(f"An error occurred: {e}")

    port_dict_view = [ (v, k) for k, v in port_dict.items() ]
    port_dict_view.sort(reverse=True)
    for v, k in port_dict_view:
        print(f"{app_dict[k]} ({k}): {v}")
    print()


wrapper = textwrap.TextWrapper()
wrapper.initial_indent = "    > "


client = mqtt.Client(protocol=mqtt.MQTTv5, callback_api_version=mqtt.CallbackAPIVersion.VERSION2)
client.username_pw_set(MQTT_USERNAME, MQTT_PASSWORD)
client.on_connect = on_connect
client.on_message = on_message

client.connect(MQTT_BROKER, MQTT_PORT, 60)

client.loop_forever()

It’s not perfect or elegant, but it embodies some of the very basics you’d need to receive and decode messages from the MQTT server. I let it run for an hour late on Sunday night, and it recorded 791 messages sent to the MQTT server. They were broken down into six different “applications”, with the following distribution.

NODEINFO_APP (4): 318
TELEMETRY_APP (67): 281
POSITION_APP (3): 148
NEIGHBORINFO_APP (71): 28
MAP_REPORT_APP (73): 13
STORE_FORWARD_APP (65): 3

NODEINFO_APP packets are used to transmit information about individual nodes. Of the 318 sent, 96 distinct nodes were logged. Information for a (randomly chosen) node might look like this:


NODEINFO_APP
    id: "!336a194c"
    long_name: "Capt Amesha"
    short_name: "CA"
    macaddr: "d\3503j\031L"
    hw_model: HELTEC_V3
    role: ROUTER

Scanning the log, it appears that 42 of those nodes are configured with a role of ROUTER, which seems… interesting.

Further scraping that information yields the following breakdown on types of hardware used:

     94     hw_model: RAK4631
     54     hw_model: HELTEC_V3
     42     hw_model: T_ECHO
     34     hw_model: STATION_G2
     33     hw_model: TBEAM
     12     hw_model: HELTEC_WSL_V3
     10     hw_model: HELTEC_V2_1
      8     hw_model: T_DECK
      8     hw_model: LILYGO_TBEAM_S3_CORE
      6     hw_model: HELTEC_WIRELESS_PAPER
      5     hw_model: TLORA_T3_S3
      4     hw_model: PORTDUINO
      3     hw_model: TRACKER_T1000_E
      2     hw_model: SEEED_XIAO_S3
      2     hw_model: DIY_V1
      1     hw_model: TLORA_V2_1_1P6
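The tally itself was nothing fancier than counting the hw_model lines in the captured output; a rough sketch of the sort of thing I mean (the log file name here is just whatever you redirected the script’s output into):

from collections import Counter

# tally the "hw_model:" lines out of the captured NODEINFO output
models = Counter()
for line in open("mesh.log"):
    line = line.strip()
    if line.startswith("hw_model:"):
        models[line] += 1

for model, n in models.most_common():
    print(f"{n:7d}     {model}")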

Not sure what to make of this data yet, but at least I can gather it and use it to think more about the state of the network and how it’s configured, including the total number of messages sent and the locations of nodes, with the possibility of finding misconfigured nodes.

I probably will continue to tinker with this, most basically by creating an sqlite database to log all this information and allow more general queries over time.
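A first cut at that might be nothing more than a small table of packets, something like this sketch (the schema is just a guess at what I’ll eventually want to query):

import sqlite3
import time

db = sqlite3.connect("mesh.db")
db.execute("""CREATE TABLE IF NOT EXISTS packets (
                  ts       REAL,     -- time the packet was heard
                  node     TEXT,     -- sending node id, e.g. "!336a194c"
                  portnum  INTEGER,  -- which "app" the packet belongs to
                  payload  BLOB)""")

def log_packet(mp):
    # to be called from on_message() with the decoded MeshPacket
    db.execute("INSERT INTO packets VALUES (?, ?, ?, ?)",
               (time.time(), f"!{getattr(mp, 'from'):08x}",
                mp.decoded.portnum, bytes(mp.decoded.payload)))
    db.commit()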

And I will probably work on getting a solar powered node at the top of the hill on my property, in the hope of creating a resource for my neighborhood.

All for now, hope you all are having a good December. Feel free to comment if you have any hints/suggestions.

Another 3D stamp: QR codes

November 20, 2024 | My Projects | By: Mark VandeWettering

After tinkering with making a 3D stamp yesterday, I thought that maybe I would tinker together a stamp for the QR code that sends you to my resume-ish site mvandewettering.com. I had used the qrcode library in Python to generate QR codes before, but it wasn’t clear to me how to use it to generate a vector format that would merge well with my 3D printing workflow. In the end, I decided to just generate the image using the qrcode library, and then construct a series of rectangular prisms covering the locations which were dark. I thought it would be rather simpler to just write out the individual polygons in ASCII STL format directly. STL can only directly handle triangles, so I output 2 triangles per face, resulting in 12 triangles for each rectangular prism. The main code looks pretty straightforward.
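For context, the img that the loop below walks over comes straight out of the qrcode library, roughly like this (I’m reconstructing the parameters from memory, so treat box_size and border as guesses):

import qrcode

qr = qrcode.QRCode(box_size=1, border=2)   # one pixel per module, small quiet zone
qr.add_data("https://mvandewettering.com")
qr.make(fit=True)
img = qr.make_image()                      # a PilImage wrapper; the raw PIL image is img._img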

with open("qrcode.stl", "w") as f:
    print("solid qrcode", file=f)
    for y in range(img._img.height):
        s = None
        for x in range(img._img.width):
            p = img._img.getpixel((x, y))
            if p == 0:
                if s is None:
                    s = x
            elif p == 255:
                if not s is None:
                    output_rect(s, x, y, y+1, f)
                    s = None
        if not s is None:
            # a run of dark pixels that extends to the right edge of the image
            output_rect(s, x + 1, y, y+1, f)
    output_rect(0., img._img.width, 0., img._img.height, f, minz=-1, maxz=0)
    print("endsolid qrcode", file=f)

The output_rect code is tedious, but not especially complicated.

def output_rect(minx, maxx, miny, maxy, f, minz=0, maxz=1):
    print("facet normal 0 0 1", file=f)
    print("outer loop", file=f)
    print(f"vertex {minx} {miny} {maxz}", file=f)
    print(f"vertex {maxx} {miny} {maxz}", file=f)
    print(f"vertex {maxx} {maxy} {maxz}", file=f)
    print("endloop", file=f)
    print("endfacet", file=f)

    print("facet normal 0 0 1", file=f)
    print("outer loop", file=f)
    print(f"vertex {minx} {miny} {maxz}", file=f)
    print(f"vertex {maxx} {maxy} {maxz}", file=f)
    print(f"vertex {minx} {maxy} {maxz}", file=f)
    print("endloop", file=f)
    print("endfacet", file=f)

    # ... five more similar copies of the code to handle the other five faces

It took me a few trials to get the triangle ordering consistent and the normals specified properly. It wasn’t clear to me from any documentation that I had what “proper” means, but I found that specifying vertices in counter-clockwise order as viewed from the outside seemed to work well. Early versions confused Cura substantially, even when it appeared to load the model properly. But eventually I got the following image, colored yellow, which is apparently what Cura uses to indicate the model is “good”. I made sure to flip the image left to right so that I could use it as a stamp.

I sized it to about 1.5 inches, and printed it again in PLA. I gave it a light sand with 220 sandpaper. I haven’t printed a proper handle for this so I plunked it down on an ink pad and just pressed it into place.

Yeah, it isn’t quite good enough to actually work. I think a little additional sanding might help, as well as just getting a better ink pad. I am also wondering whether giving it a small cylindrical deformation would make it easier to ink, as the pressure would be concentrated. I’ll tinker with this some more when I’ve had coffee. I suspect that benefitting from some other people’s experience in doing this would be good, so some YouTube viewing is probably in my future. I also want to try using TPU instead of PLA to see how that would work as well.

I also could try to use a qrcode mode which is larger/has higher amounts of error correction. But I kind of want to keep this reasonably small to make it convenient and easy to use.

Hope you all are having a good day!

3D printing an ink stamp, or “Welcome to Stampy Town, Population Five!”

November 19, 2024 | 3D printing | By: Mark VandeWettering

Apologies to Hermes Conrad.

Further apologies to those who won’t get this Futurama quote.

During COVID, I spent some time in my shop doing more woodworking. At the time I was trying to figure out how I could sign the work that I did, mostly for fun rather than ego (my woodworking skills remain modest at best.) I had read a number of articles online where people designed logos using 3D printing, and had them printed in metal, which I thought was pretty cool. At the time I used OpenSCAD to design a simple logo of my initials, and sent it off to China for 3D printing. Several weeks later I got it back. It was one inch in diameter, and had a very simple version of my initials.

I had originally cast it with a post on the back, with the idea that I would use a die to cut a screw thread on it, and then I would make a holder for it. I found that the sintered metal that I chose didn’t hold the thread very well, so to use it I actually just hold it in some pliers and heat it with a torch. It takes a bit of practice to get the heat just right, but it has worked pretty well. I did find that sanding the front surface helped a bit, but the sharpness is good and I was overall pleased with it. I have thought about submitting a revised version with the screw thread modeled directly into the metal, but haven’t gotten back to doing that, and it is a pretty low priority.

But in the meantime, I seem to have lost the original OpenSCAD file that I used to generate the model. Yesterday I thought I should try to recreate it. Rather than using OpenSCAD again, I thought it would be better to use OnShape, which has become my go-to for designing objects for 3D printing. I began by inking the original metal stamp using an ink pad, stamping it onto some paper, and scanning it. I then loaded that as a reference into OnShape and used it to take some measurements and reconstruct the outlines. It took me about twenty minutes to come up with the model for it.

The new model is pretty close to the original, but includes a number of chamfers and fillets that were not part of the original. I went ahead and 3D printed the disk in the PLA I had loaded, and was pretty happy with the quality, even though I used 0.2mm layer height (the “fast” presets in Cura).

I had originally thought that I should print this in TPU, which would have been a more flexible filament, and therefore which I thought would more closely match the hard rubber that is commonly used for stamps. A little bit of reading suggested that PLA might be a better choice, as it is easier to get detail without stringing.

I needed a handle to help hold this disk. Since I had OnShape fired up, I went ahead and made a quick little handle that would hold the disk centered. This was the first time I had used sketched curve profiles and the like, and rather than making it as a full surface of revolution, I chose to flatten both sides. This has two purposes: it makes it easier to print without any supports, and it allows you to align and orient the stamp better to make sure it’s at the angle you desire.

I printed this in the same PLA. I then took some cheap super glue and put the disk in place, trying to orient it the best I could. In version 2, I will probably print some registration bumps to align it even better, but this was a first test.

Initial attempts were pretty spotty, but I got some #220 sandpaper and gave it a bit of work to make it more level. I do wonder if maybe hitting it with some filler primer and then sanding it down might be a good approach, but a few minutes of work, and it improved to the point where I consider the effort a success.

It’s worth experimenting with, and (if you have a 3D printer) costs very nearly nothing. I’ve been thinking of experimenting more with using 3D printing to do other kinds of prints, but this is a good start.

Hope you all are having a good day!

Felines win in the battle between astrophotography and cats…

November 13, 2024 | My Projects | By: Mark VandeWettering

I posted this picture of my little friend Patchouli, who decided to settle into the case for that smart telescope, to the Facebook Seestar group.

She got way more hearts and comments than any of the astrophotographs that I posted to the same group over the last few months. I guess I should not be surprised: she’s positively adorable with her little pink nose and pink paws. Perhaps I should abandon my nerdy and scientific endeavors if my goal is to “drive engagement” or “build a community.”

Okay, that’s not going to happen, but it does perhaps lend some perspective to the world. And frankly, it gives me a little bit of hope.

Strategies for coping with problems…

November 9, 2024 | My Projects | By: Mark VandeWettering

I’ve found that there are three basic strategies that have helped me in the past. They are probably not comprehensive, or even the best, but they are pretty simple to remember, and cover more situations than you might think.

I categorize them as Plan, Act, and Ignore.

Perhaps the most productive and generally the best is to plan. You see that in the future there is some issue that you will be facing, and you develop a plan so that when the anticipated event happens, you already know what you will do and what the likely outcome will be. This is good because it can help stifle the anxiety of uncertainty. If the hurricane strikes, you know you have food, a backup generator, and an evacuation plan. You put away savings for a rainy day. You perform maintenance on your house and your car. Then, when these things happen, you don’t need to develop a plan at the last minute. You know what you will do, and what the likely outcome will be. You’ve worked to minimize stress and danger to yourself.

The problem with planning as a strategy is that it is predicated on you actually understanding the problem and its likelihood. The world is very complicated, and it is difficult to balance all the possibilities and develop plans to cover every contingency. You prepared for a hurricane, but a lot of the damage from the recent hurricane Milton was caused by the tornadoes that struck ahead of the actual hurricane. If you plan for something, you may be ignoring some other risk that turns out to be somewhat surprising. Planning is most effective for the predictable risks in life. Spending a lot of time planning for low-probability or unforeseen risks or events can be pointless and exhausting.

So, the second strategy: you act. In this case, something unforeseen or even unforeseeable has happened, and you need to do something. In cases like this, you may not have a plan, or at least not a complete plan. You need to rely on your resources (intellectual, financial and emotional) to find a course which minimizes damage to yourself and those who are important to you in light of the new situation and information. Reusing the previous example: perhaps the hurricane’s course shifts in the last 24 hours, and your planned place of evacuation is no longer safe. If you are lucky, you may have foreseen this possibility and know several alternatives, or can quickly search for alternatives. When you act, you quickly draw on the best information you have, and chart the best course that you can see as quickly as you can.

I originally thought of this strategy as reaction. But in trying to clarify my own thinking about this, I found that the term implied the kind of thoughtlessness suggested by the phrase “knee-jerk reaction”. One of my personal mantras is “act, don’t react”. Reaction is the lizard brain’s attempt to cope with problems, with little analysis or conscious thought. Reaction is the “fight or flight” response that we have, which admittedly is often an effective survival strategy. I don’t mean to denigrate it; in fact, it often can save your life. But if you have a moment, it is usually good to ask “am I just reacting to new information, without actually understanding it or considering it, and do I have time (even limited time) to consider a different course of action?” If so, then action may be the strategy that makes sense.

Lastly, you could simply choose to ignore the problem. This sounds bad. Ignoring problems means they don’t get solved, and unsolved problems can pile up and cause you greater problems in the future. This doesn’t seem like a strategy at all.

But the thing is that the human mind (and certainly my own mind) has a near-infinite capacity for worry. Worry and stress can have significant negative effects on your body and well-being. All this planning and action takes significant energy and resources, and can keep you from relaxing or enjoying what’s going on. Ignoring problems can be a valuable skill, particularly when the problem is not amenable to either planning or action.

As a for instance, years ago my mother was in failing health. I knew that she was going to die within months. I had long term plans for how I was going to cope with the financial practicalities of her care. I also took regular actions to call her daily and to fly up to visit her at regular intervals. But I had to cope with the fact that she’d have good days, and long days. There were times when I was called to act when she had a particularly bad turn.

But no amount of planning or action was going to prevent or even delay the course of her illness. I was concerned about her every day. If I had allowed myself, I could have been concerned every minute of every day. So I adopted a different strategy: I chose to ignore the problem.

That sounds bad, so let me explain my process. My internal monologue basically was “Mom is ill, and will probably die at some point. Is there any plan I could engage in that will stop or delay this negative outcome? (No.) Is there any action that I could take now that will help? (No.) Then I am going to choose to ignore this problem. I am going to create a mental closet that holds this problem. I am going to take the worry and angst that I feel, and the ineffectiveness that I feel in not being able to fix things, and I am going to put that problem in the closet. Most importantly, I am going to reopen this mental closet at some specific time in the future (tomorrow, Monday, a week from now) and open that mental door, and relook at this problem, and decide whether there is some additional plan or action that could be helpful. Perhaps I will choose to put the problem back in the closet again. Perhaps some new information or change in the situation will make a new plan or action beneficial, and will provide me with a new course of action.”

I don’t believe that you should ignore problems indefinitely. My internal conscience and basic decency would not have allowed me to abandon my mom and simply stop thinking about her. But the ability to carve out some space in my life when I was not pointlessly and unproductively worrying about her was essential to my mental health, if not my survival. One needs to have space to experience all emotions, not just the fight or flight responses that we’ve evolved.

At first, this “mental closet” seems really difficult. But I found that practice, along with the assurance that I wasn’t just ignoring problems, that I was postponing worry to gather space and strength and would address them more productively if my situation changed, made it easier as time went on. In fact, the ignore strategy is just a variation of the plan strategy.

A great number of my friends and family are experiencing angst as the result of the uncertain political climate, as well as numerous other more personal changes in their lives. I get it. I’m right there with you. I decided to write this post mostly to clarify my own thinking. I suspect I’ll be dusting off these strategies with greater frequency in the coming weeks, months, and years. But while I may have been motivated by my concern about the political situation here in the United States, they apply to many sorts of problems. I do not claim them to be particularly universal, but I offer them in the hopes that you might find some part of them useful or encouraging in your own lives.

And remember: when the ship is sinking, put on your life jacket first. If you don’t care for yourself, you won’t be able to care for others. Self care is part of caring for others.

Best wishes, and be kind to yourself and others.

Practical uses for 3D printers…

November 9, 2024 | 3D printing | By: Mark VandeWettering

I’ve been an on-and-off enthusiast for 3D printing for quite some time, but in the early days, it wasn’t what I would call “practical”. Printers used to be fairly unreliable. In particular, my aging Creality CR-10 had difficulties with bed leveling, and while I kept modifying it to add sensors like the BL-Touch to automate that process, at some point I simply got fed up with it and let it sit. But the technology kept improving, and there are some new consumer level printers made by companies like Bambu Labs which employ lidar and other fancy bits of tech to print faster and more reliably than ever before.

As yet, I don’t have one of those. But rather than continue to tweak my CR-10, I decided last year to buy an Elegoo Neptune 3 Pro, which I got on a Black Friday sale for around $250. And it was much better than my Creality CR-10. Its bed leveling just works, and I’ve done dozens of prints with the only real failures being stupidities of my own. And the quality is quite good.

And it’s reliable enough that I can design and print parts without taking an entire day. For instance, this week I had an issue where I wanted to fix a window that the previous owner had literally screwed shut (presumably as a security measure). I needed to open that window for maintenance, but he had driven these self-tapping screws in very close to the edge, and I couldn’t get a wrench or even a nut driver in very easily to take them out. I didn’t want to replace them in the same way, so instead I designed and printed these little clips using OnShape, a free, web-based parametric design package that I recommend to anyone. It may not be quite as capable as Fusion360, but the ability to design parts using any web browser (I use Chrome on both Windows and Linux machines) and have models always available is pretty handy.

Anyway, I designed this part with chamfers and better clearances after taking a few measurements, and printed them in some ASA filament I had lying around, which should be more UV resistant than other filaments. These look tidier and were less annoying than just screwing through the window frame. And should I ever need to remove them, there is enough clearance to just use a nut driver to back them out. I printed four of them in just 25 minutes, using almost no filament. Problem solved.

Another thing I like to use 3D printing for is to make things like lens caps. I have this old pair of German WWII aircraft spotting binoculars.

They are beasts, but very comfortable to use for astronomy, with large eye relief, adjustable inter-ocular distance, and a sturdy tripod. I often use them to view lunar eclipses and the like. But what they lack is dust caps. So, the other day I dusted off my OnShape skills, took some measurements, and quickly generated this lens cap model:

I had some white TPU filament, which is quite flexible and which I have used to make a dustcap previously for my 6″ f/4 Newtonian that I made decades ago. The model is a very simple capped cylinder, with a chamfered rim around the edge to add some thickness for enhanced sturdiness. The chamfer also makes them slide on very easily, even though the fit is fairly tight. That means that they don’t fall off very easily either. The TPU is flexible, and even though the thickness is only 1mm, they are incredibly sturdy: I think I would actually have to work fairly hard to tear them apart. I should note that TPU was hard to print on my old CR10, but I’ve had literally no failed prints using TPU, despite it being a very soft and flexible filament.

I mentioned this model to a friend of mine who said that he had long ago lost the caps for his pair of Nikon binoculars. I told him to send me the dimensions and I’d print him some. He lives up in Oregon, so I mailed them up to him, and he mailed me back this photo:

They apparently work perfectly. I didn’t have TPU in black, which would have looked nicer, but hey, they work and will be good at protecting his optics.

3D printing can be really valuable in creating custom items, even if (or maybe especially if) they are low value objects. Think about it this way: how much would you pay to get a new set of lens caps for a set of binoculars? Even $5 seems excessive, but you might do it if you knew they would fit your very special binoculars. But such a thing may not even exist/be available if your particular set is old or rare. Being able to create a version which actually fits for just pennies seems really cool to me.

None of this is very exciting, but I do feel oddly happy having done this.

I still need to take some measurements to do caps for the German binocular’s eyepieces, which have a tapered shape which is a bit more complex. I’ll probably get to that today.

Hope you all are having a good weekend.

Astrophotography for those without huge budgets of time or money…

November 8, 2024 | Astronomy, My Projects, Telescopes | By: Mark VandeWettering

So, in an effort to get back to blogging about things that may not matter in the grand scheme of things, but which provide some measure of joy to me, I present something that I have not blogged about before, but which might be of interest to others: my astrophotography tinkering using a nifty gadget, the Seestar S50. This little gadget is a highly portable, fully contained robotic astronomy camera that you access and control via a smart phone app (I am using Android these days, on a Google Pixel 6). It has a 50mm aperture (which doesn’t sound like a lot to someone like myself who has built telescopes with 12 inch apertures, and helped refurbish even larger ones), but it’s a very neatly implemented system, and can be used to image not only deep sky objects, but also the moon, the sun (with an included solar filter) and even terrestrial objects such as birds.

The Seestar is a sophisticated (but simple to use) device which works by automatically aiming the telescope at the object of interest, taking multiple short images (10s by default), and then aligning and adding them up. In 10s, the number of photons collected may be quite small, but over time they add up, and you get clearer and clearer images of dim deep sky objects. It’s really quite remarkable how well this works. And yet it’s quite easy to use: I took more astrophotos in the first day I had it than I had taken in my entire life (admittedly, that’s not a lot, but it’s not zero either), and with better results.

I have posted a few of these pictures on my Mastodon or Facebook feeds, but haven’t posted any here before. I’ll probably pick out some of the backlog shortly and rehost them here, but instead I’ll show you one of the new features that was enabled in a recent app update: Mosaic Mode.

People don’t realize that some objects in the night sky are actually quite large, because they are dim, often below what you can see with the naked eye from most locations. The most obvious example is the Andromeda Galaxy, whose apparent angular size is about three degrees, roughly six times the diameter of the full moon. The Seestar’s field of view is about one degree, which means that you can’t capture it in a single picture. But in a recent software update, they added “mosaic mode”, which allows you to frame an object that is too large to fit in a single image, and automatically scan over it to assemble an image with a much wider field of view. Last night I tried this for the first time from my home location, a largely suburban spot with “Bortle 5” skies. Andromeda isn’t even faintly visible to the naked eye from here. But the Seestar had no problem locating the object, and once I configured it with the framing that I liked, I set it to go.

After 88 minutes of exposing, this is the image as it appeared from my cell phone. No processing has been done by me, except to rotate it into landscape mode.

M 31

The Seestar will also store a FITS format image, along with the JPG image. The FITS image has greater bitdepth, and allows for more aggressive processing. I use the free software Siril to process astronomy images, and with a little bit of pushing and some additional cropping, I turned the above into this:

Above and to the left of the main galactic core you can see M32, and below it the small elliptical galaxy M110. Both are satellite galaxies of M31. To the left of the core of the galaxy is a faint brightening in one of the spiral arms of M31. This is a star forming region designated NGC 206.

It’s astonishing how easy this is, and how (relatively) inexpensive. The Seestar S50 has a list price of around $500, and this required no manual tracking, no staring through an eyepiece, literally just selecting the object using an app, and the gadget does all the hard work.

I’ve used it to image the sun, moon, and numerous deep sky objects. It can image planets like Jupiter and Saturn, but it actually isn’t the best for those (it lacks sufficient focal length to get good detail). But I’ve used it to take pictures of comets, minor planets and even a supernova in a distant spiral galaxy. It’s tons of fun.

Expect more about it in the future. Feel free to add questions or even make requests of things to image.

Hope you all are having a good day.

The arc of the moral universe is long, but it bends towards justice…

November 6, 2024 | My Projects | By: Mark VandeWettering

Or does it?

It’s not like gravity attracting bodies together. It’s not a force of nature. It’s something that we all have to work toward together, because the moral universe is something we can only create collectively. Prosperity does not come at the expense of others. Freedom does not come when we deny it to others.

Life need not be a zero sum game.

I turned sixty this year. The Social Security longevity tables predict that I’ll live to eighty two. I keep hoping that I’ll see some sign of that arc bending toward justice, and while I think there have been some positives, I can’t help but feel like there have been at least as many downturns.

I don’t want to be the negative guy. But I also want to be the guy who looks at a world where prejudice, racism and sexism are fading anachronisms, not tools for political success. I would like to see people receive the health care they need, and kids receive food and education. I’d like to see people marry who they love, be recognized for who they are, and where they can make decisions about their own bodies.

The great tragedy of all this is that the path that the American electorate has put all of us on will not make groceries more affordable, or ensure prosperity or health. It will do the opposite. The notion that government is the enemy will become a self-fulfilling prophecy, and will be devilishly hard to reverse.

I don’t think that I will live to see it.

I’m exhausted by hoping for it. I’m exhausted by disappointment.

I do not want to write with skepticism or depression. But that is where I am today, and where I have been for the better part of the last decade. Just a long, straight road, leading toward darkness, with no way to turn around, and no exits.

I hope tomorrow I feel better. After getting laid off from the job I’ve proudly and happily done for over three decades, I’m trying to figure out what my third act will be. I would like to think that there is something that will grant me some relief from the malaise that has been the dominant theme of this year.

Even as I write this, I can’t help but feel that it is overly self indulgent. There have been many things this year that were awesome. My sixtieth birthday aboard a cruise with my wife, my sister, sister-in-law, and best friend was great. I traveled to Mexico to see a total eclipse, a sight which was so moving it left me speechless. I’ll soon be travelling to see my son and his family for Thanksgiving. Both Carmen and I are pretty healthy (if somewhat prone to anxiety eating).

If someone has some optimism they’d like to share (or a job or project that you think I might find fulfilling) feel free to reach out to me at mvandewettering at gmail.com or via Facebook or Mastodon. And frankly, if there is something that you think I can do for you, either encouragement, knowledge or what little wisdom I possess, also feel free to reach out.

I’ve got a fair amount of free time at the moment.

Mom’s Pickled Salmon Recipe

June 5, 2024 | My Projects | By: Mark VandeWettering

When I was young, one of the things we frequently had was pickled salmon. Sadly, it was also a recipe that I never bothered to learn, and which my sister and I had thought was lost to time and forgetfulness, until she found this recipe handwritten in one of her inherited canning books.

Transcribed here:

Pack filleted salmon in plain salt, and wait about 2 weeks, salmon should be stiff. Peel the skin from salmon bottom to top. Slice in 1/4 inch strips. Rinse off excess salt @ 30 minutes. 4 big bay leaf, sliced onion @ 8 allspice, fish layered till jar is full except bay leaf, end w/onion layer. Cover w/ cider vinegar with 1 tbsp sugar per quart. Let it sit fridge as long as you can stand (about 1 day at least).

Some recollections/clarifications of my own. Obviously the initial 2 week salting should be done in the fridge. I was somewhat surprised to find the brine is really just cider vinegar with bay leaf and allspice; I always figured it was more complex. It doesn’t say how much fish to start with, or how much you can expect to use per quart. My recollection is that the amounts of onion and salmon are roughly equal. My recollection is also to use white onions, sliced lengthwise into strips, although I suspect that yellow sweet onions would also be fine. The way we used to eat this was basically as a simple sandwich: you took a single slice of white sandwich bread, fished out some slices of onion and salmon, folded the bread like a taco, and munched.

It’s probably not for everyone, but it will forever remind me of mom and grandma.

April 8, 2024 Total Solar Eclipse From Mazatlan

April 21, 2024 | My Projects | By: Mark VandeWettering

Annoying: the videos which I inserted here late last night seemed to not be working this morning. Granted, it was late, my COVID-soaked brain may not have been working at full efficiency, and I haven’t done this in a while, but… I’ll get it sorted out later today.

It’s been some time since I made any update to my blog. I keep thinking I’m going to restart, but then it gets delayed. Sigh. But some events do occur which make me think that at the very least I should write up some notes, and the April 8th total solar eclipse was one of them.

An old and dear friend of mine, Jeff, and I started planning this trip back in August of last year. Originally we had conceived of traveling to Texas, but research indicated that if we wanted the absolute best chance of seeing the eclipse, Mazatlan, in Sinaloa, Mexico, was historically going to be the better choice. It was, however, neither cheap nor convenient. We could not find a reasonably priced flight (sub-$3K) that would fly us directly to Mazatlan from anywhere reasonable, so we did a crazy itinerary which involved Jeff driving to meet up with me at my home, then flying OAK->LAX. We ended up spending the night in LAX, then flying early from LAX to Mexico City, sitting through an 8-hour layover, then flying from Mexico City to Durango, where we got in late at night and ended up getting another hotel. In the morning, we drove from Durango to Mazatlan.

We had originally reserved two rooms for four nights, but as it happened our return flight (which was our outbound trip in reverse, Durango->Mexico City->LAX->Oakland, but all in one day) was leaving at 6:00 AM, so we had to leave a night early. We ended up convincing our hotel not to charge us for the extra night, and got a separate hotel in Durango. We thought that our hotel in Mazatlan was going to have a single king-size bed per room, so we each got a room, but as it happens, our suites were a king plus a double and we could have easily just used one room. Oh well. It wasn’t cheap, but we did all the traveling outbound without significant problem. Our 3.5-hour drive from Durango to Mazatlan was via a toll road, and was both fast and efficient. The only true excitement was our spotting of a cluster of small puppies (“wild chihuahuas!”) that came across the road. They were cute, but I was busy driving and didn’t get any pictures.

Jeff and I each brought an SLR with appropriate filtration. Mine was a Canon T5i that I had purchased used, with a snazzy solar filter that clipped on magnetically, and a “Heliofind” automatic solar tracker. The idea behind the tracker was that it would automatically track the sun, and therefore free me from the problem of constantly watching the camera and adjusting it. My plan was to automate all the exposures using a program called “BackyardEOS”, because Jeff had previously used the Nikon version during the 2017 eclipse that he viewed from Madras, Oregon. I had purchased an appropriate 15 ft Mini-USB cable, and had done some previous experiments. As a backup plan, I had experimented with adjusting exposures manually and tripping the shutter with an inexpensive intervalometer. I had tried this before during the October 2023 annular eclipse that we did as a dry run/practice. (I should really get those pictures here too.)

But during our couple of days in the windup to the eclipse, I did some additional testing in our hotel room, and one thing became obvious: BackyardEOS wasn’t really designed for eclipse photography. In particular, it had no idea what time the eclipse was, or even what time it was. If I wanted to preprogram a “plan” for the eclipse, I’d have to set it up and test it manually and repeatedly. We experienced some situations where the software got ahead of the camera’s ability to keep up, and then would lock up, which I thought would be stressful at minimum and disastrous at worst. So, I sought another solution.

I decided to use the program SETnC on my Windows laptop: https://robertnufer.ch/06_computing/setnc/setnc_page.htm

It had a number of advantages and gave me some confidence that it might work better. It was designed specifically for eclipses, and had the data for the April 8th eclipse. Once I entered our latitude and longitude, it determined the exact times for our local circumstances. I then set it up to take a set of three exposures every five minutes during the partial phase; then, from about eight seconds before second contact to eight seconds after, it would snap a picture every second to catch “Baily’s Beads” and “the diamond ring”; and during the four minutes of totality, it would cycle through exposures from 1/1000 of a second to 4 seconds. I bracketed these exposures so widely in an attempt to catch details of the prominences, details of the corona, and even (potentially) earthshine. I had originally intended to shoot in RAW+JPG mode, but it was clear that my aging camera couldn’t keep up with my desired pace. With some reluctance, I set the camera to capture merely JPG pictures. In retrospect, I wonder if part of the poor performance was really due to the relatively pedestrian write speed of my budget SD cards.
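
Just to make that plan concrete, here is a rough sketch in Python of the kind of schedule I programmed. This is not SETnC’s code or output; the contact times and shutter values below are placeholders, and a real plan would come from your own local circumstances.

#!/usr/bin/env python

# A simplified sketch of an eclipse exposure plan -- NOT SETnC internals.
# Contact times and shutter speeds are placeholders for illustration only.

from datetime import datetime, timedelta

C1 = datetime(2024, 4, 8, 16, 51, 0)   # first contact (partial phase begins), UTC
C2 = datetime(2024, 4, 8, 18, 7, 0)    # second contact (totality begins), UTC
C3 = datetime(2024, 4, 8, 18, 11, 0)   # third contact (totality ends), UTC

plan = []   # list of (time, shutter speed in seconds) pairs

# Partial phase: a bracket of three exposures every five minutes.
t = C1
while t < C2 - timedelta(seconds=30):
    for shutter in (1/2000, 1/1000, 1/500):
        plan.append((t, shutter))
    t += timedelta(minutes=5)

# Baily's Beads / diamond ring: one frame per second, from C2-8s to C2+8s.
for s in range(-8, 9):
    plan.append((C2 + timedelta(seconds=s), 1/1000))

# Totality: cycle through a wide bracket, from 1/1000s up to 4s, until C3.
bracket = [1/1000, 1/250, 1/60, 1/15, 1/4, 1, 4]
t = C2 + timedelta(seconds=10)
i = 0
while t < C3 - timedelta(seconds=10):
    plan.append((t, bracket[i % len(bracket)]))
    t += timedelta(seconds=bracket[i % len(bracket)] + 2)  # crude allowance for readout
    i += 1

for when, shutter in plan:
    print(when.isoformat(), shutter)

SETnC expresses all of this through its own configuration rather than a script, but the shape of the plan was the same.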

Note to self: before the next eclipse, do more extensive testing of the write speeds of better cards, to see if I can shoot in RAW mode.

All photos were shot with a basic 75-300mm telephoto (about $150 new) at f/8 and ISO 100.

Or, at least that was my intention. I had two small problems:

Note to self: setting the ISO mode was tricky. On the day of the eclipse, for the first few minutes of the partial phase, the ISO was set to AUTO instead of 100. This was probably undesirable: it made the exposures rather hard to predict, and many of those photos seemed to be overexposed. It’s better to leave fewer decisions to the automatic camera settings. Make sure that ISO is set properly.

Additional note to self: I didn’t actually set the zoom to the full 300mm of the lens, despite that being my intention. I suspect that this was because I shot some quick test shots of the beach at a more modest zoom setting (230mm) and then never reset the camera. The extra 25% image scale would have been good to have.

Another note which I thought was odd: the SETnC program doesn’t understand local time zones. You have to set your laptop to UTC or it won’t do the right thing. This was less than entirely convenient, but once I realized it, it wasn’t hard to get it to do what I wanted.

I did some test runs the day before, and had increasing confidence that it might do something good. It was exciting.

But the weather forecasts were… not promising. The weather maps indicated that a band of clouds very closely followed the track of totality. We decided that on the morning of the 8th, we’d get up early and decide whether we wanted to drive out of town, or risk it out near the beach. I was up at around 4:00 am and couldn’t get back to sleep. We had arranged to meet with Jonathan (a geocaching acquaintance of Jeff’s) at 7:00 to make the final determination.

We had some high clouds that ranged from “very thin” to “troublingly dense”. We weren’t sure what was going to happen, but decided that we probably weren’t likely to find better conditions within an hour of driving, and that there would be additional risks. We decided to set up at our hotel. About 9:00 am, I headed down to scout.

Our hotel (the Palms Resort of Mazatlan) had been a pretty lively “party hotel” on Saturday and Sunday, but this was Monday, and things seemed to be a bit calmer. We found a couple of places on the pool deck that looked like they could have been okay, but we instead decided to shift to the adjacent hotel’s pool deck, and set up.

I began to get hopeful. While there were still high clouds, they didn’t appear to be too dense. When the partial phase began, I had my laptop ready, my mount was tracking, and I had focused as best I could. (I did manual focusing, as I was not sure the autofocus would actually do better.) I had the computer set up, but also rigged up the intervalometer/remote camera release. I was pleased to find that even while the computer was in control of exposures, I could also trigger the shutter by hand. I wasn’t certain that would work.

Here I am with 15 minutes to go:

Once the partial phase had begun, I had three issues:

First, the Auto ISO issue I mentioned above. I temporarily paused the automatic mode of SETnC, did a tweak, and then set it running again. Oddly, it then reran all the events which had occurred up to the current time, but then seemed to be acquiring the new photos in the right mode. No harm, no foul.

Secondly, I managed to get the software into its “test” mode. In test mode, it advances the clock to five seconds before the next “event”. This is helpful when you are testing the day before, but it was somehow triggered accidentally, probably because it was hard to read the screen of my laptop in the sun.

Lastly, when I took it back out of “test” mode, for some reason it informed me that it wouldn’t take any additional partial-phase photos for 8 minutes. This was because in test mode it had thought it was 8 minutes later, and so those exposures were “done”. This is where my intervalometer/camera release came in handy: I just snapped individual photos at more or less random intervals until the software plan caught up to “real” time.

The high clouds persisted, but through our Mylar glasses we could still see the partial phases clearly. Here is a (lightly) post-processed image of the partial phase, showing the most prominent sunspot.

Jeff had set up his GoPro beneath his camera tripod, aimed out at the ocean, and later uploaded this footage of the entirety of totality (or is that the totality of entirety?). In real time, it’s hard to see the oncoming lunar shadow (approaching at something like 1500 mph), but if you scrub through the video you can see it pretty clearly.

https://youtu.be/_uvY844okfc?si=xZSppEzsWDOzz2k4

As the countdown got closer, the quality of the light got strange, and then dimmer. At about 12m45s into the video, you can hear me call out that “it’s going!” and then around 13m10s, totality begins.

My camera setup worked really well. I shot 410 photos overall. Here is the best of the best, cropped from their originals, but processed only very minimally.

I had time to record some video of myself. Pardon my language in the first little bit. I didn’t think my Google Pixel 6 Pro would do a good job of recording the eclipse, so instead I just recorded a selfie of myself, talking about what I was seeing. I must admit: I was oddly emotional. I’m not the kind of guy who never cries, but neither is it a common occurrence. In the background you can hear the voice of an opera singer, who was standing nearby and decided to sing. It was amazing. It’s hard to describe the actual appearance of totality: the combination of the super-bright Baily’s Beads, the halo of the corona against the dark sky, and the appearance of Venus and Jupiter. It was indescribable.

And then, four minutes later, it was over. I was enormously excited to get back to the hotel room to see how the pictures turned out, and I was enormously pleased. Within an hour I had my first photo up on Facebook, and it appeared that I may have had one of the earliest photos posted. While the pictures weren’t the most astounding technically, I was pretty damned happy and proud that they had worked out. Pretty awesome for a first-time eclipse photographer.

We had a blast. It was great to spend time with my friend Jeff, and my new friend Jonathan. We ate a lot of Mexican food, and enjoyed ourselves thoroughly. We both caught COVID on the way back, which accounts for some of why this account is a bit late, but it was totally something that ticks a box on my bucket list. Thanks to Jeff for being my stalwart friend and traveling companion, and I urge anyone who can get into the path of totality to try to do it.

Truly fucking amazing.

Restarting the brainwagon blog?

July 18, 2023 | My Projects | By: Mark VandeWettering

I wonder: if I trained a large language model on the contents of this blog and used it to generate new posts, would it generate interesting enough stuff to at least shame me into creating new posts?

This would require that I actually learn something about this topic at least. Although it probably would also require some hardware that I currently don’t possess.
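
If I ever get serious about it, I imagine a first experiment would look something like the sketch below, which uses the Hugging Face transformers and datasets libraries to fine-tune a small off-the-shelf model on exported post text. I haven’t run this; the file paths and hyperparameters are made up, and a model this small probably wouldn’t produce anything worth reading.

#!/usr/bin/env python

# A hypothetical sketch: fine-tune a small causal language model on old blog
# posts, assuming each post has been exported to a plain text file in posts/.

from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_name = "gpt2"   # small enough to train on modest hardware
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_name)

# Each blog post exported as plain text, one file per post.
dataset = load_dataset("text", data_files={"train": "posts/*.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="brainwagon-model",
                           num_train_epochs=3,
                           per_device_train_batch_size=2),
    train_dataset=tokenized["train"],
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()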

brainwagon is 20 years old today!

July 21, 2022 | My Projects | By: Mark VandeWettering

It was twenty years ago today that I first posted something to my brainwagon blog. While I have sort of fallen out of the habit of posting to this site, it still remains as a testament to my inability to concentrate on a single topic for more than a couple of days. I keep thinking that I should stop posting to Quora, and should instead refocus my efforts on the sorts of things that I used to routinely blog about, but I haven’t quite gotten back into it. It’s not that I have stopped doing nerdy things. I’m still doing woodworking. I want to get back to rebuilding my first telescope. And I’ve spent more than a little time building a “homelab” computing setup. But I haven’t mustered the degree of concentration, or the sense of community, that used to drive me to blather on inside these pages.

Oh, and I have been caring for stray cats too.

I hope all of you are well.