Author Archives: Mark VandeWettering

April 8, 2024 Total Solar Eclipse From Mazatlan

Annoying: the videos which I inserted in here late last night seem not to be working this morning. Granted it was late, my COVID-soaked brain may not have been working at full efficiency, and I haven't done this in a while, but… I'll get it sorted out later today.

It's been some time since I made any update to my blog. I keep thinking I'm going to restart, but I keep putting it off. Sigh. But some events do occur which make me think that at the very least I should write up some notes, and the April 8th total solar eclipse was one of them.

Jeff, an old and dear friend of mine, and I started planning this trip back in August of last year. Originally we had conceived of traveling to Texas, but research indicated that if we wanted the absolute best chance of seeing the eclipse, historically Mazatlan, in Sinaloa, Mexico was going to be the better choice. It was, however, neither cheap nor convenient. We could not find a reasonably priced (sub-$3K) flight that would take us directly to Mazatlan from anywhere reasonable, so we did a crazy itinerary which involved Jeff driving to meet up with me at my home, then flying OAK->LAX. We spent the night in LAX, flew early from LAX to Mexico City, sat through an 8-hour layover, and then flew from Mexico City to Durango, where we got in late at night and ended up getting another hotel. In the morning, we drove from Durango to Mazatlan.

We had originally reserved two rooms for four nights, but as it happened our return flight (our outbound trip in reverse, Durango->Mexico City->LAX->Oakland, but all in one day) was leaving at 6:00AM, so we had to leave a night early. We convinced our hotel not to charge us for the extra night, and got a separate hotel in Durango. We thought that our hotel in Mazatlan was going to be a single king-size bed, so we each got a room, but as it happens, our suites were a king+double and we could easily have used just one room. Oh well. It wasn't cheap, but we did all the outbound traveling without significant problem. Our 3.5-hour drive from Durango to Mazatlan was via a toll road, and was both fast and efficient. The only true excitement was our spotting of a cluster of small puppies ("wild chihuahuas!") that came across the road. They were cute, but I was busy driving and didn't get any pictures.

Jeff and I each brought an SLR with appropriate filtration. Mine was a Canon T5i that I had purchased used, with a snazzy solar filter that clipped on magnetically, and a "Heliofind" automatic solar tracker. The notion was that the tracker would automatically follow the sun, and therefore would free me from the problem of actually watching the camera and adjusting it. My idea was to automate all the exposures using a program called "BackyardEOS", because Jeff had used the Nikon version during the 2017 eclipse that he viewed from Madras, Oregon. I had purchased an appropriate 15ft Mini-USB cable, and had done some previous experiments. As a backup plan, I had experimented with adjusting exposures manually and tripping the shutter with an inexpensive intervalometer. I had tried this before during the October 2023 annular eclipse that we treated as a dry run/practice. (I should really get those pictures up here too.)

But during the couple of days leading up to the eclipse, I did some additional testing in our hotel room, and one thing became obvious: BackyardEOS wasn't really designed for eclipse photography. In particular, it had no idea what time the eclipse was, or even what time it was. If I wanted to preprogram a "plan" for the eclipse, I'd have to set it up and test it manually and repeatedly. We also hit some situations where the software got ahead of the camera's ability to keep up and then locked up, which I thought would be stressful at minimum and disastrous at worst. So, I sought another solution.

I decided to use the program SETnC on my Windows laptop: https://robertnufer.ch/06_computing/setnc/setnc_page.htm

It had a number of advantages and gave me some confidence that it might work better. It was designed specifically for eclipses, and had the data for the April 8th eclipse. Once I entered our latitude and longitude, it determined the exact times for our local circumstances. I then set it up to take a set of three exposures every five minutes during the partial phase; then, from about eight seconds ahead of second contact to eight seconds after, it would snap a picture every second to catch "Baily's Beads" and "the diamond ring"; and during the four minutes of totality, it would cycle through exposures from 1/1000 of a second to 4 seconds. I bracketed the exposures that widely in an attempt to catch details of the prominences, details of the corona, and even (potentially) earthshine. I had originally intended to shoot in RAW+JPG mode, but it was clear that my aging camera couldn't keep up with my desired pace. With some reluctance, I set the camera to capture mere JPGs. In retrospect, I wonder if part of the poor performance was really due to the relatively pedestrian write speed of my budget SD cards.
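For the curious, here's roughly what that plan looks like if you sketch it out in code. This is just a Python illustration of the schedule described above, with made-up contact times and a guessed-at cadence; it's not SETnC's actual logic:

```python
from datetime import datetime, timedelta

# A rough sketch of the shooting plan described above. The contact times
# here are made-up placeholders, not our actual local circumstances.
c1 = datetime(2024, 4, 8, 9, 51, 0)          # first contact (hypothetical)
c2 = datetime(2024, 4, 8, 11, 7, 0)          # second contact (hypothetical)
c3 = c2 + timedelta(minutes=4)               # roughly four minutes of totality

plan = []

# Partial phase: a set of three bracketed exposures every five minutes.
t = c1
while t < c2 - timedelta(seconds=8):
    plan.append((t, "bracket of 3 exposures, solar filter on"))
    t += timedelta(minutes=5)

# Baily's Beads / diamond ring: one frame per second from C2-8s to C2+8s.
for s in range(-8, 9):
    plan.append((c2 + timedelta(seconds=s), "single frame, filter coming off"))

# Totality: cycle exposures from 1/1000 second up to 4 seconds.
bracket = ["1/1000", "1/250", "1/60", "1/15", "1/4", "1", "4"]
t = c2 + timedelta(seconds=8)
i = 0
while t < c3 - timedelta(seconds=8):
    plan.append((t, bracket[i % len(bracket)] + " second exposure"))
    t += timedelta(seconds=2)                # guess at a sustainable cadence
    i += 1

for when, what in plan:
    print(when.strftime("%H:%M:%S"), what)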

Note to self: before the next eclipse, do more extensive testing of write speeds, to see if I can shoot RAW with a faster card.

All photos were shot with a basic 75-300mm telephoto (about $150 new) at f/8 and ISO 100.

Or, at least that was my intention. I had two small problems:

Note to self: setting the ISO mode was tricky. On the day of the eclipse, the first few minutes of the partial phase were shot with the ISO set to AUTO instead of 100. This was undesirable: it made the exposures hard to predict, and many of those photos were overexposed. It's better to leave fewer decisions to the camera's automatic settings. Make sure the ISO is set properly.

Additional note to self: I didn't actually set the zoom to the lens's full 300mm, despite that being my intention. I suspect this was because I shot some quick test shots of the beach at a more modest zoom setting (230mm) and then never reset the lens. The extra ~30% image scale would have been good to have.

Another note which I thought was odd: SETnC doesn't understand local time zones. You have to set your laptop to UTC or it won't do the right thing. This was less than entirely convenient, but once I realized it, it wasn't hard to get it to do what I wanted.
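If you want to sanity-check what UTC means for your own observing site, Python's zoneinfo makes the conversion trivial; the timestamp below is just a placeholder, not our actual second contact time:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Convert a UTC event time to local time in Mazatlan.
# The timestamp is a placeholder, not our actual contact time.
event_utc = datetime(2024, 4, 8, 18, 0, 0, tzinfo=timezone.utc)
print(event_utc.astimezone(ZoneInfo("America/Mazatlan")))   # 11:00 local (UTC-7)
```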

I did some test runs the day before, and had increasing confidence that it might do something good. It was exciting.

But the weather forecasts were… not promising. The weather maps showed a band of clouds closely following the track of totality. We decided that on the morning of the 8th, we'd get up early and decide whether we wanted to drive out of town or risk it near the beach. I was up at around 4:00am and couldn't get back to sleep. We had arranged to meet Jonathan (a geocaching acquaintance of Jeff's) at 7:00 to make the final determination.

We had some high clouds that ranged from "very thin" to "troublingly dense". We weren't sure what was going to happen, but decided we were unlikely to find better conditions within an hour's drive, and driving carried additional risks. We decided to set up at our hotel. About 9:00am, I headed down to scout.

Our hotel (the Palms Resort of Mazatlan) had been a pretty lively "party hotel" on Saturday and Sunday, but this was Monday, and things seemed a bit calmer. We found a couple of places on the pool deck that looked like they could be okay, but we instead decided to shift to the adjacent hotel's pool deck, and set up.

I began to get hopeful. While there were still high clouds, they didn't appear to be too dense. When partiality began, I had my laptop ready, my mount was tracking, and I had focused as best I could. (I focused manually, as I was not sure the autofocus would actually do better.) I had the computer set up, but also rigged up the intervalometer/remote camera release. I was pleased to find that even while the computer was in control of the exposures, I could also trigger the shutter by hand. I wasn't certain that would work.

Here I am with 15 minutes to go:

Once the partial phase had begun, I had three issues:

First, the Auto ISO issue I mentioned above. I temporarily paused the automatic mode of SETnC, made the tweak, and then set it running again. Oddly, it then reran all the events which had occurred up to the current time, but after that it seemed to be acquiring new photos in the right mode. No harm, no foul.

Secondly, I managed to put the software into its "test" mode. In test mode, it advances the clock to five seconds before the next "event". This is helpful when you are testing the day before, but it was somehow triggered accidentally, probably because it was hard to read the screen of my laptop in the sun.

Lastly, when I took it back out of "test" mode, it informed me that it wouldn't take any additional partial phase photos for 8 minutes. This was because in test mode it had thought it was 8 minutes later, and considered those shots "done". This is where my intervalometer/camera release came in handy: I just snapped individual photos at more or less random intervals until the software plan caught up to "real" time.

There continued to be high clouds, but through our mylar glasses we could still clearly see the partial phases. Here is a (lightly) post-processed image of the partial phase, showing the most prominent sunspot.

Jeff had set up his GoPro beneath his camera tripod, aimed out at the ocean, and later uploaded this footage of the entirety of totality (or is that the totality of entirety?). In real time, it's hard to see the oncoming lunar shadow (approaching at something like 1500mph), but if you scrub through it you can see it pretty clearly.

https://youtu.be/_uvY844okfc?si=xZSppEzsWDOzz2k4

As the countdown got closer, the quality of the light got strange, and then dimmer. At about 12m45s into the video, you can hear me call out that “it’s going!” and then around 13m10s, totality begins.

My camera setup worked really well. I shot 410 photos overall. Here are the best of the best, cropped from their originals, but processed only very minimally.

I had time to record some video of myself. Pardon my language in the first little bit. I didn't think my Pixel 6 Pro would do a good job of recording the eclipse, so instead I just recorded a selfie, talking about what I was seeing. I must admit: I was oddly emotional. I'm not the kind of guy who never cries, but neither is it a common occurrence. In the background you can hear the voice of an opera singer, who was standing nearby and decided to sing. It was amazing. It's hard to describe the actual appearance of totality: the combination of the super-bright Baily's Beads, the halo of the corona against the dark sky, the appearance of Venus and Jupiter. It was indescribable.

And then, four minutes later, it was over. I was enormously excited to get back to the hotel room to see how the pictures turned out, and I was enormously pleased. Within an hour I had my first photo up on Facebook, and it appeared that I may have had one of the earliest photos posted. While the pictures weren't the most astounding technically, I was pretty damned happy and proud that they worked out. Pretty awesome for a first-time eclipse photographer.

We had a blast. It was great to spend time with my friend Jeff, and my new friend Jonathan. We ate a lot of Mexican food, and enjoyed ourselves thoroughly. We both caught COVID on the way back, which accounts for some of why this account is a bit late, but it was absolutely something that ticks a box on my bucket list. Thanks to Jeff for being my stalwart friend and traveling companion, and I urge anyone who can get into the path of totality to try to do it.

Truly fucking amazing.

Restarting the brainwagon blog?

I wonder: if I trained a large language model on the contents of this blog and used it to generate new posts, would it generate stuff interesting enough to at least shame me into creating new posts?

This would require that I actually learn something about the topic, at least. It would probably also require some hardware that I don't currently possess.

brainwagon is 20 years old today!

It was twenty years ago today that I first posted something to my brainwagon blog. While I have sort of fallen out of the habit of posting to this site, it still remains as a testament to my inability to concentrate on a single topic for more than a couple of days. I keep thinking that I should stop posting to Quora, and should instead refocus my efforts on the sorts of things that I used to routinely blog about, but I haven't quite gotten back into it. It's not that I have stopped doing nerdy things. I'm still doing woodworking. I want to get back to rebuilding my first telescope. And I've spent more than a little time building a "homelab" computing setup. But I haven't mustered the degree of concentration and the sense of community that used to drive me to blather on inside these pages.

Oh, and I have been caring for stray cats too.

I hope all of you are well.

Experimenting with ESP8266/Tasmota Firmware…

Hey gang, I know it's been quite some time (since last May, apparently) since I posted anything new to the blog. It's not that I haven't been doing projects. The continuation of the COVID-19 pandemic has generally meant that I've had a lot of extra time, and I've been tinkering with a bunch of different projects and learning new skills. I just haven't felt much like writing them up.

But I realize that I miss some of the interactions that writing a blog brought about, so maybe it would be good to write up a detail or two of some of the projects. We’ll see how successful I am.

Today's projects center around microcontrollers based on the ESP8266 (notably the WEMOS D1 Mini, one of my favorites) and the Tasmota firmware.

The WEMOS D1 Mini

If you haven't encountered the ESP8266 microcontroller before, you can use Google, but the basic idea is that it's a small controller which is both very cheap and has WiFi. It comes on various boards, one of the most popular being a small board called the WEMOS D1 Mini. I've used them in a few of my own projects before, including an internet-enabled clock and the half-life clock that I built a while ago. Did I mention they were cheap? You can get five of them for $17 or so from Amazon. That's even cheaper than clone Arduino Nanos, and did I mention they have WiFi? They have WiFi.

Programming with platformio

The Arduino has been popular in part because it has a friendly set of libraries and an IDE that can be used to program it. It turns out that with a little work, you can pretend that the ESP8266 is just a different type of Arduino, and all your Arduino skills transfer to programming these things.

But I prefer PlatformIO (http://platformio.org), which is a more command-line-driven approach. You still program the same way, but you can use your favorite editor (vi for me) to write the source code, and can compile and install using simple command-line tools. It also provides convenient access to a lot of different libraries.
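To give a flavor of how little configuration this takes, here is a minimal platformio.ini for the D1 Mini; the platform, board, and framework names are PlatformIO's standard identifiers for this board:

```ini
; Minimal PlatformIO configuration for a WEMOS D1 Mini
; running Arduino-framework code.
[env:d1_mini]
platform = espressif8266
board = d1_mini
framework = arduino
```

With that in place, "pio run --target upload" builds the code and flashes the board over USB.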

Using PlatformIO I have created a bunch of different projects over the years. For instance, I created this clock which downloads ISS data from the Internet and drives a small display with the location of the ISS.

I'd also made clocks, and a variety of ad hoc sensors like https://brainwagon.org/2018/12/09/how-not-to-code-an-simple-iot-sensor-and-a-new-task-list/comment-page-1/. But each time I wanted to do a fairly simple sensor project, it meant an afternoon of programming. Granted, not particularly difficult programming, but neither was it trivial. I kept wishing there were a simpler way to attach a simple sensor to the ESP8266 and get its readings routed to a server for logging, graphing, or data analysis.

Tasmota Firmware

A couple of weeks ago, I was doing my usual browsing/tinkering and encountered something I hadn't considered before. Tasmota (https://github.com/arendst/Tasmota) is a firmware that can be flashed onto the ESP8266 (and the more modern ESP32 boards) that are often used in IoT peripherals. I had used it before when I experimented with SONOFF switches. These are cool because with them you can create a switch which doesn't rely on any cloud architecture: you can control it with simple MQTT or HTTP messages. But there were a couple of things I hadn't realized before.

First of all, you can install the Tasmota firmware very easily on the WEMOS D1 Mini. The easiest way is to bring up the Tasmota Web Installer in Chrome and select any one of a number of precompiled versions of the firmware, each with a different set of supported sensors and capabilities. You then simply add your sensors to the board, fire it up, configure its WiFi and MQTT settings, and you have a capable little sensor.
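Once a device is flashed and on your network, you can poke at it with simple HTTP requests. As a small sketch, here's how you might pull the sensor telemetry from Python; the /cm?cmnd= endpoint and the "Status 8" command are standard Tasmota, but the IP address is just a placeholder for wherever your device lives:

```python
import requests

# Query a Tasmota device's sensor readings over its local HTTP API.
# "Status 8" asks for sensor telemetry; the IP address is a placeholder.
r = requests.get("http://192.168.1.50/cm?cmnd=Status%208")
print(r.json())   # e.g. {'StatusSNS': {'Time': ..., 'SHT3X': {...}}}
```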

The first of the many applications I saw was actually something I was already interested in. Ikea sells an air particle sensor box which costs just $13.00. That's considerably cheaper than some of the other particle sensors I'd experimented with before. But out of the box, it just lights an LED bar to indicate the air quality (green for low, yellow and red for higher levels). By itself, that's not particularly useful to me: I want quantitative data, and I want to log that data to an MQTT server.

Luckily, someone had done the heavy lifting before me.

A quick trip to Ikea netted a pair of these little guys. This afternoon, I opened one of them up and did the necessary modifications to add a WEMOS D1 Mini with the appropriate firmware.

I could have added another sensor directly inside the case (there is plenty of space), but I chose instead to build a second WEMOS with an SHT30 temperature/humidity sensor that I had a little carrier board for. Both send their data to an MQTT server.
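If you just want to watch that data flow, a minimal Python subscriber does the trick. Tasmota publishes its sensor readings as JSON to tele/&lt;device-topic&gt;/SENSOR at the configured TelePeriod; the broker address below is a placeholder for your own, and the callback style is the paho-mqtt 1.x API:

```python
import json
import paho.mqtt.client as mqtt

# Minimal logger for Tasmota telemetry (paho-mqtt 1.x style callbacks).
# Tasmota publishes sensor readings as JSON to tele/<device-topic>/SENSOR
# every TelePeriod seconds. The broker address is a placeholder.
BROKER = "192.168.1.10"

def on_connect(client, userdata, flags, rc):
    client.subscribe("tele/+/SENSOR")    # all devices, all sensors

def on_message(client, userdata, msg):
    data = json.loads(msg.payload)
    print(msg.topic, data)               # e.g. tele/airsensor/SENSOR {...}

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER, 1883)
client.loop_forever()
```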

Node Red front end

I could have written a little Python script to slurp up data from the MQTT server and produce graphs and the like (along the lines of the sketch above), but there is an interesting alternative: Node Red. It's a sort of graphical programming system that allows you to wire up data sources (like MQTT inputs), process them in various ways, and send them to various other outputs. It's also a convenient front end for creating UI elements that respond to the data. After an hour or so of tinkering, I had the following:

Node Red UI elements

Not too shabby. I've experimented with similar things before, and also had the data injected into an InfluxDB database, which provides longer-term storage. I'll probably work that up again.
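For reference, here's what writing a reading into InfluxDB 2.x looks like with its Python client; the URL, token, org, bucket, and field names are all placeholders for whatever your own instance uses:

```python
from influxdb_client import InfluxDBClient, Point
from influxdb_client.client.write_api import SYNCHRONOUS

# Write a single reading into InfluxDB 2.x. The URL, token, org, and
# bucket are placeholders for your own instance.
client = InfluxDBClient(url="http://localhost:8086", token="my-token", org="home")
write_api = client.write_api(write_options=SYNCHRONOUS)

point = Point("air_quality").tag("room", "office").field("pm25", 12.0)
write_api.write(bucket="sensors", record=point)
```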

A couple of years ago, I also did similar data logging using INA219 voltage/current sensors on a small solar power setup that I created. At the time, I used custom firmware, but I now believe that I could do the entire project without any of that programming. I could simply make a couple of small modules that run Tasmota, and do all the data logging with MQTT and Node Red.

I also discovered that the Tasmota firmware can serve as a controller for a display device. I had an 8-digit 7-segment display controlled by a MAX7219 chip, which is one of the displays that the Tasmota firmware knows about (it also knows about a variety of E-ink and TFT displays). You can send commands to the board using HTTP or MQTT to tell it what to put on the display. In a few minutes, I had it displaying the time, essentially making a small internet clock. That seems pretty cool. I've ordered some small OLED displays that I can do more experiments with. I'll probably need to compile a custom version of the firmware to support both the sensors I want and the displays, but it seems like an interesting thing to play around with.
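Driving the display from a script is a one-liner. DisplayText is a standard Tasmota display command; the broker address and device topic in this little Python sketch are placeholders:

```python
import paho.mqtt.publish as publish

# Put text on the Tasmota-driven display over MQTT. DisplayText is a
# standard Tasmota display command; broker and topic are placeholders.
publish.single("cmnd/clockdisplay/DisplayText", "12 34 56", hostname="192.168.1.10")
```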

Future tinkering

It's a fun thing to play with: inexpensive sensors and displays, wired into your own servers, with little-to-no programming. I like it, and will be looking for other possible projects to make use of my new knowledge.

Hope some of you found this interesting.