A (not entirely simple) LCD display for the Arduino…

April 21, 2015 | Amateur Radio, Arduino, My Projects | By: Mark VandeWettering

I am a big fan of Bill Meara N2CQR and Pete Juliano N6QW, hosts of the really great Soldersmoke Podcast. Together, they chat about homebrewing ham radio equipment and the lessons they've learned along the way. Their "tribal knowledge" is of terrific help to someone like me who keeps making small forays into the world of homebrew.

Warning: this post may be written at a level either below or above any given reader's experience, and either way you might find it boring. You've been warned. Additional warning: I probably made this more complicated than it should have been. Skip to the bottom for the resolution.

During Soldersmoke 175, they expressed some disgruntlement with what I call "the Arduino Tower of Babel". Despite the reputation of the Arduino as the easiest way to get into using microcontrollers in your own homebrew electronics projects, it can be really daunting and fraught with frustration and peril. In particular, they seemed to be having problems getting various "sketches" to compile and run properly, depending on what version of the Arduino IDE they were running. Via e-mail, I offered to try to help out, perhaps serving as the Arduino Sherpa who could guide them to success. While I know there are lots of people out there who are more skilled, knowledgeable and experienced than I, I have enough general computer experience to often be able to sort out this kind of problem. I thought that instead of writing this all down in an email to them both, I'd post it here, where it might be of general interest to those just getting started with the Arduino and/or programming. A lot of this won't come as much of a surprise to practitioners of the digital arts, but perhaps it might be of some use to someone (and hopefully Bill and Pete, although it appears that Bill has at least made some headway).

First of all, the way most people interact with the Arduino is through the Integrated Development Environment, commonly referred to as the IDE. It is a pretty simple looking program (well, as such things go; it can be daunting for beginners) which is actually a wrapper around several different components. What you typically see is a window where you can enter "sketches" (what most people would call "programs"), and a series of buttons that allow you to load, save, compile and download code to the target Arduino board connected via the USB port.

Like most popular software, it's constantly being revised: new versions are released all the time. Because it is what is called an "open source" project, no single company is responsible for changes; it evolves through the contributions of many different people. Each "release" of the code is tagged with a version number. As of this date, the latest version of the Arduino software that you can download from the primary site is 1.6.3. But because people hate to upgrade software, many are still running older versions, 1.0.5 being a common one.

I mentioned that the Arduino IDE consists of many different components: among these are a set of standardized "libraries" that encapsulate common functionality that lots of people find useful. When you use the Serial or Wire libraries, you are actually using code that is shipped with the Arduino IDE. These "standard" libraries usually work well right out of the box, and don't change all that often. Thus, if you have a sketch which uses those libraries, or even more basic calls like digitalRead or digitalWrite, you probably won't notice a lot of differences between versions.

But some code is not part of the standard distribution. For instance, Pete was having difficulty getting an LCD display unit working properly with different versions of the Arduino IDE. I thought it might be a fun excuse to pick up an LCD panel to play with, so I asked him which one he used, and he emailed back this link, which seemed like a cool device. A mouse click, and two days via Amazon Prime, and I had one in my hand.

[Photos: the front and back of the LCD module]

A pretty nice little unit, a little bigger than I expected. If you look at the back, you'll see a small board that looks a bit out of place attached to it. It's made by the company Sainsmart, and has four simple pins on it. That board is a converter which adapts the display board (these displays are fairly common, but ordinarily require a lot more connections) to use what is called the "I2C bus". Getting this display working requires a lot less wiring: just +5V and ground, and then two data pins, called SDA and SCL (the "serial data" and "serial clock", respectively). This makes hardware hookup a lot easier than with the conventional (and somewhat cheaper) models.

If you had an ordinary LCD without this backpack, you could use the standard “LiquidCrystal” library that ships with the Arduino IDE (documented here.) If you read the manual page for the “constructor” (the statement which creates a “LiquidCrystal” object that you can interact with), you can see that there are a bunch of ways to create one, depending on how you wire it up. These standard LCD panels require somewhere between six and eleven connections (plus power and ground) which can be a headache.

By contrast, this panel requires only two lines. Awesome! What’s even more awesome is that you can attach other devices that use the I2C bus to the same two lines. Each peripheral has a unique “address”, so programs can talk to each device independently, without adding any more wiring.
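Just to make the contrast concrete, here is roughly what a minimal sketch looks like with one of these I2C backpacks. This is a sketch of the idea, not gospel: it assumes the LiquidCrystal_I2C library I'll talk about below, a 20x4 display, and that your backpack answers at address 0x27 (more on the address in the addendum at the bottom). Different versions of the library even disagree about whether the setup call is init() or begin(), which is a taste of the trouble to come.

[sourcecode lang="cpp"]
#include <Wire.h>
#include <LiquidCrystal_I2C.h>

// A parallel hookup needs a constructor like LiquidCrystal lcd(12, 11, 5, 4, 3, 2);
// naming six (or more) pins. The I2C version needs only the bus address and the
// display geometry; SDA and SCL are implied.
LiquidCrystal_I2C lcd(0x27, 20, 4) ;   // 0x27 is a common address, but not universal

void
setup()
{
    lcd.init() ;        // some versions of the library call this lcd.begin()
    lcd.backlight() ;
    lcd.print("Hello, world!") ;
}

void
loop()
{
}
[/sourcecode]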

But there is a seemingly small problem, one that is familiar to users of desktop computers. To use these special devices, you need a custom library (think of a custom Windows device driver) that knows how to talk to the device. And this is where Pete and Bill's problems began.

The libraries that ship with the official IDE are usually pretty well thought out, and are checked to make sure that they work with the version of the IDE they ship with. But this display requires a custom library, and custom libraries are not always tested against every version of the IDE. Sometimes they work. Sometimes, not so much.

And what's worse is that this code isn't versioned or vetted. You can have different versions of the code with the same library name. It's hard to know which versions are best, which revisions are later and which are earlier, and which were created or modified by, shall we say, less good programmers.

Okay, back to our LCD display.

I did what I always do: I googled, and found this version of something called LiquidCrystal_I2C as the top response. That seemed promising. Version 2.0! I downloaded it and installed it (by the way, installing libraries can be annoying in the Arduino IDE, maybe I will rant about that some other day, but you can find out the "right" way to do it here), opened their "Hello World" program, compiled and downloaded and…

Nothing.

The screen glitched a little, and then showed rows of black squares. Argh.

Double checked the wiring. Nothing seemed to be wrong. Hmmm.

Deleted that version of the library, and after some judicious surfing, uncovered this driver on the dfrobot website. They make a board which looks an awful lot like the Sainsmart board. I thought I'd give it a whirl, even though it's version 1.1 (and therefore presumably older).

Results:

[Photos: the display up and running]

It's at times like this that I feel waves of, well, if not rage then annoyance. I haven't had the chance to dig into the issue yet (I'm an hour into this already, and it's supper time), and I have no doubt that I'll eventually sort out what's going on, but it's annoying that beginners and experts alike have to do this kind of spelunking. Until I sort out the issue, I'll just make a few recommendations:

  • If you can use hardware supported by the standard libraries that ship with the Arduino, it’s probably worth doing.
  • If you can’t (or choose not to) then perhaps do some searches to find what other people are doing to get stuff working.
  • Document your success and failures as best you can on the web somewhere. Be specific as to which version and platform (Windows, Mac, Linux) you are using.
  • I actually recommend using the latest version of the IDE that’s available at arduino.cc, the official website. The older versions may “work just fine”, but you aren’t going to be able to take advantage of the many bug fixes and updates, and if you are interacting with other newbies, your code may not work. Best to get your own house in order, and then throw stones at whoever has code which doesn’t work properly with the latest official IDE.

To Bill and Pete: I feel your pain. I’d expect a device as common as this to work more or less out of the box. I’ll see if I can make a better suggestion soon. You might try using the code that worked for me, but that’s a poor solution really: I’m recommending using a modern IDE with old code. I’ll work on coming up with a better solution.

Addendum: Pete, in your email you indicated that the I2C address for your board was 0x3F. On mine, it actually turned out to be 0x27. I found this out by using an “I2C Bus Scanner”, a little sketch that runs on the Arduino and tries to find any devices by running through all 128 addresses. I was shocked to find that it’s not part of the standard examples, but if you google for “arduino i2c bus scanner” you can find code for many simple examples. It should be noted that this one I found screwed me up for a few minutes by printing the address in decimal, rather than hex.
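If it's helpful, the core of such a scanner is tiny: just try to talk to each possible address and see who acknowledges. A minimal version (my own sketch of the idea, not any particular one from the web) looks something like this, printing in hex to avoid the confusion I mentioned:

[sourcecode lang="cpp"]
#include <Wire.h>

void
setup()
{
    Serial.begin(9600) ;
    Wire.begin() ;
    for (uint8_t addr = 1 ; addr < 128 ; addr++) {
        Wire.beginTransmission(addr) ;
        if (Wire.endTransmission() == 0) {      // 0 means a device ACKed
            Serial.print("Found device at 0x") ;
            Serial.println(addr, HEX) ;         // hex, not decimal!
        }
    }
}

void
loop()
{
}
[/sourcecode]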

Addendum2: Sigh. I may have been working too hard. Looking at the “official” distribution, it appears that the library does support I2C displays, although as near as I can tell, it’s completely undocumented, and none of the examples will work out of the box. I’ll figure out the right juju to get it to work soon, and will post it below.

Addendum3: In the words of Bill and Pete:


BASTA!!!!!!

Giving up for a moment; it's acting stupidly on one of my dev machines. Hopefully what I said above was not entirely wrong, but it might be.

Addendum4: I was confused, and the rabbit hole keeps getting deeper. The version installed with 1.6.3 does not support I2C LCD displays. I was misled by looking at my installation on my Mac, which is not the standard 1.6.3, but is based on a system called platformio. When it installs code for the Arduino, it installs this version of the LCD library. You can download the code for it here. It supports both the traditional 8 and 4 bit parallel interfaces, as well as the I2C based version, and seems to be well documented. One bummer: it's meant as a direct drop-in replacement for the standard system library, so you basically have to delete the installed LiquidCrystal library and replace it with this one. Read the instructions here. It all looks good, except for one thing:

It doesn’t seem to work properly with the Sainsmart interface either.


BASTA!!!!!!

I know that part of this is that there are dozens of clones and near clones out there, and it’s hard for the library writers to know about all of them, but this is genuinely crazy.

Addendum5: Wow, this is totally crazy. If you go to the Sainsmart page for the "LCD2004", you'll find a rar file which includes the library to access this hardware. Except, of course, that library is years old and will only work with version 1.0 of the Arduino IDE.


BASTA!!!!!!

That’s it Bill and Pete. I’m giving up on digital electronics, and am going to spend the rest of the evening looking for good deals on dual gate FETs and crystals for filters.

New I2C peripheral: 6 DOF IMU, $5.89

April 20, 2015 | Arduino, My Projects | By: Mark VandeWettering

This little gadget arrived via Amazon Prime today: a three axis gyroscope/accelerometer that can be programmed via the I2C bus. I didn't really have any reason to get one, other than simple curiosity, although I suspect that mounting one on my (as yet unfinished) robot platform might let me measure the motion of the robot.

The board itself is well documented on the Arduino Playground as the MPU-6050 Accelerometer + Gyro. The pinout is rather simple: the normal +5 and GND, SDA and SCL (the I2C bus serial data and clock), one address pin (which lets you choose between two addresses for the chip), and an interrupt pin. The chip includes a FIFO buffer, and whenever it places data in the FIFO, it raises the interrupt, indicating that there is data available for reading. Additionally, the chip includes two other lines (XDA and XCL) which form a separate I2C bus that the chip can use to talk to a magnetometer. I probably won't be using that anytime soon, but you never know.

As you can see from the picture, it's really quite small, about the size of a postage stamp, and includes two mounting holes. When I get home, I think I'll measure up the dimensions and put together the design for a little plastic case that can snap over it and provide some protection. The kit includes both right angle and normal headers; I think the right angle ones will do nicely.

More later.

Addendum: It’s later. Got the thing hooked up. It appears that I needed to ground the AD0 pin to set the address properly (I should double check this, I thought that if I left it floating, it would default to 0x69). Other than that, it’s dead simple. I wrote this code to get raw acceleration and gyro values from the sensor. I’m told that if you divide the raw values by 16384, you get the acceleration in terms of the gravitational acceleration “g”. In other words, if the board was lying perfectly flat on the tabletop, you’d expect that the X and Y accelerations were zero, and the Z would be 16384. Here’s a screen grab:

[Screen grab: raw MPU-6050 data]

As you can see, it wasn't really quite flat, mostly tilted in X. If you compute the length of the acceleration vector, you find it's something like 0.92 g (off by 8% or so). I don't know what the specs on this thing are; I'll have to check the datasheet. I know that despite returning 16 bit values, it does not provide 16 bits of accuracy.

Anyway, here’s a simple test sketch:

[sourcecode lang="cpp"]
//
// A short sketch to read data from the MPU-6050, aka the GY-521
//
// Cribbed from online sources by mvandewettering@gmail.com

#include <Wire.h>

const int MPU = 0x68 ;          // I did ground the AD0 pin...

int16_t acc_x, acc_y, acc_z ;
int16_t temperature ;
int16_t gyr_x, gyr_y, gyr_z ;

void
setup()
{
    Wire.begin() ;
    Wire.beginTransmission(MPU) ;
    Wire.write(0x6B) ;          // PWR_MGMT_1 register...
    Wire.write(0) ;             // ...clear the sleep bit to wake the chip up
    Wire.endTransmission(true) ;
    Serial.begin(9600) ;
}

void
loop()
{
    // Read all fourteen data registers, starting at ACCEL_XOUT_H (0x3B):
    // six bytes of accelerometer, two of temperature, six of gyro.
    Wire.beginTransmission(MPU) ;
    Wire.write(0x3B) ;
    Wire.endTransmission(false) ;

    Wire.requestFrom(MPU, 14, true) ;

    acc_x = (Wire.read() << 8) | Wire.read() ;
    acc_y = (Wire.read() << 8) | Wire.read() ;
    acc_z = (Wire.read() << 8) | Wire.read() ;

    temperature = (Wire.read() << 8) | Wire.read() ;

    gyr_x = (Wire.read() << 8) | Wire.read() ;
    gyr_y = (Wire.read() << 8) | Wire.read() ;
    gyr_z = (Wire.read() << 8) | Wire.read() ;

    Serial.print("\e[2J\e[H") ;         // ANSI: clear screen, home cursor
    Serial.println("RAW MPU-6050 DATA") ;
    Serial.println() ;
    Serial.print("ACCX ") ; Serial.println(acc_x) ;
    Serial.print("ACCY ") ; Serial.println(acc_y) ;
    Serial.print("ACCZ ") ; Serial.println(acc_z) ;

    Serial.print("GYRX ") ; Serial.println(gyr_x) ;
    Serial.print("GYRY ") ; Serial.println(gyr_y) ;
    Serial.print("GYRZ ") ; Serial.println(gyr_z) ;

    // Per the datasheet, temperature in C is raw / 340 + 36.53
    Serial.print("TEMP ") ; Serial.println(temperature / 340. + 36.53) ;

    Serial.println() ;

    delay(1000) ;
}
[/sourcecode]
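For what it's worth, turning the raw accelerometer values into g is just the division mentioned above. A fragment like this at the end of loop() would print the vector magnitude I eyeballed above (the 16384 LSB/g scale assumes the chip is at its default ±2g full scale range):

[sourcecode lang="cpp"]
// Scale assumes the default +/-2g full scale range (16384 LSB per g).
float gx = acc_x / 16384.0 ;
float gy = acc_y / 16384.0 ;
float gz = acc_z / 16384.0 ;
Serial.print("MAG  ") ; Serial.println(sqrt(gx*gx + gy*gy + gz*gz)) ;
[/sourcecode]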

Addendum2: Here is a library that does a good job of accessing the MPU-6050 (it can return quaternions for orientation, so no gimbal lock!).

Tinkering with individually addressable LEDs…

April 19, 2015 | Arduino, LED, Microcontrollers, My Projects | By: Mark VandeWettering

While digging around looking for an LCD module I thought I had stashed somewhere, I encountered a bag of 8mm individually addressable RGB LEDs that I had never done anything with. For fun, I thought I'd wire a few of them up on my breadboard and see if I could get them to do something.

These things are cool. Most RGB LEDs have four leads: one common anode (or cathode) and three other pins, each of which connects to a different color LED. To dim them and generate arbitrary colors, you need three pins, each attached to a pulse width modulated output, as in the sketch below. Addressing a bunch of them individually would require a bunch of pins.
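Driving a conventional RGB LED this way is just three analogWrite calls, but each LED monopolizes three PWM pins. A minimal sketch of the idea (the pin numbers are arbitrary, and this assumes a common cathode part; a common anode part would want the values inverted):

[sourcecode lang="cpp"]
// Conventional common-cathode RGB LED: one PWM pin per color.
const int red_pin = 9, green_pin = 10, blue_pin = 11 ;

void
setup()
{
}

void
loop()
{
    analogWrite(red_pin, 255) ;     // full red
    analogWrite(green_pin, 128) ;   // half green
    analogWrite(blue_pin, 0) ;      // no blue
}
[/sourcecode]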

But these LEDs are different. It’s true, they have only four pins. But they are constructed to act independently. The four pins are a 5v power pin, a ground, and a data-in and data-out. You can chain arbitrary numbers of them together by hooking the data-out from one LED into the data-in of the next.

To demonstrate this, I thought I'd hook three of them together and see what I could do. It wasn't obvious from the datasheet I found how the pins were labeled. Luckily, Adafruit had a nice diagram that showed how the pins were ordered. I positioned the flat side of each LED to the right, hooked all the +5V and ground connections up, and then wired the data-out of each stage to the data-in of the next stage. Hooking up the Arduino is dead simple: one output pin to the first LED's data-in, plus the common ground.

[Photo: three LEDs chained on the breadboard]

To drive this requires a bit of code. The library that everyone seems to use is the Adafruit NeoPixel library. Oh, did I mention? The same LEDs are found in preformed strips like this one. If you need a lot of LEDs in a strip, this is a good way to go. But for just a few LEDs, these through hole parts can be fun.

I downloaded the Adafruit library, and wrote up this chunk of code.

[sourcecode lang="cpp"]
#include <Adafruit_NeoPixel.h>

const int led_pin = 6 ;

// Three pixels, chained data-out to data-in, all driven from a single pin.
Adafruit_NeoPixel strip(3, led_pin, NEO_RGB + NEO_KHZ800);

void
setup()
{
    strip.begin();
}

int cnt = 0 ;

int col[3][3] = {
    {255, 0, 0},
    {0, 255, 0},
    {0, 0, 255}
} ;

void
loop()
{
    int i, idx ;
    // Rotate red, green and blue along the three pixel chain...
    for (i = 0; i < 3; i++) {
        idx = (cnt + i) % 3 ;
        strip.setPixelColor(i, strip.Color(col[idx][0], col[idx][1], col[idx][2]));
    }
    strip.show() ;
    cnt++ ;
    cnt %= 3 ;
    delay(500) ;
}
[/sourcecode]

Nothing too exciting, but it’s been a fun little thing to tinker with. It’s nifty to drive a large number of LEDs using only a single digital pin from a microcontroller. I’ll have to come up with a project to use them sometime.

More camera experiments…

April 18, 2015 | My Projects, Raspberry Pi | By: Mark VandeWettering

Tonight’s tinkering was inspired by the script by spikedrba that I mentioned in yesterday’s post. I took down the hummingbird camera for a little maintenance, and while it was down decided to do some bench testing with new ideas inspired by what I read.

Sadly, I didn't have anything as photogenic as hummingbirds to stare at, so instead I just pointed it at me in my slightly darkened living room as I hacked on the couch. The video is incredibly boring, but I will post a single frame:

[Screen grab: a single frame from the test video, with the overlay described below]

First of all, I've added a text annotation with the time and date to every frame. In my hummingbird camera application, it's not clear to me that I want it overlaying every frame, but it's probably useful in a variety of security applications, so I thought it was worth trying. On the line below, you can see three numbers, which represent the load averaged over one, five and fifteen minutes, followed by two more numbers. The first is the number of non-zero-length motion vectors that the camera returns, and the second is the sum of the absolute value of differences between adjacent frames. This application was recording 1280×720 video at 25 fps, and you can see it was using around 36% of the available cpu. Not bad at all. While this version of the script doesn't actually trigger motion detection recording, it is probably doing virtually all the work that such a script would do, so it's pretty clear that my stock, non-overclocked Model B can easily keep up at this frame rate and resolution.

Spikedrba's script was instrumental in figuring out how to set up the pipeline properly to handle this. I also spent some time reading more of the discussion on the picamera github page, and reading the code for the module itself. I'm really very impressed by it.

Once I tighten this up a bit more, I’ll be posting a new revision.

Motion detection in my hummingbird camera…

April 17, 2015 | My Projects, Photography, Raspberry Pi | By: Mark VandeWettering

My goal in experimenting with the Raspberry Pi camera was to try to make an efficient and effective camera which can detect motion. Previous incarnations of the camera script merely looked at the differences in pixel values between adjacent frames, thresholded them at some value, and then counted the number of pixels which exceeded this value. What I discovered was that it was pretty hard to tune the two threshold values in a way that would not pick up changes due to wind motion of the grassy background.

But it turns out that the Raspberry Pi camera and its associated software picamera have some other tricks up their sleeves. In addition to recording the h264 encoded video, you can record an alternative stream which contains "motion data", which is essentially some of the raw data that the h264 encoder uses to do motion coding. Essentially this data provides 4 bytes for each 16x16 image block: two signed 8 bit image displacements (in x and y) which represent the estimated image velocity, and a 16 bit value which is the sum of the absolute differences of all the pixels in the block relative to the previous frame. Both would be rather expensive to compute yourself (certainly in Python) but are quick and easy to extract when computed by the camera itself.
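The layout (as described in the picamera documentation) is simple enough to unpack in any language: one record per 16x16 macro-block, in scan order. In C terms, something like:

[sourcecode lang="cpp"]
#include <stdint.h>

// One record per 16x16 macro-block, per the picamera documentation.
// For 1280x720 video that works out to (1280/16 + 1) x (720/16) = 81 x 45
// records per frame (the extra column is an artifact of the encoder).
struct motion_record {
    int8_t   x ;     // x component of the motion vector
    int8_t   y ;     // y component of the motion vector
    uint16_t sad ;   // sum of absolute differences vs. the previous frame
} ;
[/sourcecode]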

To test my understanding, I modified my camera script to acquire this data, transferred it along with the normal video, and then hacked together some scripts using python and gnuplot to superimpose the data atop the background video (which I've faded a bit to make the data more legible). The black contours represent the difference data, and are spaced at intervals of 100. The red vectors represent the motion data plotted atop the image.

One thing leaps out at me immediately: the motion data is very good at finding the hummingbirds, even when the birds are relatively stationary. While this clip was not taken in particularly high wind, it’s pretty clear that those vectors aren’t very large in the case of plant motion. Hence, it seems clear I could make a better motion detector by taking advantage of the precomputed motion vectors.

A couple of things remain, though: there are obvious spots where the contour data drops out entirely. I'm not sure what that is about: it could be a bug in my conversion script, or something more insidious. I'll go back to the data and find out. Secondly, I'm not sure how capturing this motion data interacts with another picamera feature I use: its ability to record into circular memory buffers. When I figure out these two issues, I'll post (and likely github) another version of my watcher script.

Hope this is of interest to someone out there.

Addendum: While doing more reading on the picamera github site, I found a link to this awesome script, which points out a lot of clever things that can be done. I’ll be swiping ideas from it soon!

StarStack… Astrophotography with Cell Phones?

April 13, 2015 | Amateur Science, Astronomy | By: Mark VandeWettering

Tom pointed me at this awesome article about an experiment run as part of the BBC programme Stargazing Live. Basically, they asked their viewers to go outside, take a picture of the night sky with their cell phones, and upload the (almost entirely black) images to a website. They then used a process called "stacking" that basically aligned all the pixels and added them together. The net result was perhaps better than anyone had any expectation of getting. Very, very cool.


If you want to see the full image they constructed, click on the version below and you’ll get the resolution they achieved, including some views of the Great Nebula in Orion:

[Image: the full resolution stacked result]

And, as it happens, this is the kind of thing that I could easily do with the hardware and software that I have lying around. I'm putting this on my list of experiments to run, maybe with my iPhone, but more likely with the Pi Camera (easier to automate).

Addendum: Glancing at my copy of the Uranometria 2000, my "goto" star atlas, it's clear that the picture I linked above has all the stars in the Uranometria, which is supposed to go down to about magnitude 9.75. Under dark skies, you'd only see down to magnitude 6 or so. Stars at magnitude 9.75 are about 32 times dimmer (each magnitude is a factor of about 2.512 in brightness, so the 3.75 magnitude difference works out to 2.512^3.75, or roughly 32); ordinarily you'd need a telescope with a 50mm aperture (a large-ish finder scope) to see them.

SSTV from the ISS…

April 11, 2015 | Amateur Radio, Amateur Satellite, Space, SSTV | By: Mark VandeWettering

Well, it's not pretty, but I was just using a 17″ whip antenna on my VX-8GR, recorded it with Audacity, and then decoded it with MultiScan on my Macbook. The first bit of the recording is pretty rocky, so I had to start the sync myself. I've been meaning to do some experiments with bad audio and sync recovery; now I have more data.

Oh, in case this was all gibberish to you, the Russians have been running “events” from the International Space Station to honor their cosmonauts by transmitting pictures via slow scan television (SSTV). I received this picture using what most people would call a walkie talkie, a whip antenna, and a laptop.

As decoded by Multiscan:

[Image: the SSTV picture received from the ISS]

I thought a second image would have begun later in the pass, but didn’t hear it.

I think an antenna with a little more gain, and/or a preamplifier would help a lot. You really need pretty noise free audio to make a good picture. Still, a fun experiment. I might try the 12:30AM pass tonight.

Addendum: The second pass was also a little rocky. I got the tail end of one transmission fairly cleanly, but the three minute gap meant the ISS was low by the time the next one began. This is what I got.

[Image: picture from the second pass]

More on hummingbirds…

April 10, 2015 | Amateur Radio | By: Mark VandeWettering

I imagine that some of you are getting bored with this, so I won’t post another 20 minutes of hummingbird video. But I will post a couple of things. For instance, I cut a frame from the videos at the beginning of the day and at the end of the day. You can clearly see the level of nectar in the feeder drop. They don’t seem to eat a lot.

[Animation: the feeder at the start and end of the day]

John wondered (quite rightly) whether I could position the camera better, so I could get something other than the near-silhouette images that I had yesterday. I'd have to move the camera outdoors, which means I'd have to make a better enclosure. But tonight the setting sun was fairly low and sidelit the birds in a couple of my late captures. Here is a late still frame…

[Still frame: hummingbird sidelit by the setting sun]

And some of the video shot late in the day…



Slow Scan Television from the ISS this weekend…

April 10, 2015 | Amateur Radio, Amateur Satellite, SSTV | By: Mark VandeWettering

Note: This post was adapted from an email that I sent out to our ham radio club.

If anyone is interested in a fun little ham radio related activity, you can try to receive slow scan television from the International Space Station this weekend. I haven't done this in a while, but I think I'll give it a try and see what I can come up with.

You can read about this event here:

AMSAT UK on the upcoming ISS event

They will be on 145.800 MHz (in the 2m band).

The way I usually "work" these is to use one of my HTs. A better antenna than the stock one is usually good (a longer whip, or even a yagi), but you might just see what you can hear with the stock antenna. The ISS transmits with 25 watts of power, which is usually pretty easy to hear. I have a set of earphones that I hook up through a splitter: one half goes to my earbuds, the other to a small digital audio recorder I have. Turn the squelch on your radio off so you can hear the signal even when it is weak. You may find that moving your antenna around will help a bit, so monitor with your earphones. Don't be shocked if you don't hear the ISS right at the rise time: there are 3 minutes of dead time between transmissions, and each picture takes about 3 minutes to send. It sounds a bit like the ticking of a clock, with a whistle in between. If you click this link, you can hear what it sounds like:

I like to record the audio, then play it back into my Windows PC and use the MMSSTV program, but you can actually go completely low tech and try an inexpensive iPhone app, held up to the speaker of your HT. I use

Black Cat Systems' SSTV program for the iPhone/iPad

which works okay, not amazing. If you are outdoors in a windy or noisy location, your image won't be as good this way: the background noise will cause interference.

To help out, I computed a set of rise/set/max elevation tables centered on San Francisco. If you live close by, you can probably use these times. If you live in other parts of the country, you might try looking at the Heavens-Above website. Set "Passes to include" to all, and enter your location in the upper right. The table below was calculated by my own software.

--------------------------------------------------------------------------------
Rise time           Azi    Max Elev Time        Elev  Set time             Azi
--------------------------------------------------------------------------------
2015/04/11 16:24:33 178.90 2015/04/11 16:28:52   9.27 2015/04/11 16:33:10  74.10 (Local Time)
2015/04/11 23:24:34 178.90 2015/04/11 23:28:52   9.27 2015/04/11 23:33:11  74.10 (UTC)

2015/04/11 17:59:18 232.14 2015/04/11 18:04:47  76.70 2015/04/11 18:10:17  49.52 (Local Time) [1]
2015/04/12 00:59:18 232.14 2015/04/12 01:04:48  76.70 2015/04/12 01:10:17  49.52 (UTC)

2015/04/11 19:36:48 276.47 2015/04/11 19:41:38  13.93 2015/04/11 19:46:28  40.34 (Local Time)
2015/04/12 02:36:48 276.47 2015/04/12 02:41:38  13.93 2015/04/12 02:46:28  40.34 (UTC)

2015/04/11 21:15:06 309.66 2015/04/11 21:19:13   7.29 2015/04/11 21:23:21  47.92 (Local Time)
2015/04/12 04:15:06 309.66 2015/04/12 04:19:14   7.29 2015/04/12 04:23:21  47.92 (UTC)

2015/04/11 22:52:10 319.85 2015/04/11 22:56:52  12.34 2015/04/11 23:01:34  78.97 (Local Time) [2]
2015/04/12 05:52:10 319.85 2015/04/12 05:56:53  12.34 2015/04/12 06:01:35  78.97 (UTC)

2015/04/12 00:28:22 312.09 2015/04/12 00:33:48  58.58 2015/04/12 00:39:14 122.75 (Local Time) [3]
2015/04/12 07:28:22 312.09 2015/04/12 07:33:49  58.58 2015/04/12 07:39:15 122.75 (UTC)

2015/04/12 02:05:15 289.69 2015/04/12 02:09:49  11.95 2015/04/12 02:14:23 174.60 (Local Time)
2015/04/12 09:05:16 289.69 2015/04/12 09:09:50  11.95 2015/04/12 09:14:24 174.60 (UTC)

[1] Probably the easiest pass, the ISS passes almost straight overhead,
should be loud and easy.

[2] A low night time pass, but the ISS should be visible to the naked eye.

[3] Another night time pass, but too late for the ISS to catch any
sun. 58 degrees is a good pass, the second best of the night.

If I get any good images, I’ll send them out next week.

Skeleton of a motion detecting video capture program for the Raspberry Pi + Camera…

April 10, 2015 | My Photos, Raspberry Pi | By: Mark VandeWettering

Last week I was playing around with using "motion-mmal" to capture pictures of hummingbirds feeding at my feeder. That was fun, but if I wanted high resolution pictures, I could not get very high frame rates (maybe 2-5 fps at best). I thought that by writing my own capture application, I could do better. After all, the graphics processor in the Pi is capable of recording HD video and directly encoding it as H264. There should be some way to use that hardware effectively, right?

As it turns out, there is.

As a tease, here is some of the video I captured yesterday:



It's recorded at 1280×720 and 25fps (more on that later). It takes about 20% of the available cpu on one of my older Model B Raspberry Pis. The motion detection is done entirely in Python on the Pi, and is a bit crufty, but works well enough to get some good video.

Warning: this code is presented as-is. If you aren’t a python programmer, you may not have the skills necessary to understand or use this code, but it is a good basic outline that spells out most of the parts you need. Feel free to adapt the code to your needs. If you redistribute it, it would be nice if you could give a nod to this code and my blog in some fashion, but I’m not going to be insulted if you don’t. And if you have any improvements, I’d love to hear about them.

[sourcecode lang="python"]
#!/usr/bin/env python

#
# watcher -- a skeleton of a motion detecting video capture program
# for the Raspberry Pi + camera module.
#

import numpy as np
import io
import os
import os.path
import fractions
import time
import random
import picamera
import picamera.array
import datetime as dt
import warnings
import platform
from pkg_resources import require
import subprocess

print platform.platform()
print "Using picamera version", require('picamera')[0].version

# warnings.filterwarnings('default', category=DeprecationWarning)

prev_image = None
image = None

def detect_motion(camera):
    global image, prev_image
    # Grab a low resolution (256x144) YUV frame from the video port...
    with picamera.array.PiYUVArray(camera, size=(256,144)) as stream:
        camera.capture(stream, format='yuv', use_video_port=True, resize=(256,144))
        # print "%dx%d:%d image" % (stream.array.shape[1], stream.array.shape[0], stream.array.shape[2])
        if prev_image is None:
            prev_image = stream.array.reshape([256*144, 3])[:,0]
            return False
        else:
            # ...keep just the Y (luma) channel, and count how many pixels
            # changed by more than 35 levels since the previous frame.
            image = stream.array.reshape([256*144, 3])[:,0]
            diff = np.abs(prev_image.astype(float)-image.astype(float))
            diff = diff[diff>35]
            # print diff.shape[0]
            prev_image = image
            return diff.shape[0] > 200

def write_video(stream, fname):
    # Write the entire content of the circular buffer to disk. No need to
    # lock the stream here as we're definitely not writing to it
    # simultaneously
    with io.open(fname, 'wb') as output:
        for frame in stream.frames:
            if frame.frame_type == picamera.PiVideoFrameType.sps_header:
                stream.seek(frame.position)
                break
        while True:
            buf = stream.read1()
            if not buf:
                break
            output.write(buf)
    # Wipe the circular stream once we're done
    stream.seek(0)
    stream.truncate()

with picamera.PiCamera(framerate=fractions.Fraction('30/1')) as camera:
    dir = "/var/tmp/capture"
    camera.resolution = (1280, 720)
    camera.framerate = fractions.Fraction('30/1')
    camera.vflip = True
    camera.hflip = True
    camera.start_preview()
    seconds = 5
    stream = picamera.PiCameraCircularIO(camera, seconds=seconds, bitrate=8000000)
    print "[ Buffer %s seconds/%d bytes ]" % (seconds, stream.size)
    camera.start_recording(stream, format='h264', bitrate=8000000)
    try:
        while True:
            camera.wait_recording(1)
            if detect_motion(camera):
                print "Dumping."
                # generate a filename...
                base = 'cam_'+dt.datetime.now().strftime("%H%M%S")
                part1 = os.path.join(dir, base+"-A.h264")
                part2 = os.path.join(dir, base+"-B.h264")
                camera.split_recording(part2)
                write_video(stream, part1)
                camera.wait_recording(15)
                while detect_motion(camera):
                    camera.wait_recording(1)
                camera.split_recording(stream)
                with open("files.txt", "a") as f:
                    f.write("file %s\n" % part1)
                    f.write("file %s\n" % part2)
                print "Dumped %s %s" % (part1, part2)
                # Copy files to remote server. Note: subprocess.call returns
                # the exit status (check_call would raise an exception instead).
                dst = 'markv@conceptron.local:capture'
                print "Copying %s to %s..." % (part1, dst)
                rc = subprocess.call(['scp', '-p', '-q', part1, dst])
                if rc != 0:
                    print "PROBLEM: (rc = %d)" % rc
                else:
                    os.unlink(part1)
                print "Copying %s to %s..." % (part2, dst)
                rc = subprocess.call(['scp', '-p', '-q', part2, dst])
                if rc != 0:
                    print "PROBLEM: (rc = %d)" % rc
                else:
                    os.unlink(part2)
                # ready to record some more...
                camera.wait_recording(seconds)
    finally:
        camera.stop_recording()
[/sourcecode]

This would not be possible without the awesome picamera Python module and lots of careful engineering by the Raspberry Pi + Camera designers. They clearly foresaw this kind of possible application, and did everything that they needed to make it run efficiently and reasonably.

A few more short notes:

  • The motion detection code is terrible. It works after a fashion, but clearly could be tuned better.
  • To save space on my Pi, after capture it uploads each video file to one of my local servers, and then deletes the file. I hardcoded it to use scp via subprocess. If you want to do something else, you can figure out what that might be and do it there. It won't record new video while the scp is occurring; you could spawn a thread or some such to handle the copy and then drop back into the loop if you like.
  • You might want to write to a tmpfs file space, so you don't eventually wear out your flash card with repeated writes and deletes, particularly if you can transmit the video files elsewhere as they are generated (see the example line after this list).
  • The picamera documentation is quite helpful. Indeed, it was my reading of that documentation which formed the basis of this initial script, which likely could not have been done (or not as easily) without them.
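For instance (untested, and the size is an arbitrary choice of mine), a line like this in /etc/fstab would keep the capture directory in RAM rather than on the flash card:

tmpfs /var/tmp/capture tmpfs defaults,noatime,size=64m 0 0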

I will probably produce a tidier, better annotated version of this code and put it on github soon.

Hope this is of interest to some of you.

Addendum: If you want to see what the hardware looks like, you can see it here. Really just a cardboard box holding a pi, a powered hub, and the pi camera taped to the top, hung in the window.

Beginning to look at MQTT…

April 6, 2015 | Embedded, Internet of Things, My Projects | By: Mark VandeWettering

My weekend experiments eventually led me toward flashing nodemcu, a Lua based firmware that runs on the ESP8266. Having a simple programming language on board (albeit one I'm not super fluent in) is very cool, and enables a whole bunch of nifty experiments.

While reading up, I encountered the acronym MQTT again. From Wikipedia:

MQTT (formerly Message Queue Telemetry Transport) is a publish-subscribe based “light weight” messaging protocol for use on top of the TCP/IP protocol. It is designed for connections with remote locations where a “small code footprint” is required and/or network bandwidth is limited. The Publish-Subscribe messaging pattern requires a message broker. The broker is responsible for distributing messages to interested clients based on the topic of a message. Andy Stanford-Clark and Arlen Nipper of Cirrus Link Solutions authored the first version of the protocol in 1999.

So, I thought I’d experiment a bit. I installed an MQTT broker called Mosquitto on a spare Raspberry Pi along with some simple command line clients and the python library with this command:

sudo apt-get install mosquitto mosquitto-clients python-mosquitto

The mosquitto broker is started automatically. In its simplest form, you can run a client that “subscribes” to a given “topic” in a shell window.

mosquitto_sub -d -t hello/world 

The -d flag specifies a slightly noisy "debug" mode: you'll see keepalive ping messages from the broker. The -t flag specifies the topic, which looks like a path expression. You can specify more than one pattern, and there are wildcards for matching subexpressions within the pattern, as shown below. By default, these clients connect to a broker on your localhost, but with the -h flag you can contact brokers running remotely (indeed, this will be the common case for IoT applications.)
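For example (the topic names here are made up), + matches exactly one level of the hierarchy and # matches any number of remaining levels:

mosquitto_sub -d -t "sensors/+/temperature" -t "hello/#"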

From another window, you can “publish” to all clients which are subscribed to this topic:

mosquitto_pub -d -t hello/world -m "Hello world."

This sends the message to anybody subscribed to hello/world. You can send pretty much anything: it’s up to the application to know what to expect and what to do with it. You can even send the contents of complete files.

mosquitto_pub -d -t hello/world -f sendme.txt

So far, it doesn't look all that exciting, but looks can be deceiving. First of all, the client code is relatively small and simple. This means that it's easy to fit into the small, low memory, low power nodes that you will typically use as sensors. There are already libraries for the Arduino, and as I mentioned at the start, it's already built in to the nodemcu firmware for the ESP8266. Because it's lightweight, even very modest brokers can handle huge numbers of messages. To give you some idea of scalability, Facebook Messenger uses MQTT to route messages between users with very low latency, and without chewing up the battery on your mobile device. And it enables two way communication with sensor nodes: clients can both publish and subscribe at the same time.
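As a taste of how small the client side is, here is an untested sketch of the idea using Nick O'Leary's PubSubClient library for the Arduino, publishing to the same hello/world topic. The broker address and client id here are made up (the address should be whatever your Pi happens to be), and I'm assuming an Ethernet shield:

[sourcecode lang="cpp"]
#include <SPI.h>
#include <Ethernet.h>
#include <PubSubClient.h>

byte mac[] = { 0xDE, 0xAD, 0xBE, 0xEF, 0xFE, 0xED } ;
IPAddress broker(192, 168, 1, 10) ;          // the Pi running mosquitto

EthernetClient net ;
PubSubClient client(net) ;

void
setup()
{
    Ethernet.begin(mac) ;
    client.setServer(broker, 1883) ;         // 1883 is the default MQTT port
}

void
loop()
{
    if (!client.connected())
        client.connect("arduinoClient") ;    // client id, unique per broker
    if (client.connected())
        client.publish("hello/world", "Hello from the Arduino.") ;
    client.loop() ;                          // let the library service the connection
    delay(5000) ;
}
[/sourcecode]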

As an experiment, I’m thinking of creating a notification service for my hummingbird camera that will update me when new captures occur. It seems pointless (and is) but it’s all a learning experiment. Eventually, I’m sure I’ll get back to the ESP8266.

Oh, and incidentally, I've ordered some MCP23017 I2C I/O extenders, and hope to experiment with those on the ESP8266. They are nifty little chips which provide 16 additional digital I/O lines, all under I2C control. Combined with the ESP8266, you'll be able to sense lots of buttons and light lots of LEDs. Add an I2C A/D converter (or any other I2C sensor), and you can do some serious tinkering.
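I haven't touched one yet, but from the datasheet, blinking LEDs on port A should look something like the following Arduino fragment (register addresses assume the chip's default BANK=0 mapping, and the three address pins grounded, giving bus address 0x20):

[sourcecode lang="cpp"]
#include <Wire.h>

const int MCP = 0x20 ;      // MCP23017 with A2..A0 grounded

void
setup()
{
    Wire.begin() ;
    // IODIRA (register 0x00): a 0 bit makes the pin an output.
    Wire.beginTransmission(MCP) ;
    Wire.write(0x00) ;      // IODIRA
    Wire.write(0x00) ;      // all eight port A pins are outputs
    Wire.endTransmission() ;
}

void
loop()
{
    // Toggle all eight port A pins via GPIOA (register 0x12).
    for (int v = 0 ; v < 2 ; v++) {
        Wire.beginTransmission(MCP) ;
        Wire.write(0x12) ;                  // GPIOA
        Wire.write(v ? 0xFF : 0x00) ;
        Wire.endTransmission() ;
        delay(500) ;
    }
}
[/sourcecode]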

Changes to the hummingbird camera…

April 5, 2015 | Amateur Radio | By: Mark VandeWettering

Okay, I got it realigned to better center the frame, and expanded it out to a 4x3 aspect ratio. Minutes later, I got a new capture.

[Image: the new capture]

Lots of fun!

Addendum: Within the next hour, I got the following. I cropped the frame to show the bird at full resolution.

[Image: cropped view of the bird at full resolution]

Addendum2: Hey, there is at least a pair of them.

[Image: two hummingbirds at the feeder]

Another hummingbird cam, this one of a hummingbird nest!

April 4, 2015 | Links | By: Mark VandeWettering

Shelby noticed that hummingbirds had made a little nest, so she got a webcam, and now you can view it via live streaming. At the moment I looked, there didn't seem to be much action, but I'll be checking in frequently. Awesome.

Shelby’s Hatchicam

The Clickspring Youtube channel

April 4, 2015 | Machining | By: Mark VandeWettering

I’ve tinkered a bit with metal working over the years, but never really developed any skill. I’ve always been fascinated by precision machining, particularly those that make clocks and watches. Yesterday, I discovered the Clickspring Youtube channel, which is really, really good. I’ve linked to his video where he makes washers and screws for a skeleton clock project, but there are lots of other videos with different kinds of machining in them. Enjoy!

Morning visit by a hummingbird…

April 4, 2015 | My Photos, My Projects | By: Mark VandeWettering

I know, I know, it’s probably getting a little repetitive and boring. But I’m still getting a kick out of my motion capturing hummingbird camera. Improvements are coming.

[Image: morning hummingbird capture]