Temperature And Humidity Display Using Arduino, DHT22, And MAX7219 Display

I finished another project today. This time it’s a simple temperature and humidity display, and so far it’s working pretty well. It’s built around an Arduino Nano and uses a DHT22 sensor. The display is an extremely cheap MAX7219-based four-module LED matrix (130x32mm), and its brightness is controlled by a capacitive touch sensor (11x15mm).

In this case, everything runs off a 5V supply so there are no level shifters needed and a single USB cable can power the whole thing. Any other 5V Arduino with hardware SPI will work fine here, too. The software libraries I used also support software SPI but I haven’t tried that out.

Here’s the layout:

  • Everything is connected to the +5V and GND pins on the USB connector.
  • MAX7219 CLK to Arduino 13.
  • MAX7219 DATA to Arduino 11.
  • MAX7219 CS to Arduino 10.
  • DHT22 I/O to Arduino 2.
  • Touch sensor I/O to Arduino 4.

I had everything on a breadboard but forgot to take a picture before wiring it all up to fit in the case… here it is, wired up and just about ready to go into the case.

Arduino DHT22 MAX7219

The program uses the MD_MAX72XX and MD_Parola libraries for the display, and the SimpleDHT library for the DHT22. It took me a while to wrap my head around the MD_Parola stuff, but the examples included with the libraries were very helpful. Here’s the program:

/* Temp and RH DHT22 MAX7219 for Dot 04
 *  Uses Nano to check DHT22 and display on 8x8 dot matrix (x4) MAX7219.
 *  Meant to be used indoors.
 *  Has two brightness settings, 4 and 15 (on scale of 0-15)
 *  Runs off 5V USB.
 *  MAX7219 controlled by MD_Parola and MD_MAX72xx libraries
 *  DHT22 using SimpleDHT
 *  PINS:
 *  DHT22 data: D2
 *  MAX7219 clock: D13, data: D11, CS: D10
 *  Intensity: D4
 *  Uses a MAX7219 32x8 LED module from Banggood. Hardware type is MD_MAX72XX::ICSTATION_HW, 4 devices
 *  Puts temp and RH on display at same time

 * MAKE SURE YOU RUN THE MD_MAX72XX_HW_Mapper to confirm the hardware setting for your particular display!
 * The results I got for the display I have were:
 * "Your hardware matches the setting for IC Station modules. Please set ICSTATION_HW."
 */

#include <MD_Parola.h>
#include <MD_MAX72xx.h>
#include <SimpleDHT.h>
#include <SPI.h>

// 8x8 bitmap definitions (h_w_bits is a happy face, s_w_bits a sad face); not used in the code below
#define h_w 8
#define h_h 8
static unsigned char h_w_bits[] = {
   0x3c, 0x42, 0xa5, 0x81, 0xa5, 0x99, 0x42, 0x3c };

#define s_w 8
#define s_h 8

static unsigned char s_w_bits[] = {
   0x3c, 0x42, 0xa5, 0x81, 0x99, 0xa5, 0x42, 0x3c };

// Create instance for the DHT22 using pin 2 for data xfer
SimpleDHT22 dht22(2);

// Define the number of devices we have in the chain and the hardware interface
// NOTE: These pin numbers will probably not work with your hardware and may
// need to be adapted
#define HARDWARE_TYPE MD_MAX72XX::ICSTATION_HW  // Found using the MD HW mapping program

#define MAX_DEVICES 4 // Four 8x8 modules on this particular board

#define CLK_PIN   13
#define DATA_PIN  11
#define CS_PIN    10

#define BRIGHT_PIN 4

// Hardware SPI connection
MD_Parola P = MD_Parola(HARDWARE_TYPE, CS_PIN, MAX_DEVICES);

byte CountUp = 0;

void setup() {

  delay(500); // Need this because display doesn't seem to start up right away.

  pinMode(BRIGHT_PIN, INPUT); // Touch sensor output

  P.begin(2); // Using 2 zones, one for temp, one for humidity
  P.setZone(0, 0, 1); // Zone 0: modules 0 and 1
  P.setZone(1, 2, 3); // Zone 1: modules 2 and 3

  P.displayZoneText(0, "Hi!", PA_CENTER, 75, 0, PA_PRINT, PA_NO_EFFECT);
  P.displayZoneText(1, "Hi!", PA_CENTER, 75, 0, PA_PRINT, PA_NO_EFFECT);

  // Flip both zones up-down and left-right (a 180-degree rotation)
  // because I glued the display in upside down >:-(
  P.setZoneEffect(0, 1, PA_FLIP_UD);
  P.setZoneEffect(1, 1, PA_FLIP_UD);
  P.setZoneEffect(0, 1, PA_FLIP_LR);
  P.setZoneEffect(1, 1, PA_FLIP_LR);
}

void loop() {

jumpback:

  // If touch sensor is active, cycle through the 16 levels of brightness until sensor is inactive.
  int brightness_change = digitalRead(BRIGHT_PIN);
  while (brightness_change == 1){
    if (CountUp == 16){
      CountUp = 0;
    }
    P.setIntensity(0, CountUp);
    P.setIntensity(1, CountUp);
    CountUp = CountUp + 1;
    brightness_change = digitalRead(BRIGHT_PIN);
    delay(250); // Slow the cycling down enough to let go at the level you want
  }

  float temperature = 0;
  float humidity = 0;
  int err = SimpleDHTErrSuccess;
  if ((err = dht22.read2(&temperature, &humidity, NULL)) != SimpleDHTErrSuccess) {
    // If we're here, there was a problem reading the DHT22. Show an error then try again.
    P.displayZoneText(0, "Dht", PA_CENTER, 75, 0, PA_PRINT, PA_NO_EFFECT);
    P.displayZoneText(1, "Err", PA_CENTER, 75, 0, PA_PRINT, PA_NO_EFFECT);
    P.displayAnimate();
    delay(2000);
    goto jumpback;  // I know, I know. Don't say it...
  }

  // Convert the floats to strings to display
  char temp_result[6];
  dtostrf(temperature, 4, 1, temp_result);

  char hum_result[6];
  dtostrf(humidity, 4, 1, hum_result);

  P.displayZoneText(1, hum_result, PA_CENTER, 75, 0, PA_PRINT, PA_NO_EFFECT);
  P.displayZoneText(0, temp_result, PA_CENTER, 75, 0, PA_PRINT, PA_NO_EFFECT);
  P.displayAnimate();

  delay(3500);  // DHT22 max sample rate is about once every 2 seconds.
}

If you are using a MAX7219-based display, save yourself some time and frustration by connecting it and running the MD_MAX72XX_HW_Mapper program that comes with the MD_MAX72XX library before you do anything else. It will tell you how your display is actually configured, regardless of what the listing or the module’s appearance suggests.

After doing some testing, I found that the capacitive touch sensor I was using could reliably detect my finger out to about 5mm away. That was great because then I could hide it inside the case and there’d be no switch, no pad… just a “magic” spot on the back that changes the brightness if you put your finger there.
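While a finger is on that spot, the program steps the display through the MAX7219’s 16 intensity levels (0 to 15) and wraps back around to the dimmest. Stripped of the hardware calls, the wrap-around amounts to this (a plain C++ sketch of the idea; `nextIntensity` is a name I made up, not something from the project code):

```cpp
#include <cstdint>

// Step to the next MAX7219 intensity level, wrapping from 15 back to 0.
// In the real sketch this runs repeatedly while the touch sensor reads active.
uint8_t nextIntensity(uint8_t level) {
    return (level + 1) % 16;  // valid MAX7219 intensities are 0..15
}
```

Hold your finger there until the brightness you want comes around, then let go.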

I designed a case for this particular project, including the specific display and touch sensor I had on hand. It’s vented, has a hole for a USB cable, and is closed up with four 6mm M3 screws:

Arduino DHT22 MAX7219
Fresh off the printer… with a bit of over-extrusion.
Arduino DHT22 MAX7219
The capacitive touch sensor in its dedicated spot right in the middle of the back panel.

With everything wired up and tested, I hot-glued everything… and I mean everything. Every connector, every module (except the Nano’s mini-USB port – never know if I’ll want or need to reprogram it)… it’s all quite secure inside the case. I glued a piece of plastic over the back of the display just in case any other parts work their way loose and come in contact with it. I’m still a little wary of doing things this way, but it sure beats drawing up and etching boards for this kind of stuff!

Once the glue had cooled and I confirmed everything was stuck good and tight, I closed up the case and plugged the cable into a 5V USB power supply. The LEDs flashed, and then… everything was upside down. I’d glued the display in upside down.

So… after another 45 minutes or so of pondering and poking around, I found out how to flip the display in software so it looked right again. If you run into this problem, check out setZoneEffect() in the MD_Parola documentation.

Here it is, from the back:

Arduino DHT22 MAX7219

And from the front, display pointing the right way:

Arduino DHT22 MAX7219

The STL files for the case are available at https://www.thingiverse.com/thing:4202464

ESP32-CAM Low Power Trail Camera

I’ve been spending a lot of time lately working with the ESP32-CAM module. It doesn’t produce the best pictures I’ve seen, but for the cost (I’ve found them for $9, including the OV2640 camera!) and the number of features and horsepower, they’re tough to beat.

One of the things I want to do with them is put a couple outside and get pictures of the different kinds of animals that wander through the yard and leave footprints in the snow. When I mentioned this to a buddy of mine, he immediately wanted to know if they could be used to watch for motion and take pictures in case someone was trying to break into his shed or cabin. Sure, I told him – I didn’t see any reason why not. His response was to immediately ask me how many I could make for him and how quickly I could do it.

Unfortunately, it was just an idea at the time and I hadn’t actually tried to do it. I figured it wouldn’t be too tough – after all, the ESP32-CAM AI-Thinker modules I use have several GPIO pins broken out. I was wrong.

Turns out some of the GPIO pins are used by the camera, and the rest are used by the uSD card slot that’s on the board. One of them (D4) seems to be used by BOTH the camera (camera flash) and the uSD card slot (data line).

I tried a bunch of things and didn’t have much luck, and when I looked around for information, there were lots of links but I couldn’t find any information that quite fit what I was doing.

Finally, I found a link to some ESP documentation, which got me started. Looking into the various ESP libraries for the SD card, using a FAT32 filesystem, the camera, and the on-board EEPROM took a while but after I figured one or two of them out, the others were easier.

After going through my various parts bins, I cobbled together a circuit that seems to reliably work. Here’s the schematic of the whole thing:

Trailcam v1 schematic

+V on the schematic is the power supply you want to use. Power for the ESP32 first goes to an AMS1117-3.3 regulator. According to the AMS1117 datasheet, it keeps regulating as long as the input is at least 1V above the output (its dropout voltage). The output is 3.3V, so it should run down to a 4.3V input. The absolute maximum input voltage is 15V, so powering it from 4xAA/4xC/4xD alkaline batteries (6V) is fine. Even 9 or 12V should be OK, but check the regulator on your board first to make sure.

Remember that if you want to power it from rechargeable NiCd or NiMH batteries, those are 1.2V, not 1.5V, so you’d probably want to use five of them instead of four alkalines. (Those long-life lithium AA primaries, on the other hand, are a nominal 1.5V, so four of them behave like alkalines.)
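To put numbers on that battery math (a quick plain-C++ sanity check, not part of the project; the 1V dropout figure is the worst case from the AMS1117 datasheet, and the per-cell voltages are nominal):

```cpp
// Minimum input the regulator needs: its output voltage plus its dropout voltage.
double minInputVoltage(double vOut, double vDropout) {
    return vOut + vDropout;  // e.g. 3.3V + 1.0V = 4.3V
}

// Pack voltage for a series string of identical cells.
double packVoltage(int cells, double vPerCell) {
    return cells * vPerCell;
}
```

Four alkalines give 6.0V, comfortably above the 4.3V minimum; four NiMH cells give only 4.8V fresh and sag below 4.3V as they discharge, which is why a fifth cell is the safer bet.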

The block labelled “OPTIONAL” is there if you want a switch that will keep the MOSFET (and ESP) turned on while programming. You can also just move the ESP GND pin from the MOSFET drain to ground while programming. Or… if your PIR will remain on while motion is present, you can just wave your hand above the sensor until programming is done. That’s what I do.

R1 and R2 are necessary, particularly if you have a MOSFET with a high input capacitance. They limit the momentary surge of current needed to charge the gate, which could otherwise damage the PIR or ESP.

Why the 4N37? Because the ESP is not connected to the ground rail when the MOSFET is turned off, so GPIO13 does a very noisy “high-ish” float which leads to unpredictable results. Note that the 4N37 diode side circuit goes from GPIO13 to the anode, the cathode goes to R3, and R3 goes back to the GND pin on the ESP – NOT the ground rail from the power supply!

This circuit works best with low-Vf Schottky diodes that can tolerate a reverse voltage at least 2x the highest possible voltage in your circuit. Tested and works with BAT41 and 11DQ06. Tested and works with old-school 1N34 germanium diodes (but I probably wouldn’t use them for real-world applications). It seems to mostly work but not as reliably with typical 1N914 and 1N4001 diodes. It does NOT work with anything with a Vf larger than about a volt, like LEDs.

This circuit also works best with MOSFETs that are fully on at a low voltage, like 2.5 to 4.5V, and have a very low drain-source resistance when on (tens to a couple hundred milliohms). Tested and works with IRLI640G and DMN1019USN-7.

Normally I’d put bypass capacitors across the 5V and GND pins of the ESP, but here’s the ESP32-CAM power circuit:

ESP32-CAM power supply circuit
From: https://github.com/SeeedDocument/forum_doc/raw/master/reg/ESP32_CAM_V1.6.pdf

Note the abundance of capacitors, which is pretty great. A 0.1uF capacitor probably wouldn’t hurt, but I don’t think it’s necessary unless you’re using a very noisy power supply.

So that’s the schematic. Breadboarded up, this is how it looks:

Trailcam v1
That loose wire is connected to GPIO0; it gets tied to GND to program the ESP32
Trailcam v1
That little chunk of PCB with a blue wire on it is a nasty but functional tiny-surface-mount-to-breadboard converter

Here’s how it works:

  • When power is applied to the circuit, the gate of the MOSFET is low and doesn’t conduct, so the ESP is disconnected from the ground rail. The circuit pulls about 16uA at this time.
  • When the PIR sensor detects motion, its trigger pin goes high for two seconds. This signal is sent through a diode and 47k resistor to the gate of the MOSFET, which turns it on.
  • With the MOSFET turned on, the ESP now has power and boots. As soon as it can, it sets GPIO13 high. This turns on the input of a 4N37 optoisolator, which turns on its output transistor. The output transistor is also connected to the gate of the MOSFET through a diode and a 47k resistor. This keeps the MOSFET turned on even after the PIR trigger line goes low.
  • The ESP goes through the paces of checking for and mounting the uSD card, starting up the camera, and reading from the EEPROM the number of the last picture taken (PIC_COUNT) before the previous shutdown.
  • If there is a problem at any point in the startup, the ESP will set GPIO13 low, which will turn it off and wait for the next trigger to boot again. I know it’s 2020, but sometimes a reboot still fixes things.
  • If there are no problems, it gets to the main loop, which takes the number of pictures specified by a while loop that increments the variable COUNTUP (in the program here it takes five pictures). Each time through, the picture counter (PIC_COUNT) is increased by 1. The circuit pulls about 130-140mA at this time.
  • Once COUNTUP reaches the maximum set in the while loop, the ESP saves the current value of PIC_COUNT to the EEPROM and then sets GPIO13 low. This should turn off the MOSFET and remove power from the ESP.
  • If there is still power, the ESP waits for 500ms after it tried to shut itself down. If it’s still awake, then that means the PIR has either re-triggered or is still triggered so it’s a good idea to get more pictures. The ESP sets GPIO13 high again and loops back to take another five pictures.
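The picture-counter bookkeeping in those last few steps boils down to one rule, shown here as a standalone C++ helper (`nextPicCount` is a hypothetical name; the real program does this inline):

```cpp
#include <cstdint>

// Advance the 16-bit picture counter after a successful write, wrapping
// back to 0 well before FAT32's 65534-files-per-folder limit.
uint16_t nextPicCount(uint16_t picCount) {
    picCount = picCount + 1;
    if (picCount >= 65500) {
        picCount = 0;
    }
    return picCount;
}
```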

Here’s what it looks like when it’s running. Note the camera flash LED on the board glowing faintly instead of blazing like a million suns like it usually does. When the LED is on, the ESP is communicating with (and hopefully writing an image to) the uSD card.

You may be wondering why it works, though. After all, there don’t seem to be any usable free GPIOs when using both the camera and the uSD card, so what’s with GPIO13? Well, it comes down to changing this line:

SD_MMC.begin();

to this:

SD_MMC.begin("/sdcard", true);

Passing true as the second argument tells the ESP to talk to the uSD card in 1-bit mode instead of the usual 4-bit mode. This frees up a couple of GPIO pins, one of which is GPIO13. The disadvantage of 1-bit mode is that it’s slower, but I’m pretty sure the ESP itself is going to be the bottleneck here. Plus, the OV2640 and lens aren’t super-high quality, so setting the JPEG image quality high and making huge files isn’t necessary (or useful).

Here’s the program in its entirety (it looks kind of mangled but if you copy and paste it into a text document or the Arduino IDE it comes out properly):

/* ESP32-CAM Low Power Trail Camera v1
 * Mark's Bench (http://marksbench.com)
 * Uses ESP32-Cam AI-Thinker with OV2640 camera to take pictures and save to SD card when triggered by PIR
 * ESP32 is connected to power rail via MOSFET. MOSFET is initially turned on by PIR trigger and kept on by setting pin D13
 * on the ESP32 high as soon as ESP starts up.
 * ESP then takes 5 pictures and saves them to the SD card, after which it sets D13 low, which turns the MOSFET off,
 * cutting the ESP off from the GND rail.
 * If the PIR trigger remains high or goes high again during when the ESP32 would shut down, then take another five
 * pictures and try shutting down again.
 * Advantage to this scheme is a power savings - power draw is less than 20uA when the MOSFET
 * is off and the ESP32 is shut down. Power draw is around 130-140mA when pictures are being taken and saved.
 * Disadvantage is that when using both the camera and the SD card, there are no easily usable GPIO pins available.
 * To get around this, use 1-bit SD card access instead of the usual 4-bit. 
 * It slows things down but you can still get an image with ok quality about once a second
 * at full resolution (1600x1200) on the OV2640.
 * Information on and code examples for using the ESP32-CAM library:
 * https://github.com/yoursunny/esp32cam
 * https://github.com/espressif/esp32-camera
 * https://github.com/espressif/arduino-esp32/tree/master/libraries/ESP32/examples/Camera/CameraWebServer
 * Information on and code examples for using the ESP32 SD_MMC library:
 * https://github.com/espressif/arduino-esp32/tree/master/libraries/SD_MMC
 * Information on using the ESP32 EEPROM (the Preferences library):
 * https://github.com/espressif/arduino-esp32/tree/master/libraries/Preferences
 * VERY useful ESP32 documentation:
 * https://www.espressif.com/en/support/download/documents
 * In particular, the "ESP32-WROOM-32 Datasheet", "ESP32 Datasheet", and "ESP32 Hardware Design Guidelines".
 * Also, the schematic at:
 * https://github.com/SeeedDocument/forum_doc/raw/master/reg/ESP32_CAM_V1.6.pdf
 * And the specification page (which has some errors) at:
 * https://github.com/raphaelbs/esp32-cam-ai-thinker/blob/master/assets/ESP32-CAM_Product_Specification.pdf
 * ***TO PROGRAM: Set Board to "AI Thinker ESP32-CAM"***
 */

#include <esp_camera.h>
#include <FS.h>
#include <SPI.h>
#include <SD_MMC.h>
#include <Preferences.h>

// The ESP32 EEPROM library is deprecated. Use the Preferences library instead.
Preferences preferences;

// The following defines are for the ESP32-Cam AI-THINKER module only. I haven't tried any others.
#define CAM_PIN_PWDN    32
#define CAM_PIN_RESET   -1
#define CAM_PIN_XCLK    0
#define CAM_PIN_SIOD    26
#define CAM_PIN_SIOC    27
#define CAM_PIN_D7      35
#define CAM_PIN_D6      34
#define CAM_PIN_D5      39
#define CAM_PIN_D4      36
#define CAM_PIN_D3      21
#define CAM_PIN_D2      19
#define CAM_PIN_D1      18
#define CAM_PIN_D0       5
#define CAM_PIN_VSYNC   25
#define CAM_PIN_HREF    23
#define CAM_PIN_PCLK    22

// Create a variable to hold the picture number. Since the SD card is formatted FAT32, the maximum number of files
// there can be is 65534, so a 16-bit unsigned number will be fine.
uint16_t PIC_COUNT = 0;

void setup(){
  pinMode(13, OUTPUT);    // GPIO13 available when using SD_MMC.begin("/sdcard",true) for 1-bit mode (set below)
  digitalWrite(13, HIGH); // Hold the gate of the MOSFET high as soon as possible after boot to keep the power on
                          // after the PIR is done triggering.
  //Serial.begin(115200); // Uncomment for troubleshooting

  preferences.begin("trailcam", false); // Open nonvolatile storage (EEPROM) on the ESP in RW mode
  PIC_COUNT = preferences.getUShort("PIC_COUNT", 0);  // Get the stored picture count from the EEPROM.
                                                      // Return 0 if it doesn't exist.
                                                      // getUShort() fetches a 16-bit unsigned value

  // Now, configure the camera with the pins defined above and recommended settings for xclk, led_c, and format.
  camera_config_t config;
  config.pin_d0 = CAM_PIN_D0;
  config.pin_d1 = CAM_PIN_D1;
  config.pin_d2 = CAM_PIN_D2;
  config.pin_d3 = CAM_PIN_D3;
  config.pin_d4 = CAM_PIN_D4;
  config.pin_d5 = CAM_PIN_D5;
  config.pin_d6 = CAM_PIN_D6;
  config.pin_d7 = CAM_PIN_D7;
  config.pin_xclk = CAM_PIN_XCLK;
  config.pin_pclk = CAM_PIN_PCLK;
  config.pin_vsync = CAM_PIN_VSYNC;
  config.pin_href = CAM_PIN_HREF;
  config.pin_sscb_sda = CAM_PIN_SIOD;
  config.pin_sscb_scl = CAM_PIN_SIOC;
  config.pin_pwdn = CAM_PIN_PWDN;
  config.pin_reset = CAM_PIN_RESET;
  config.xclk_freq_hz = 20000000;
  config.ledc_timer = LEDC_TIMER_0;
  config.ledc_channel = LEDC_CHANNEL_0;
  config.pixel_format = PIXFORMAT_JPEG;
  // Make sure there is PSRAM available (the AI-Thinker module has PSRAM). Otherwise, don't go any further.
  if(psramFound()){
    config.frame_size = FRAMESIZE_SXGA; // If there's PSRAM then there's enough memory to capture up to 1600x1200
                                        // The following resolutions are available:
                                        // 96x96 (96x96)
                                        // QQVGA (160x120)
                                        // QQVGA2 (128x160)
                                        // QCIF (176x144)
                                        // HQVGA (240x176)
                                        // 240x240 (240x240)
                                        // QVGA (320x240)
                                        // CIF (400x296)
                                        // VGA (640x480)
                                        // SVGA (800x600)
                                        // XGA (1024x768)
                                        // SXGA (1280x1024)
                                        // UXGA (1600x1200) **Full resolution for the OV2640
    config.jpeg_quality = 10; // Valid: 0-63, with 0 being highest quality and largest file size.
                              // Anything lower than 8 creates large file sizes that take a long time
                              // to save to the SD card.
                              // The camera and lens aren't the best quality, so huge files
                              // won't get you a better picture beyond a certain point.
    config.fb_count = 2;  // With the PSRAM, there's enough memory for two framebuffers, which speeds captures.
  }
  else{
    // The AI-Thinker module has PSRAM. I haven't tried any module without PSRAM.
    //Serial.println("NO PSRAM FOUND"); // Uncomment for troubleshooting
    digitalWrite(13, LOW);  // No PSRAM, so shut down and wait for the next trigger.
    while (true){
      // Need this loop to wait in case the PIR is keeping the power on.
    }
  }

  // Start up the camera with the configuration settings made earlier in the "config." statements.
  esp_err_t err = esp_camera_init(&config);
  if (err != ESP_OK){
    // If we're here, there's a problem communicating with the camera.
    // Turn the ESP off and wait for the next trigger.
    digitalWrite(13, LOW);
    //Serial.println("CAM FAIL"); // Uncomment for troubleshooting
    while (true){
      // Need this loop to wait in case the PIR is keeping the power on.
    }
  }

  // Start up the SD card, using 1-bit xfers instead of 4-bit (set the "true" option). Frees up GPIO13.
  if(!SD_MMC.begin("/sdcard", true)){
    // If we're here, there's a problem with the SD card.
    // Turn the ESP off and wait for the next trigger.
    digitalWrite(13, LOW);
    //Serial.println("SD FAIL 1");  // Uncomment for troubleshooting
    while (true){
      // Need this loop to wait in case the PIR is keeping the power on.
    }
  }

  // Query the card to make sure it's OK
  uint8_t SD_CARD = SD_MMC.cardType();
  if(SD_CARD == CARD_NONE){
    // If we're here, there's a problem with the SD card.
    // Turn the ESP off and wait for the next trigger.
    digitalWrite(13, LOW);
    //Serial.println("SD FAIL 2");  // Uncomment for troubleshooting
    while (true){
      // Need this loop to wait in case the PIR is keeping the power on.
    }
  }
}
// We are now done the setup and should be ready to take pictures in the main loop() function.

void loop(){
  uint8_t COUNTUP = 0;  // Create variable to take multiple pictures.
  while (COUNTUP <= 4){  // Take 5 pictures before shutting down.

    // Take picture and read the frame buffer
    camera_fb_t * fb = esp_camera_fb_get();

    if (!fb){
      // If we're here, there's something wrong with the data in the frame buffer.
      // Turn the ESP off and wait for the next trigger.
      digitalWrite(13, LOW);
      while (true){
        // Need this loop to wait in case the PIR is keeping the power on.
      }
    }

    // If we're here, the image was captured. Begin the process to save it to the SD card.
    // First, create the file name and path. Currently set to make files like /pic123.jpg
    String path = "/pic" + String(PIC_COUNT) + ".jpg";
    fs::FS &fs = SD_MMC;

    // Now, create a new file using the path and name set above.
    File file = fs.open(path.c_str(), FILE_WRITE);
    if (!file){
      // If we're here, there's a problem creating a new file on the SD card.
      // Turn off the ESP and wait for the next trigger.
      digitalWrite(13, LOW);
      while (true){
        // Need this loop to wait in case the PIR is keeping the power on.
      }
    }
    else {
      // If we're here, the file was created. Now write the captured image to the file.
      file.write(fb->buf, fb->len);
      PIC_COUNT = PIC_COUNT + 1;  // Increment the picture count number each time there's a successful write.
      if(PIC_COUNT >= 65500){
        PIC_COUNT = 0;  // FAT32 has a limit of 65534 files in a folder
      }
    }
    file.close(); // Done writing the file so close it.

    // Free the memory used by the framebuffer so it's available for another picture
    esp_camera_fb_return(fb);

    COUNTUP = COUNTUP + 1;  // We are done an image capture cycle. Increment the count.
  }

  // If we're here then we've taken the pictures and we are ready to shut down. Write the current file number to
  // the EEPROM, then set D13 low.
  preferences.putUShort("PIC_COUNT", PIC_COUNT);  // Store the picture count number in the EEPROM
  // Normally you'd want to do a preferences.end() to properly close the EEPROM but since the intent is to
  // shut the ESP down, it's not needed, and not having to open and close it every capture cycle speeds things
  // up and saves some wear on the EEPROM.

  //Serial.println("Shutting down."); // Uncomment for troubleshooting
  digitalWrite(13, LOW);
  delay(500); // If we're still awake after this, the PIR is still triggered or has re-triggered.
  // The ESP should be shut down at this point. If the PIR is still triggered or has re-triggered and is keeping
  // the MOSFET on, then set D13 high and allow the program to loop again to take another five pictures.
  digitalWrite(13, HIGH);
  //Serial.println("Looping back.");  // Uncomment for troubleshooting
}


Hopefully there are enough comments to make heads or tails of it. A few notes:

  • This is for the AI-Thinker module with the OV2640 camera only. It’s the only one I’ve tried. It will probably work with other models of ESP32-CAM, but you will need to check the #define for each pin, see how much of what kinds of storage are available, and what settings the camera you’re using requires.
  • The ESP32 Arduino-compatible “EEPROM” library is deprecated; the new way to do things is with the “Preferences” library.
  • Before programming, set the board type to “AI Thinker ESP32-CAM”. Again, this is for the AI-Thinker module only.

If you’re looking for documentation on the ESP32-CAM or the libraries I used to get this working, it’s all in the links in the header comment of the program above. For the libraries, be sure to check out the examples and .h files for options and how various things work.

I haven’t put it outside yet, but as a test of the ESP and circuit, I changed the program so the ESP would take and save as many 1600×1200 JPEGs at compression level 9 as quickly as possible. The circuit was powered by four grocery-store branded AA alkaline batteries that were unused but of unknown age. It ran for 16 hours 21 minutes and took 23475 pictures before the batteries died. Not bad!
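Working the numbers out (plain C++, not part of the project; the ~135mA figure is the active draw measured earlier, so the capacity estimate is only a ballpark):

```cpp
// Average seconds per picture over the whole test run.
double secondsPerPicture(double hours, double minutes, long pictures) {
    double totalSeconds = hours * 3600.0 + minutes * 60.0;
    return totalSeconds / pictures;
}

// Rough battery capacity consumed, in mAh, assuming a constant current draw.
double capacityUsed_mAh(double hours, double minutes, double draw_mA) {
    return (hours + minutes / 60.0) * draw_mA;
}
```

That works out to roughly one picture every 2.5 seconds, and about 2200mAh drawn from the AAs, which is in the right range for alkaline cells at that load.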

I know there is a lot of room for improvement – both in the circuit and in the program. When building this, I was limited to what I had on hand – a P-channel MOSFET might be a better choice, and I’m sure there’s a better way to do things than with the 4N37. For now, though, I’m pretty happy that it works reliably. Now I need to put it in a box and get some pictures of those critters in the yard.

If you made it this far, congratulations! If you build this or have a better way to do this, I’d be curious how it turned out! Feel free to drop a comment here or send me a message using the contact form!

Yes, You Can Print TPU On A CR-10s With No Mods (Part 2)

In my previous post, I showed that it was indeed possible to print TPU on a plain old CR-10s. That was with a couple of sample packs that had no name on them but had the following recommendations:

  • Nozzle: 220-240C
  • Bed: 75-85C

With that sample TPU, I had success printing at around 20mm/s with a nozzle temperature of 240C and a bed temperature of 75C.

I ordered a spool of the cheapest black TPU I could find on Amazon, sold by a company named Priline. The recommendations for this TPU were different than the other stuff I’d used:

  • Nozzle: 190-230C
  • Bed: 50-80C

I thought I’d print the same nut and bolt models that I’d done before to compare. I made the temperature changes in Cura, then sent it off to print, first with a nozzle temperature of 220C and a bed temperature of 75C.

Right off the bat I could see there was a problem. The lines weren’t adhering to each other as they were being printed. I tried bumping up the flow rate, which only made things lumpier. Then I turned up the temperature a few degrees at a time until it looked like things were working better. Unfortunately, the print failed on the second layer when it didn’t stick to the first layer and became a blob on the nozzle.

I ran another print with the temperature set to 230C and with the flow rate still higher. The first layer went down a lot better and I thought things were going to work but the fourth layer didn’t stick to the third and it ended up all over the nozzle again. I thought that might’ve been a fluke, so I leveled the bed again and tried again but had the same results on the sixth layer.

Another print started at 235C and was working pretty well but once there were about a dozen layers put down, it didn’t look right. I cancelled the print, let everything cool down, and then took a look at the parts. With a bit of pulling, I was able to separate some of the layers. Still no good.

Despite the recommendations being only up to 230C, I tried bumping it up one more time to 240C, just like the TPU from the sample packs. That extra five degrees made a world of difference. There was some over-extrusion so I turned the flow rate back down a bit.

Here’s what I ended up with. You can see there’s a bit of stringing between the models just like last time:

TPU nut and bolt
TPU nut and bolt
TPU nut and bolt
They thread together quite nicely.

Here are two comparison pictures of the nut and bolt printed with the sample TPU on the left, and the Priline TPU on the right:

TPU nut and bolt
TPU nut and bolt

Both had some stringing, but it’s pretty obvious that the Priline printed cleaner than the sample packs did. From the look of it, I probably should’ve dried the sample TPU before I used it.

After the print, I compared my notes and found that the settings that had finally worked for the Priline were exactly the same as I’d used when printing with the sample filament:

  • Speed: 19mm/s
  • Nozzle temperature: 240C
  • Bed temperature: 75C
  • Flow rate: 105%
  • Retraction: OFF

These results make me even more confident in saying that yes, a stock CR-10s can print TPU and do a decent job at it.

Yes, You Can Print TPU On A CR-10s With No Mods

A good friend of mine has a dad who’s suffering from dementia. He’s a farmer and spent decades building and fixing things, and he still likes tools and things he can manipulate with his hands. Unfortunately, there are times when he throws things.

I printed him up some big nuts and bolts in PLA but realized that if they were all screwed together they’d make a pretty hefty projectile. So I wasn’t entirely sure what to do. Then, I remembered there were some 30g sample packs of TPU sitting around and collecting dust, so I figured I’d see if I could do something with them.

Having never printed TPU before, it was a few hours’ worth of DuckDuckGoing (I know it doesn’t roll off the tongue as well) before I’d learned that yes, it was possible to print TPU on a CR-10. I’d also learned that no, it wasn’t possible to print TPU on a CR-10. Interestingly, it was also possible to print TPU on a CR-10, but only if you installed anywhere from $0.50 to $250 worth of modifications.

I opened one of the packages and played with the filament. Rubbery, stretchy, a little squishy… definitely different from the PLA and PETG I’m used to. So, I loaded it into the printer, started a print, and sat there for the entire thing so I could dial the settings in while it printed (thank you Octoprint!!!).

Here’s a bendy wrench:

TPU wrench
Before [model by triffid_hunter, see https://www.thingiverse.com/thing:11647]
TPU wrench
TPU wrench
… and it springs back on its own

I then fused what was left of one pack with the other pack I had (and set off the smoke alarm, whoops). After things calmed down, I drew up and printed a nut and bolt, which came out really well:

TPU nut and bolt
TPU nut and bolt

I did this on my CR-10s with a 0.4mm nozzle and no modifications, and the TPU I used had the following recommendations listed on the pack:

  • Colour: Black
  • Material: TPU
  • Dia: 1.75mm
  • Nozzle: 220C-240C
  • Bed: 75C-85C

After a bit of experimentation, here’s what I found worked for me. Again, this is on a stock CR-10s with a 0.4mm nozzle:

  • Level your print bed. Actually, go level it now even if you’re not going to print TPU. It fixes sooo many problems.
  • Clean the outside of the nozzle before you print.
  • Purge whatever was in there before. Do not mix TPU with another material, even if it’s “just a bit”. Trust me.
  • Bed surface: Glass with two layers of Elmer’s all-purpose glue stick, applied after the bed is at temperature.
  • Bed temperature: 75C (all layers).
  • Nozzle temperature: 240C (all layers).
  • Fan: 0% for first layer, 100% for rest of print.
  • Flow rate: 105% (all layers).
  • Retraction OFF.
  • Print speed: 18-21mm/s.
  • Layer height: 0.2mm for first layer, 0.25mm for rest of print.
  • Print with a skirt, at least 7-10 lines wide.
  • After the hotend and bed are heated up and just before you’re ready to print, raise the hotend 100-150mm and wait until the nozzle stops drooling TPU. Clean up the debris, carefully wipe the excess TPU from the nozzle, then start the print. It will take a while for the hotend to fill up again – print with a skirt to give it time to fill back up (see above point).
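If you drive the printer from Octoprint, most of the temperature and flow settings above can also be forced in your start G-code so the slicer can't quietly override them. Here's a rough sketch assuming Marlin-style firmware (the CR-10s ships with Marlin); check each command against your own firmware before using it:

```gcode
M140 S75   ; start heating bed to 75C
M104 S240  ; start heating nozzle to 240C
M190 S75   ; wait for bed to reach temperature
M109 S240  ; wait for nozzle to reach temperature
M221 S105  ; set flow rate to 105%
M106 S0    ; fan off for the first layer (slicer turns it on afterwards)
M207 S0    ; zero out firmware retraction, if your setup uses it
```

Retraction is usually a slicer setting rather than a firmware one, so turning it off in your slicer profile is still the main thing.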

With these settings on this printer and with that particular flavour of TPU, I was able to get good strong prints that looked pretty good. There is some stringing between parts, but it cuts away easily with a small pair of scissors or snips.

I lobbed the wrench at my wife, who reported that it didn’t hurt. I screwed the nut onto the bolt and threw it at the front door and it bounced nicely without leaving a dent. I think these might work for my friend’s dad.

I’ve ordered some more TPU, and of course I couldn’t find stuff with the same temperature recommendations as the sample packs. I will give it a shot and do up another post with what I find out. At this point, though, I’m pretty comfortable with saying that yes, you can print TPU on an unmodified CR-10s and it can turn out well. Just go slow. And level the print bed!

Portable Power For The GQRX Pi 4

I’ve been enjoying finding and listening to all kinds of stuff with the SDR, and since I got it working with the Pi 4 I’ve wanted to use it without needing an extension cord.

I had a lot of trouble finding a battery supply that would do the trick. I have a few of those USB power packs at home and tried them but the Pi kept reporting low voltage.

I then turned to putting a regulator on a 12V SLA battery. Unfortunately, even with capacitors and shielding, the switching regulator I tried put out a lot of noise that the SDR picked up. I knew a linear regulator would be quieter but as I suspected, it only took about a minute before the biggest heatsink I had was too hot to touch.

So… I went back to the USB battery packs.

There are a few problems with those packs. They can be noisy (there's switching circuitry in them too), they can sag under load, and most of the USB cables out there are cheap crap made with very thin wire.

I put a load (actually, a Pi 2) on each pack and watched what happened to the noise and voltage on the oscilloscope. The best of the lot turned out to be an older Anker Astro E4 13000mAh unit that held a pretty constant 4.92V and wasn’t too noisy. So I started there.

I don’t know how many USB cables (or pieces of USB cables) I have sitting around. Some came with phones, tablets, or other devices… some were bought separately… some looked good… some looked cheap. I started going through the cables to see what kind of voltage drop there was when there was a Pi 4 on the end (with a micro-USB to USB-C adapter).

None of them ran the Pi under load without triggering the low voltage warning, and some triggered it even with the Pi idle. Two of them were so bad that the Pi couldn't finish booting. The voltage drop across the cables was as much as 0.62V!

With those results, I decided to make my own cable. Unfortunately, when I looked in my USB parts drawer, I only had micro-USB plugs and USB type A jacks.

Out came the snips and I started chopping up the cables, starting with the ones that looked the best. Turns out that how a cable looks doesn’t mean much when it comes to how heavy the wire inside it is.

Eventually I found one that had considerably heavier wire than what I’d seen up to that point, so I decided to use it instead of chopping up the rest of the cables. I cut it to 60cm, soldered on the plug end, and gave it a try.

It was a lot better, but the Pi was still reporting that there’d been a low voltage condition at some point. I cut the cable to 50cm.

Then 40cm.

Then 30cm.

Then 20cm.

20cm did the trick, and I couldn't trigger the low voltage warning anymore, even with the SDR plugged in and running and the CPU pinned to 100% (I usually use cat /dev/urandom | gzip > /dev/null for that).

Here it is, the beautiful and reliable USB cable of portable GQRXing:

Yeah, that’s hot glue. Works well and it’s strong but doesn’t look all that great…

Since I only had the micro-USB plugs on hand, I still have to use the adapter, which could also be wasting a bit of power. I need to order some other stuff sometime soon so I may grab a couple of parts to make another good cable or two.

To test the cable and battery pack, I hooked it up to my SDR Pi, fired up GQRX, told it to record the audio, and checked in on it every half hour. It ran for six hours before the battery LEDs showed it was at less than 25% capacity. I don’t like running those packs flat so I stopped the test there.

Before I shut down the Pi, I hopped onto it (using VNC on my phone, heh) and checked whether any of the warning conditions had been triggered (voltage, temperature, etc). Here’s what I saw:

0x0, or no problems at all… after running for six hours straight. Not too shabby!

GQRX On The Raspberry Pi 4

I’ve been playing with SDRs on the Raspberry Pi 4 for almost a month now and I am happy to say that the new Pi (at least the 2GB model) has enough oomph to run GQRX with a Great Scott Gadgets HackRF One (at 4MSPS max) and a NooElec NESDR SMArt. I was SO happy to say it, I spent several hours writing up a howto, only for everything to fall apart at the second-last step.

So… here’s a short version. If I get enough questions about it, I’ll look at salvaging what I still have and writing it up in detail again.

To get GQRX working on the Raspberry Pi 4B (the 2GB is the only version I have), you will need the following:

  • A Raspberry Pi 4B (I only tested with the 2GB version but I expect it will work fine with both the 1GB and 4GB versions)
  • An 8GB Class 10 (or larger/faster) microSD card
  • Internet connectivity through a wired Ethernet port
  • An Ethernet cable
  • An SDR (tested with the HackRF One and the NESDR SMArt)
  • An antenna that fits your SDR
  • A USB cable that fits your SDR (if necessary)
  • Headphones or a set of speakers with a 3.5mm plug
  • A heatsink and/or fan for the Pi (it requires so little cooling that you can just point a desk fan in its general direction and it should be okay)

So, here’s how to do it:

  • Image the microSD card with the official Raspbian version with desktop but without the extra software. Get it from here.
  • Connect it to an Ethernet port and power it up, then find it on your network and do the usual password/configuration/update steps.
  • Set the resolution to 800×600.
  • Force the audio output to the headphone jack.
  • Enable VNC.
  • Install GQRX from the repository (sudo apt install gqrx-sdr).
  • Connect to the Pi via VNC and turn the system volume down.
  • Connect your SDR (with antenna)
  • Run GQRX. If you’re having trouble, or for details on settings, etc, check here.
  • In the “FFT” tab, lower the FFT size and rate so the Pi can keep up (more on that below).
  • Turn up the system volume and/or software or hardware amplifiers in GQRX and your SDR until you can comfortably hear a signal.

And that’s about it.

I found, though, that I wanted to make the Pi a little more portable, and when using the Ethernet jack there seems to be a bit more noise in the GQRX display. Here’s what I did:

  • Install RaspAP; information and installation instructions are here.
  • Set up the Pi to be a wireless access point on a separate network from your Ethernet port.
  • Once the AP is running, disconnect the Pi from the Ethernet port and, using your computer, phone, or tablet, look for the SSID of the AP you set up and connect to it.
  • Once connected, run a VNC client (I use VNC Viewer by RealVNC but others should work as well).
  • I lowered the output power of the Pi’s wifi transmitter because I wanted to save as much power as possible, lower the amount of electrical noise the Pi generated, and keep the AP’s range as small as possible so as few people as possible can see it. To do that, add the following line to /etc/rc.local just before the “exit 0” line:
    iwconfig wlan0 txpower 5
    Change the value for txpower to whatever suits your needs (lower = lower transmitter power).
  • If you find you can’t connect to the AP or you want to change or update the Pi, you can always plug it back into the Ethernet port and connect to it that way.

Here’s what you’ll end up with (except with sound – my PC’s microphone isn’t working):

You can also ignore that last block of instructions and just hook the Pi up to an internal or HDMI-attached display. Some people find that using displays with resolutions of 800×600 or less cramps the GQRX display and makes navigation difficult. If you’re running VNC, you can set the resolution to whatever you want – just keep in mind that the Pi will start to sweat and lag if the display is too large.

If the Pi seems to be having trouble keeping up, try the following:

  • Make sure your Pi is adequately cooled and powered. Run vcgencmd get_throttled and if it shows any number other than 0x0, you have a temperature/power problem.
  • As mentioned above, lower the display resolution.
  • Remove the Raspbian desktop picture and replace it with a single colour.
  • In GQRX, one of the biggest CPU hogs is the main display. Go into the “FFT Settings” tab and lower the FFT Size and Rate. I use 8192 and 10fps on my Pi and that seems to work reasonably well.
  • Once you find the frequency you want to listen to, minimize GQRX. That should speed things up noticeably.
  • Lower GQRX’s sample rate by going into the “Configure I/O Devices” window (the little green PCI card icon) and lowering the number. The HackRF in particular needs to be set lower than its 10MSPS default. 3-4MSPS for the HackRF and 2-3MSPS for the NESDR SMArt seems to do the trick.
  • Don’t run any extra services or applications on your Pi.
  • If you’re recording audio and it’s lagging or chirping, don’t record to the SD card – mount external USB storage (stick, drive, whatever) and use that instead.

The Pi – even the 4B – has its limitations. If you’re careful, though, you can pretty easily turn it into a small, portable, and powerful little SDR machine.

World’s Crappiest Oscilloscope… v1

When Ms Geek gave me an Arduino Leonardo to play with, one of the first things I did was go through the examples. After almost two decades of experience with PICs, I was amazed at how easy it was to get things like serial communication and ADC working. Don’t get me wrong – I’m still a PIC guy… but I think I’m an Arduino guy now, too.

The ReadAnalogVoltage example caught my attention because it was so simple. Here’s the setup. It’s just a potentiometer with one outer terminal connected to +5V, the other to GND, and the wiper connected to A0:

Not much to it, eh? That bent yellow wire in the middle just holds the Leonardo in place.

I played around with it for a while and watched the output on the Serial Plotter, but then I had a thought. You need to run the Arduino software to use the Serial Plotter, and besides, the Serial Plotter looks too nice. Half-remembered days of coaxing dusty old VT100 and TN3270 terminals back to life and running a BBS made me think – I could do the same thing, but not as good!

I dug through some of my old PIC programs and found a serial terminal that I wrote back in 2002. Between that and the ANSI sequences at http://ascii-table.com/ansi-escape-sequences.php, I stapled together a really bad looking display that I like to call the World’s Crappiest Oscilloscope, v1. Here’s the program:

// World's crappiest oscilloscope v1
// Borrows heavily from ReadAnalogVoltage example in the Arduino Examples menu
// Uses Arduino Leonardo, reads voltage on analog pin A0, then uses good old ANSI
// codes to draw a really bad oscilloscope in a serial terminal.
// A little amusing but very useless.
// Info about ANSI codes is at http://ascii-table.com/ansi-escape-sequences.php

void setup() {

  pinMode(A0, INPUT); // Set pin A0 to input.

  Serial.begin(9600); // initialize serial communication at 9600 bits per second.

  delay(3000);  // Should be enough time to start up a serial terminal.

  // Warm up the tubes...
  Serial.write(27);     // ESC is ASCII 27...
  Serial.print("[2J");  // ...and ESC [2J clears the terminal screen.
  Serial.println("Warming up the tubes, please wait...");
  delay(2000);  // This just here for dramatic effect.
  delay(2000);  // This also just here for dramatic effect.
}

void loop() {

  // Now set up the fancy oscilloscope screen. Ah, the good old ANSI days...

  // Clear the screen and put the cursor back at home.
  Serial.write(27);     // Clear terminal screen with (esc)[2J, (esc) is ASCII 27
  Serial.print("[2J");  // Serial.write sends binary data to the serial port
  Serial.write(27);     // ESC again.
  Serial.print("[H");   // cursor to home.

  // The following lines draw the scale marker and border at the top of the
  // terminal screen, plus the title.
  Serial.print("0.00V|________________________________________________________________________________\r\n"); // 80x _
  Serial.print("                         WORLD'S CRAPPIEST OSCILLOSCOPE v1\r\n");

  byte ColumnCount = 7; // Okay, there are 81 columns to put data into, starting at column 7 and ending at 87.

  while (ColumnCount <= 87){
    int sensorValue = analogRead(A0); // read the input on analog pin 0. Need to use an int because it's a 10-bit number.
    float voltage = sensorValue * (5.0 / 1023.0); // Convert the reading (which goes from 0 - 1023) to a voltage (0 - 5V).

    int OscOut = (voltage * 4);  // so far so good but need to make it go the other way
    OscOut = 21 - OscOut;        // row 21 is 0V, row 1 is 5V.

    // now, staple everything together into one string to control the cursor
    // Control cursor position: ESC then [line;columnH
    String OscStr;
    OscStr = '[';
    OscStr = OscStr + OscOut;
    OscStr = OscStr + ';';
    OscStr = OscStr + ColumnCount;
    OscStr = OscStr + 'H';

    // Just for kicks, let's try to change the trace colour to green.
    Serial.write(27);
    Serial.print("[32m"); // ESC [32m sets the foreground colour to green.

    // Move the cursor into position and plot the point.
    Serial.write(27);
    Serial.print(OscStr);
    Serial.print('*');

    ColumnCount = ColumnCount + 1;
    delay(50); // wait a bit before going back so the screen doesn't fly by too quickly.
  }
}

I really need to figure out how to widen the blocks in this theme… it kind of mangles the formatting. If you copy and paste it directly, it still works though. This is what it does (don’t start the video unless you have a strong heart – it’s THAT amazing):

I wonder if there actually were any of those old terminals set up with something like this back in the day…