The Smell of Molten Projects in the Morning

Ed Nisley's Blog: Shop notes, electronics, firmware, machinery, 3D printing, laser cuttery, and curiosities. Contents: 100% human thinking, 0% AI slop.

Category: Photography & Images

Taking & making images.

  • Arducam Motorized Focus Camera Control


    Despite the company name, the Arducam 5 MP Motorized Focus camera plugs into a Raspberry Pi’s camera connector and lives on a PCB the same size as ordinary RPi cameras:

    Arducam Motorized Focus RPi Camera – test overview

    That’s a focus test setup to get some idea of how the control values match up against actual distances.

    It powers up focused at infinity (or maybe a bit beyond):

    Arducam Motorized Focus RPi Camera – default focus

    In practice, it’s usable, if a bit soft, at any distance beyond a couple of meters.

    The closest focus is around 40 mm, depending on where you set the ruler’s zero point:

    Arducam Motorized Focus RPi Camera – near focus

    That’s the back side of the RPi V1 camera PCB most recently seen atop the mystery microscope objective illuminator.

    Pondering the sample code shows the camera focus setting involves writing two bytes to an I²C address through the video controller’s I²C bus. Enable that bus with a line in /boot/config.txt:

    dtparam=i2c_vc=on

    If you’re planning to capture 1280×720 or larger still images, reserve enough memory in the GPU:

    gpu_mem=512

    I don’t know how to determine the correct value.

    And, if user pi isn’t in group i2c, make it so, then reboot.

    The camera must be running before you can focus it, so run raspivid and watch the picture. I think the same holds for focusing a (higher-res) still picture: perhaps start a video preroll (not that kind) in a different thread, fire off a (predetermined?) focus value, allow time for the lens to settle, then acquire the still with the video still running.

    The focus value is a number between 0 and 1023, split across two bytes written in big-endian order to address 0x0c on bus 0:

    i2cset -y 0 0x0c 0x3f 0xff

    You can, of course, use decimal numbers:

    i2cset -y 0 0x0c 63 255

    I think hex values are easier to tweak by hand.
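    As a sanity check on the byte split, the arithmetic is easy to verify in plain shell (my quick sketch, not part of the Arducam sample code):

    ```shell
    # take the focus value from the hex example above apart into its two bytes
    FOCUS=0x3FFF                     # 16-bit value as written on the wire
    HI=$(( (FOCUS >> 8) & 0xFF ))    # high byte goes out first (big-endian)
    LO=$(( FOCUS & 0xFF ))
    printf 'i2cset -y 0 0x0c 0x%02x 0x%02x\n' "$HI" "$LO"
    ```

    The printed command matches the hex example above.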

    Some tinkering gives this rough correlation:

    Focus value (hex)    Focus distance (mm)
    3FFF                 45 (-ish)
    3000                 55
    2000                 95
    1000                 530
    0800                 850
    Arducam Motorized Focus Camera – numeric value vs mm

    Beyond a meter, the somewhat gritty camera resolution gets in the way of precise focusing, particularly in low indoor lighting.

    A successful write produces a return code of 0. Sometimes the write will inexplicably fail with an Error: Write failed message, a return code of 1, and no focus change, so it’s Good Practice to retry until it works.
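    A shell function can do the retrying for you; here’s a minimal sketch (set_focus is my name, nothing official):

    ```shell
    # hypothetical retry wrapper: repeat the write until i2cset reports success
    set_focus () {
        hi=$1
        lo=$2
        until i2cset -y 0 0x0c "$hi" "$lo"; do
            sleep 0.1    # brief pause before retrying a failed write
        done
    }

    # example: set_focus 0x20 0x00  (roughly 95 mm, per the table above)
    ```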

    This obviously calls for a knob and a persistent value!

  • Raspberry Pi HQ Camera + 16 mm 10 MP Lens: Depth of Field


    Part of the motivation for getting a Raspberry Pi HQ camera sensor was being able to use lenses with adjustable focus and aperture, like the Official 10 MP “telephoto” lens:

    RPi HQ Camera – aperture demo setup

    Yes, it can focus absurdly close to the lens, particularly when you mess around with the back focus adjustment.

    With the aperture fully open at f/1.4:

    RPi HQ Camera – aperture demo – f 1.4

    Stopped down to f/16:

    RPi HQ Camera – aperture demo – f 16

    The field of view is about 60 mm (left-to-right) at 150 mm. Obviously, arranging the camera with its optical axis more-or-less perpendicular to the page will improve everything about the image.

    For normal subjects at normal ranges with normal lighting, the depth of field works pretty much the way you’d expect:

    At f/1.4, focused on the potted plants a dozen feet away:

    Raspberry Pi HQ Camera – outdoor near focus

    Also at f/1.4, focused on the background at infinity:

    Raspberry Pi HQ Camera – outdoor far focus

    In comparison, the laptop camera renders everything equally badly (at a lower resolution, so it’s not a fair comparison):

    Laptop camera – outdoors

    Stipulated: these are screen captures of “movies” from raspivid over the network to VLC. The HQ sensor is capable of much better images.

    None of this is surprising, but it’s a relief from the usual phone sensor camera with fixed focus (at “infinity” if you’re lucky) and a wide-open aperture.

  • Monthly Science: Burnett Signal Timing


    The NYS DOT has been improving the pedestrian crossings at the Burnett – Rt 55 intersection. I expect this will be a bullet item in their Complete Streets compliance document, with favorable job reviews for all parties. The situation for bicyclists using the intersection, which provides the only access from Poughkeepsie to the Dutchess Rail Trail, hasn’t changed in the slightest. No signal timing adjustments, no bike-capable sensor loops, no lane markings, no shoulders, no nothing.

    Here’s what NYS DOT’s Complete Streets program looks like from our perspective, with the four-digit frame numbers ticking along at 60 frame/sec.

    We’re waiting on Overocker Rd for Burnett traffic to clear enough to cross three lanes from a cold start:

    Burnett Signal – 2020-09-25 – front 0006

    That building over there across Burnett is the NYS DOT Region 8 Headquarters, so we’re not in the hinterlands where nobody ever goes.

    We’re rolling:

    Burnett Signal – 2020-09-25 – front 0258

    The Burnett signals just turned green, although the cars haven’t started moving yet, and we’re accelerating out of Overocker:

    Burnett Signal – 2020-09-25 – front 0463

    About 1.5 seconds later, the vehicles have started moving and we’re lining up for the left side of the right-hand lane:

    Burnett Signal – 2020-09-25 – front 0752

    There’s no traffic behind us, so we can ride a little more to the right than we usually do, in the hopes of triggering the signal’s unmarked sensor loop:

    Burnett Signal – 2020-09-25 – front 1178

    We didn’t expect anything different:

    Burnett Signal – 2020-09-25 – front 1333

    We’re rolling at about 12 mph and it’s unreasonable to expect us to jam to a stop whenever the signal turns yellow. Oh, did you notice the truck parked in the sidewalk over on the left?

    As usual, 4.3 seconds later, the Burnett signals turn red, so we’re now riding in the “intersection clearing” delay:

    Burnett Signal – 2020-09-25 – front 1593

    Two seconds later, the Rt 55 signals turn green:

    Burnett Signal – 2020-09-25 – front 1711

    Did you notice all three eastbound lanes of Rt 55 (on our right) were occupied? That means a driver can’t come zipping through without stopping at the green light in their direction.

    One second later, we’re still proceeding through the intersection, clearing the lethally smooth manhole cover by a few inches, and approaching the far side:

    Burnett Signal – 2020-09-25 – front 1771

    Here’s what the intersection looks like behind me:

    Burnett Signal – 2020-09-25 – rear 1

    Another second goes by and we’re pretty much into the far right lane, with the westbound traffic beginning to move:

    Burnett Signal – 2020-09-25 – front 1831

    The pedestrian crossing ladder has fresh new paint. They milled off the old paint while reconstructing the crossing, so the scarred asphalt will deteriorate into potholes after a few freeze-thaw cycles. Not their problem, it seems.

    Although it’s been three seconds since Rt 55 got a green signal, the eastbound drivers remain stunned by our presence:

    Burnett Signal – 2020-09-25 – rear 2

    After another second, we’re almost where we need to be:

    Burnett Signal – 2020-09-25 – front 1891

    There’s a new concrete sidewalk on the right, with a wheelchair-accessible signal button I can now hit with my elbow when we’re headed in the other direction. It’s worth noting there is no way to reach Overocker by bicycle, other than riding the sidewalk; there’s only one “complete” direction for vehicular cyclists.

    One second later puts us as far to the right as we can get, given all the gravel / debris / deteriorated asphalt along the fog line near the curb:

    Burnett Signal – 2020-09-25 – front 1957

    Which is good, because four seconds after the green signal for Rt 55, the pack has overtaken us:

    Burnett Signal – 2020-09-25 – rear 3

    If you were the driver of the grayish car in the middle lane, directly behind the black one giving us plenty of room, you might be surprised at the abrupt lane change in front of you. Maybe not, because you had a front-row seat while we went through the intersection.

    Elapsed time from the green signal on Burnett: 25 seconds. My point is that another few seconds of all-red intersection clearing time wouldn’t materially affect anybody’s day and would go a long way toward improving bicycle safety.
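    For the record, the 60 frame/sec counters turn the frame numbers in the captions into elapsed time directly; taking the yellow at frame 1333 and the red at frame 1593 from the captions above:

    ```shell
    # interval between two frame numbers at 60 frame/sec
    awk 'BEGIN { printf "%.1f\n", (1593 - 1333) / 60 }'    # prints 4.3
    ```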

    Unlike the pedestrian crossing upgrade, NYS DOT could fix this with zero capital expenditure: one engineer with keys to the control box, a screwdriver or keyboard (depending on the age of the controls), and the will to do the right thing could have it done before lunch tomorrow.

    But it’s just a typical bike ride on NYS DOT’s Complete Streets, where their planners & designers claim to “promote pedestrian and bicycle travel for all persons.” Maybe that’s true somewhere in NYS DOT’s fantasies, but you’ll find far more evidence from our rides, with plenty of numbers, showing that’s not the case around here.

  • Mystery Microscope Objective Illuminator


    Rummaging through the Big Box o’ Optics in search of something else produced this doodad:

    Microscope objective illuminator – overview

    It carries no brand name or identifier, suggesting it was shop-made for a very specific and completely unknown purpose. The 5× objective also came from the BBo’O, but wasn’t related in any way other than fitting the threads, so the original purpose probably didn’t include it.

    The little bulb fit into a cute and obviously heat-stressed socket:

    Microscope objective illuminator – bulb detail

    The filament was, of course, broken, so I dismantled the socket and conjured a quick-n-dirty white LED that appears blue under the warm-white bench lighting:

    Microscope objective illuminator – white LED

    The socket fits into the housing on the left, which screws onto a fitting I would have sworn was glued / frozen in place. Eventually, I found a slotted grub screw hidden under a glob of dirt:

    Microscope objective illuminator – lock screw

    Releasing the screw let the fitting slide right out:

    Microscope objective illuminator – lamp reflector

    The glass reflector sits at 45° to direct the light coaxially down into the objective (or whatever optics it was originally intended for), with the other end of the widget having a clear view straight through. I cleaned the usual collection of fuzz & dirt off the glass, then centered and aligned the reflection with the objective.

    Unfortunately, the objective lens lacks antireflection coatings:

    Microscope objective illuminator – stray light

    The LED tube is off to the right at 2 o’clock, with the bar across the reflector coming from stray light bouncing back from the far wall of the interior. The brilliant dot in the middle comes from light reflected off the various surfaces inside the objective.

    An unimpeachable source tells me microscope objectives are designed to form a real image 180 mm up inside the ‘scope tube with the lens at the design height above the object. I have the luxury of being able to ignore all that, so I perched a lensless Raspberry Pi V1 camera on a short brass tube and affixed it to a three-axis positioner:

    Microscope objective illuminator – RPi camera lashup

    A closer look at the lashup reveals the utter crudity:

    Microscope objective illuminator – RPi camera lashup – detail

    It’s better than I expected:

    Microscope objective illuminator – RPi V1 camera image – unprocessed

    What you’re seeing is the real image formed by the objective lens directly on the RPi V1 camera’s sensor: in effect, the objective replaces the itsy-bitsy camera lens. It’s a screen capture from VLC using V4L2 loopback trickery.

    Those are 0.1 inch squares printed on the paper, so the view is about 150×110 mil. Positioning the camera farther from the objective would reduce both the view (increasing the magnification) and the amount of light, so this may be about as good as it gets.

    The image started out with low contrast from all the stray light, but can be coerced into usability:

    Microscope objective illuminator – RPi V1 camera image – auto-level adjust

    The weird violet-to-greenish color shading apparently comes from the lens shading correction matrix baked into the RPi image capture pipeline and can, with some difficulty, be fixed if you have a mind to do so.

    All this is likely not worth the effort given the results of just perching a Pixel 3a atop the stereo zoom microscope:

    Pixel 3a on stereo zoom microscope

    But I just had to try it out.

  • Raspberry Pi HQ Camera Mount


    As far as I can tell, Raspberry Pi cases are a solved problem, so 3D printing an intricate widget to stick a Pi on the back of an HQ camera seems unnecessary unless you really, really like solid modeling, which, admittedly, can be a thing. All you really need is a simple adapter between the camera PCB and the case of your choice:

    HQ Camera Backplate – OpenSCAD model

    A quartet of 6 mm M2.5 nylon spacers mount the adapter to the camera PCB:

    RPi HQ Camera – nylon standoffs

    The plate has recesses to put the screw heads below the surface. I used nylon screws, but it doesn’t really matter.

    The case has all the right openings, slots in the bottom for a pair of screws, and costs six bucks. A pair of M3 brass inserts epoxied into the plate capture the screws:

    RPi HQ Camera – case adapter plate – screws

    Thick washers punched from an old credit card go under the screws to compensate for the case’s silicone bump feet. I suppose Doing the Right Thing would involve 3D printed spacers matching the cross-shaped case cutouts.

    Not everyone agrees with my choice of retina-burn orange PETG:

    RPi HQ Camera – 16 mm lens – case adapter plate

    Yes, that’s a C-mount TV lens lurking in the background, about which more later.

    The OpenSCAD source code as a GitHub Gist:

    // Raspberry Pi HQ Camera Backplate
    // Ed Nisley KE4ZNU 2020-09

    //- Extrusion parameters

    /* [Hidden] */

    ThreadThick = 0.25;
    ThreadWidth = 0.40;

    HoleWindage = 0.2;

    function IntegerMultiple(Size,Unit) = Unit * ceil(Size / Unit);
    function IntegerLessMultiple(Size,Unit) = Unit * floor(Size / Unit);

    Protrusion = 0.1; // make holes end cleanly

    inch = 25.4;

    ID = 0;
    OD = 1;
    LENGTH = 2;

    //- Basic dimensions

    CamPCB = [39.0,39.0,1.5]; // Overall PCB size, plus a bit
    CornerRound = 3.0; // … has rounded corners
    CamScrewOC = [30.0,30.0,0]; // … mounting screw layout
    CamScrew = [2.5,5.0,2.2]; // … LENGTH = head thickness

    Standoff = [2.5,5.5,6.0]; // nylon standoffs

    Insert = [3.0,4.0,4.0];

    WallThick = IntegerMultiple(2.0,ThreadWidth);
    PlateThick = Insert[LENGTH];

    CamBox = [CamPCB.x + 2*WallThick,
              CamPCB.y + 2*WallThick,
              Standoff.z + PlateThick + CamPCB.z + 1.0];

    PiPlate = [90.0,60.0,PlateThick];
    PiPlateOffset = [0.0,(PiPlate.y - CamBox.y)/2,0];

    PiSlotOC = [0.0,40.0];
    PiSlotOffset = [3.5,3.5];

    NumSides = 2*3*4;

    TextDepth = 2*ThreadThick;

    //----------------------
    // Useful routines

    module PolyCyl(Dia,Height,ForceSides=0) { // based on nophead's polyholes
      Sides = (ForceSides != 0) ? ForceSides : (ceil(Dia) + 2);
      FixDia = Dia / cos(180/Sides);
      cylinder(r=(FixDia + HoleWindage)/2,h=Height,$fn=Sides);
    }

    //----------------------
    // Build it

    difference() {
      union() {
        hull() // camera enclosure
          for (i=[-1,1], j=[-1,1])
            translate([i*(CamBox.x/2 - CornerRound),j*(CamBox.y/2 - CornerRound),0])
              cylinder(r=CornerRound,h=CamBox.z,$fn=NumSides);
        translate(PiPlateOffset)
          hull()
            for (i=[-1,1], j=[-1,1]) // Pi case plate
              translate([i*(PiPlate.x/2 - CornerRound),j*(PiPlate.y/2 - CornerRound),0])
                cylinder(r=CornerRound,h=PiPlate.z,$fn=NumSides);
      }
      hull() // camera PCB space
        for (i=[-1,1], j=[-1,1])
          translate([i*(CamPCB.x/2 - CornerRound),j*(CamPCB.y/2 - CornerRound),PlateThick])
            cylinder(r=CornerRound,h=CamBox.z,$fn=NumSides);
      translate([0,-CamBox.y/2,PlateThick + CamBox.z/2])
        cube([CamScrewOC.x - Standoff[OD],CamBox.y,CamBox.z],center=true);
      for (i=[-1,1], j=[-1,1]) // camera screws with head recesses
        translate([i*CamScrewOC.x/2,j*CamScrewOC.y/2,-Protrusion]) {
          PolyCyl(CamScrew[ID],2*CamBox.z,6);
          PolyCyl(CamScrew[OD],CamScrew[LENGTH] + Protrusion,6);
        }
      for (j=[-1,1]) // Pi case screw inserts
        translate([0,j*PiSlotOC.y/2 + PiSlotOffset.y,-Protrusion] + PiPlateOffset)
          PolyCyl(Insert[OD],2*PiPlate.z,6);
      translate([-PiPlate.x/2 + (PiPlate.x - CamBox.x)/4,0,PlateThick - TextDepth/2] + PiPlateOffset)
        cube([15.0,30.0,TextDepth + Protrusion],center=true);
    }

    translate([-PiPlate.x/2 + (PiPlate.x - CamBox.x)/4 + 3,0,PlateThick - TextDepth - Protrusion] + PiPlateOffset)
      linear_extrude(height=TextDepth + Protrusion,convexity=2)
        rotate(-90)
          text("Ed Nisley",font="Arial:style=Bold",halign="center",valign="center",size=4,spacing=1.05);

    translate([-PiPlate.x/2 + (PiPlate.x - CamBox.x)/4 - 3,0,PlateThick - TextDepth - Protrusion] + PiPlateOffset)
      linear_extrude(height=TextDepth + Protrusion,convexity=2)
        rotate(-90)
          text("KE4ZNU",font="Arial:style=Bold",halign="center",valign="center",size=4,spacing=1.05);
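    The PolyCyl() routine is nophead’s polyhole trick: force a low side count, then enlarge the diameter by 1/cos(180/Sides) so the polygon’s flats lie on the nominal circle. Checking the arithmetic for a 3 mm hole forced to six sides (my numbers, not from the model):

    ```shell
    # polyhole compensation: FixDia = Dia / cos(180/Sides)
    # awk's cos() wants radians, where OpenSCAD works in degrees
    awk 'BEGIN {
        pi = atan2(0, -1)
        Dia = 3.0; Sides = 6
        printf "%.3f\n", Dia / cos(pi / Sides)    # prints 3.464
    }'
    ```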
  • Raspberry Pi Streaming Video Loopback


    As part of spiffing my video presence for SquidWrench Zoom meetings, I put a knockoff RPi V1 camera into an Az-El mount, stuck it to a Raspberry Pi, installed the latest OS Formerly Known as Raspbian, did a little setup, and perched it on the I-beam over the workbench:

    Raspberry Pi – workbench camera setup

    The toothbrush head has a convenient pair of neodymium magnets affixing the RPi’s power cable to the beam, thereby preventing the whole lashup from falling off. The Pi, being an old Model B V 1.1, lacks onboard WiFi and requires a USB WiFi dongle. The white button at the lower right of the heatsink properly shuts the OS down and starts it up again.

    Zoom can show video only from video devices / cameras attached to the laptop, so the trick is to make video from the RPi look like it’s coming from a local laptop device.

    Start by exporting video from the Raspberry Pi:

    raspivid --nopreview -t 0 -rot 180 -awb sun --sharpness -50 --flicker 60hz -w 1920 -h 1080 -ae 48 -a 1032 -a 'RPi Cam1 %Y-%m-%d %X'  -b 1000000 -l -o tcp://0.0.0.0:5000

    The -rot 180 -awb sun --sharpness -50 --flicker 60hz parameters make the picture look better. There is no way to predict which side of the video image will end up on the same side as the cable, if that’s any help figuring out which end is up, and the 6500 K LED tubes apparently fill the shop with “sun”.

    The -l parameter causes raspivid to wait until it gets an incoming tcp connection on port 5000 from any other IP address, whereupon it begins capturing video and sending it out.

    Then, on the laptop, create a V4L loopback device:

    sudo modprobe v4l2loopback devices=1 video_nr=10 exclusive_caps=1 card_label="Workbench"

    Zoom will then include a video source identified as “Workbench” in its list of cameras.

    Now fetch video from the RPi and ram it into the loopback device:

    ffmpeg -f h264 -i tcp://192.168.1.50:5000 -f v4l2 -pix_fmt yuv420p /dev/video10

    VLC knows it as /dev/video10:

    RPi – V4L loopback – screen grab

    That’s the edge of the workbench over there on the left, looking distinctly like a cliff.

    The RPi will happily stream video all day long to ffmpeg while you start / stop the display program pulling the bits from the video device. However, killing ffmpeg also kills raspivid, requiring a manual restart of both programs. This isn’t a dealbreaker for my simple needs, but it makes unattended streaming from, say, a yard camera somewhat tricky.
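    A keep-alive wrapper on the laptop side takes some of the sting out of it; here’s a sketch, under the assumption that raspivid gets restarted on the Pi by other means (ssh, a systemd unit, whatever), and keep_alive is my invention:

    ```shell
    # hypothetical keep-alive wrapper: rerun the pull side until it exits cleanly
    keep_alive () {
        until "$@"; do
            sleep 2    # give raspivid time to come back up on the Pi
        done
    }

    # example:
    # keep_alive ffmpeg -f h264 -i tcp://192.168.1.50:5000 -f v4l2 -pix_fmt yuv420p /dev/video10
    ```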

    There appear to be an infinite number of variations on this theme, not all of which work, and some of which rest upon an unsteady ziggurat of sketchy / unmaintained software.

    Addendum: If you have a couple of RPi cameras, it’s handy to run the matching ssh and ffmpeg sessions in screen / tmux / whatever terminal multiplexer you prefer. I find it easier to flip through those sessions with Ctrl-A N, rather than manage half a dozen tabs in a single terminal window. Your mileage may differ.

  • Monthly Image: Mantis Mating


    The Praying Mantis in the Butterfly Bush is definitely female:

    Praying Mantis Mating – front

    I’d noticed her distended abdomen a day or two earlier, when it was highlighted in the sun and pulsing slowly. The indentations under the male’s legs show the surface is definitely softer than the hard chitin of most insect armor:

    Praying Mantis Mating – rear

    The tip of the male’s abdomen twisted around to make contact, but I have no idea what all the little doodads common to both of them back there were doing.

    The whole process started in mid-afternoon, they were still locked together six hours later, and the male was gone in the morning. The stories about female mantises eating the males seem greatly exaggerated, but she did manage to catch and eat a moth while otherwise engaged.

    We’ll keep watch for ootheca on the tall grasses again, although we’ll never know the rest of their story.