Ed Nisley's Blog: Shop notes, electronics, firmware, machinery, 3D printing, laser cuttery, and curiosities. Contents: 100% human thinking, 0% AI slop.
Make sure you’re running an ad blocker and perhaps a script killer, then feed “Larval Engineer received a Pilot InstaBoost” into your favorite search engine. The results run along these lines:
The first (few) hits should be the various ways my original post from late last year appears on wordpress.com, but the rest (particularly from Google) will be spam blogs and scraper sites that ripped my text, ran it past a thesaurus (euphemistically known as article spinning), larded the result with keywords, and reposted the shattered remains. If you click on the links, you’ll have the experience of reading text where short sequences of words make sense, but the overall corpus leaves you shaking your head in disbelief.
Even though Google allegedly doesn’t reward such sites, they make up the bulk of its list. DuckDuckGo does a slightly better job of suppressing them and Bing kills nearly all of the junk, which suggests that Google operates with a powerful incentive to not notice problems in sites serving (its?) advertisements.
There’s obviously no point in getting annoyed with any of the participants…
FWIW, that particular post seems to have drawn the attention of scammers due to the presence of a trademarked brand name with good search-ability. Other posts have been more fortunate in escaping their attention, despite my glowing prose…
The trick depends on specifying the colors with HSB, rather than RGB, so that the buttons in each row share the same hue and differ only in saturation and brightness. The ImageMagick incantations look like this:
Disabled: hsb\(${HUE}%,50%,40%\)
Unselected: hsb\(${HUE}%,100%,70%\)
Selected: hsb\(${HUE}%,100%,100%\)
For whatever reason, the hue must be a percentage if the other parameters are also percentages. At least, I couldn’t figure out how to make a plain integer without a percent sign suffix work as a degree value for hue.
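A minimal sketch of how those incantations come together, with HUE as a percentage of the color wheel (0 to 100) as described above; the 33% value is my own example, landing near green:

```shell
# Build the three color specs for one button row from a single hue.
# All three parameters are percentages, since that's what worked.
HUE=33
DISABLED="hsb(${HUE}%,50%,40%)"
UNSELECTED="hsb(${HUE}%,100%,70%)"
SELECTED="hsb(${HUE}%,100%,100%)"
printf '%s\n' "$DISABLED" "$UNSELECTED" "$SELECTED"
# With ImageMagick on hand, each spec fills a button swatch, along these lines:
#   convert -size 64x24 xc:"$SELECTED" selected.png
```

The escaped parentheses in the incantations above simply protect the specs from the shell; quoting the whole string, as here, works just as well.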
Anyhow, in real life they look pretty good and make the selected buttons much more obvious:
The LCD screen looks just like that; I blew out the contrast on the surroundings to provide some context. The green square on the left is the Arduino Mega’s power LED, the purple dot on the right is the heartbeat spot.
The new “needle stop anywhere” symbol (left middle) is the White Draughts Man Unicode character: ⛀ = U+26C0. We call them checkers here in the US, but it’s supposed to look like a bobbin, as you must disengage the handwheel clutch and stop the main shaft when filling a bobbin; the needle positioning code depends on the shaft position sensor.
Weirdly, Unicode has no glyphs for sewing, not even a spool of thread, although “Fish Cake With Swirl” (🍥 = U+1F365) came close. Your browser must have access to a font with deep Unicode support in order to see that one…
Having run off four quick prints with identical settings, I measured the thickness of the skirt threads around each object:
Skirt Thread Consistency
They’re all slightly thicker than the nominal 0.25 mm layer thickness, but centered within ±0.02 mm of the 0.27 mm average. Tweaking the G92 offset in the startup G-Code by 0.02 mm would fix that.
The 0.29 mm skirt surrounded the first object, which had a truly cold start: 14 °C ambient in the Basement Laboratory. After that, they’re pretty much identical.
Some informal measurements over a few days suggest the actual repeatability might be ±0.05 mm, which is Good Enough for layers around 0.20 to 0.25 mm.
The larger skirt suggests that the platform has a slight tilt, but the caliper resolution is only 0.01 mm.
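The arithmetic behind those numbers fits in a one-liner. These readings are illustrative stand-ins for the four skirt measurements, not the actual notebook entries:

```shell
# Hypothetical skirt readings (mm): one cold-start outlier, three warm prints.
# awk computes the mean and the worst-case deviation from it.
RESULT=$(printf '0.29\n0.27\n0.26\n0.26\n' |
awk '{ sum += $1; v[NR] = $1 }
END {
    mean = sum / NR
    for (i = 1; i <= NR; i++) {
        d = v[i] - mean
        if (d < 0) d = -d
        if (d > max) max = d
    }
    printf "mean %.3f mm, max deviation %.3f mm", mean, max
}')
echo "$RESULT"
```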
When I realigned everything after installing the V4 hot end, the last set of thinwall boxes looked like this:
This is mostly a test to see how long it takes before something on the RPi goes toes-up enough to require a manual reboot. Disabling the WiFi link’s power saving mode seems to keep the RPi on the air all the time, which is a start.
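For the record, a sketch of one way to pin that down on Raspbian, assuming the interface is wlan0 and the driver answers to wireless-tools: iwconfig wlan0 power off does it immediately, and a stanza in /etc/network/interfaces makes it stick across reboots:

```
# /etc/network/interfaces stanza -- an assumption based on classic
# Raspbian ifupdown networking, not a copy of my actual file
allow-hotplug wlan0
iface wlan0 inet dhcp
    wireless-power off
```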
I also tried using the camera in its B&W mode to discard the color information up front:
Necklace Heart – circle detail
It’s taken through the macro adapter with the LEDs turned off and obviously benefits from better lighting, courtesy of an LED flashlight at grazing incidence. You can even see the Hilbert Curve top infill.
The object of the exercise was to see if those tiny dots would print properly, which they did:
Necklace Heart – dots detail
Now, admittedly, PETG still produces fine hairs, but those dots consist of two layers and two thread widths, so it’s a harsh retraction test.
A look at the other side:
Necklace Heart – detail
All in all, both the object and the pix worked out much better than I expected.
Leaving the camera in full color mode and processing the images in The GIMP means less fiddling with the camera settings, which seems like a net win.
After combining the camera data I collected a while ago with a few hours of screwing around with this old Logitech camera:
Logitech QuickCam for Notebook Plus – front
I’m convinced it’s the worst camera I’d be willing to use in any practical application.
The camera offers these controls:
fswebcam --list-controls
--- Opening /dev/video0...
Trying source module v4l2...
/dev/video0 opened.
No input was specified, using the first.
Available Controls Current Value Range
------------------ ------------- -----
Brightness 128 (50%) 0 - 255
Contrast 128 (50%) 0 - 255
Gamma 4 1 - 6
Exposure 2343 (8%) 781 - 18750
Gain, Automatic True True | False
Power Line Frequency Disabled Disabled | 50 Hz | 60 Hz
Sharpness 2 0 - 3
Adjusting resolution from 384x288 to 320x240.
Putting the non-changing setup data into a fswebcam configuration file:
cat ~/.config/fswebcam.conf
# Logitech QuickCam for Notebook Plus -- 046d:08d8
device v4l2:/dev/video0
input gspca_zc3xx
resolution 320x240
scale 640x480
set sharpness=1
#jpeg 95
set "power line frequency"="60 hz"
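With the settings in the file, a one-shot capture boils down to pointing fswebcam at the config with its -c (--config) flag; the output path here is my own example, and the guard assumes fswebcam may not be installed:

```shell
# One-shot capture using the config file above.
CONF="$HOME/.config/fswebcam.conf"
OUT="/tmp/m2cam.jpg"
if command -v fswebcam >/dev/null 2>&1; then
    fswebcam -c "$CONF" "$OUT" || echo "capture failed"
fi
```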
Trying to use 640×480 generally produces a Corrupt JPEG data: premature end of data segment error; the result looks no better than this and is usually much worse:
Logitech 08d8 – 640×480
The top of the picture looks pretty good, with great detail on those dust particles, but at some point the data transfer coughs and wrecks the rest of the image. I could crop the top half to the hipster 16:9 format of 640×360, but the transfer doesn’t always fail that far down the image.
The -R flag, which specifies using direct reads instead of mmap (whatever that means), doesn’t help. In fact, the camera generally crashes hard enough to require a power cycle.
Delaying a second with -D1 and/or skipping a frame with -S1 doesn’t help, either.
The camera works perfectly at 640×480 using fswebcam under Xubuntu 14.04 on a Dell Latitude E6410 laptop, so I’m pretty sure this is a case of the Raspberry Pi being a bit underpowered for the job / the ARM driver taking too long / something totally obscure. A random comment somewhere observed that switching from Raspbian to Arch Linux (the ARM version) solved a similar video camera problem, so there’s surely room for improvement.
Dragorn of Kismet reports that the Raspberry Pi USB hardware doesn’t actually support USB 2.0 data rates, which also produces problems with Ethernet throughput. The comments in that Slashdot thread provide enough detail: the boat has many holes and it’s not a software problem.
For lack of anything more adventurous, the config file takes a 320×240 image and scales it up to 640×480, which looks about as crappy as you’d expect:
Logitech 08d8 – 320×240 scaled
Even that low resolution will occasionally drop a few bytes along the way, but much less often.
The picture seems a bit blown out, so I set the exposure to the absolute minimum:
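In the configuration file, that amounts to one more line, 781 being the bottom of the range fswebcam reported for this camera:

```
set exposure=781
```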
Given that the scene sits a foot under a desk lamp aimed well away from it, the other end of the exposure scale, around 18000, produces a uselessly burned-out image. I think a husky neutral-density filter would be in order for use with my M2’s under-gantry LED panels. The camera seems to be an early design targeting the poorly illuminated YouTube / video-chat market segment (I love it when I can talk like that).
There’s probably a quick-and-dirty ImageMagick color correction technique, although Fred’s full-blown autocorrection scripts seem much too heavy-handed for a Raspberry Pi…
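A minimal sketch of the quick-and-dirty route, assuming ImageMagick’s convert is on the path: -normalize stretches the histogram to full range and -auto-gamma nudges the mid-tones, both far cheaper than a full script. The gradient image stands in for a flat webcam frame:

```shell
# Quick-and-dirty automatic correction with ImageMagick builtins.
if command -v convert >/dev/null 2>&1; then
    convert -size 32x32 gradient:gray20-gray60 /tmp/flat.png   # stand-in frame
    convert /tmp/flat.png -normalize -auto-gamma /tmp/fixed.png
fi
```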