Filament swatches are a good idea. So is labelling them, but I had a lot of trouble getting the label template to line up, and wanted to avoid lots of tedious copy-and-paste around the QR codes. So I made this: (the image is also a link)
Getting the PDF sized correctly was a huge pain; Pillow let me get close but not exact, so I ended up using Ghostscript for postprocessing. It was a mess getting there, but so far it seems stable? I guess we'll see. The results look good, though!
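Roughly, the approach looks like this. This is a sketch, not my exact script – the dimensions and filenames are placeholders – but the key trick is that PDF page sizes are in points, and Ghostscript's `pdfwrite` device can be told an exact page size that Pillow's pixel-based sizing can't quite hit:

```python
MM_PER_INCH = 25.4
POINTS_PER_INCH = 72

def mm_to_points(mm: float) -> float:
    """PDF page sizes are specified in points (1/72 inch)."""
    return mm / MM_PER_INCH * POINTS_PER_INCH

def gs_resize_cmd(src: str, dst: str, width_mm: float, height_mm: float) -> list:
    """Build a Ghostscript command that forces an exact output page size."""
    return [
        "gs", "-o", dst, "-sDEVICE=pdfwrite",
        f"-dDEVICEWIDTHPOINTS={mm_to_points(width_mm):.3f}",
        f"-dDEVICEHEIGHTPOINTS={mm_to_points(height_mm):.3f}",
        "-dFIXEDMEDIA",  # pin the media size so pdfwrite can't adjust it
        src,
    ]

# Placeholder label-sheet size; run the result with subprocess.run(cmd, check=True)
# on a machine with Ghostscript installed.
cmd = gs_resize_cmd("labels_raw.pdf", "labels.pdf", 74, 105)
print(" ".join(cmd))
```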
I made a 10 Ah solar-powered charger! It uses an Adafruit solar/DC battery charger board and a 5V booster. The case is 3D-printed in two parts – the lower shell and the lid. As is visible, the lid screws down into the shell. Both boards that things plug into are screwed down as well for better stability. My Apple devices wouldn’t charge from it at first, and I learned that they need specific voltages on the data lines (or, presumably, USB PD, but that’s way harder) to determine how many mA is safe to draw.
I’ve been keeping it by my south-facing window plugged into a 20W solar panel and using it to charge my devices, but there are cheaper still-6V options if lower wattages work for your use case.
If you’re interested – more details:
I needed to add a thermistor to the battery to charge safely at higher than 1A, which seemed especially important given that I wanted to output 1A and the solar board will not draw more than the maximum charge rate from its sources. Then I was able to cut the 1.0A selector and solder the 1.5A one closed, as is visible on the board.
The resistors you can barely see in the upper middle of the board were for an ill-fated LED meant to show whether it was charging. It was too bright, broke off, and was redundant anyway: if you really want to, you can look in through the gap around the USB input and see the LEDs on the solar board. In dim enough light, the existing red charging LED even makes the lid glow. I had made holes for what were going to be external counterparts to the red and green status LEDs, and that was very, very unnecessary.
Early on aligning the holes correctly was difficult, so for more than one case I ended up ripping the wall apart until things were situated correctly. The data line voltages I made with voltage dividers and this handy Adafruit diagram:
Lacking those exact resistor values, I found combinations with the same ratios that were close enough: 100k on the D+ line, 47k + 10k on the D−, and, after those two, 47k + 22k.
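For reference, the divider math is just Vout = Vin · R2/(R1 + R2). A quick helper makes it easy to check what a candidate resistor pair produces – with the caveat that this assumes a plain two-resistor divider off the 5 V rail with the listed values as the top and bottom legs, which may not match the exact topology in the Adafruit diagram:

```python
def divider_out(v_in: float, r_top: float, r_bottom: float) -> float:
    """Output of a two-resistor divider, tapped between r_top and r_bottom."""
    return v_in * r_bottom / (r_top + r_bottom)

# Check a couple of candidate pairs against the 5 V rail.
for top, bottom in [(47e3, 10e3), (47e3, 22e3)]:
    v = divider_out(5.0, top, bottom)
    print(f"{top / 1e3:.0f}k over {bottom / 1e3:.0f}k -> {v:.2f} V")
```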
JST XH header and plug for the thermistor (and tape to tape it to the battery)
three M2.5×5 (the solar board can accommodate up to 10mm length; I don’t remember why I did that)
one M3×35, and an M3×10 if you want the breadboard to be very secure, though in my experience screwing down the boards is more than enough
the resistors
The breakdown totals to $162.85, so let’s say you can build this solar powered charger box thing for maybe $170 or so! (Depending on how much of the incidentals you already have of course.) It’s the opposite of a deal, but it’s fun! The battery is held in place with sheer tightness of fit and yet does not rattle, which I’m proud of. It does mean that you need some kind of hook to coax it up and out, which I’ve been using curved tweezers for (with the soft plastic tip cover left on), and that is preferable to adhesive in my book. The files for the case:
My mower boundary wire got a break in it this spring, and unlike previous times, I didn’t know why. So now I had to check the perimeter of my lawn for damage, and in most cases the wire is underground or at least undergrass. Turns out there’s a better way!
It took shoving a LONG screwdriver into the ground, but I was able to use a wire toner tracer between the screwdriver and one of the wire ends. Unfortunately, the tone wasn’t audible very far from the wire itself, so finding the wire could be a challenge. The break turned out to be where the neighbor’s lawn aeration had hit the line. Wire strippers and waterproof wire nuts made it a quick fix.
That’s what it says on the case, anyway. (Thanks to Hannah for the silly name.)
This has been a hobby project for about a year! It’s a compact audio player with a clickable control knob, display, and a volume knob on the back. Surface transducers on the bottom allow it – depending heavily on the surface it’s on – to sound louder (and ideally even higher quality) than one might expect for its size and hobby project status.
You can charge it and access its internal MicroSD card over USB, and also charge it with any 5-10V 2.1mm DC plug – including a solar panel!
I wrote the control software in PlatformIO flavored Arduino, and designed, modeled, and 3D printed the case. This meant learning – and subsequently outgrowing – TinkerCad, and then moving to Fusion 360. The case has 3 pieces, and they print (almost) entirely without supports: body, back, and bottom.
Behavior
It reads tags from files on the MicroSD card, sends commands and audio data to the VS1053 on the Music Maker board, renders text, and writes it to the display.
Hand-soldered FeatherWing Proto in the back for a USB storage mode toggle, power handling, and volume potentiometer connection. (This part hasn’t stabilized yet.)
After Ars Technica ran this article on a Stable Diffusion mobile app, it seemed like a good time to give Stable Diffusion another shot. I had previously given up figuring out how to set up the desktop version. It’s polished! It includes example prompts to demonstrate what sorts of incantations make up a good prompt, and with that and a moderate wait for it to generate I had this:
That was enough to hook me, so when I noticed that article also linked to stable-diffusion-webui, it became a great time to see what the same underlying image generation can do when it can draw ~300W continuously on my desktop instead of being limited to a phone’s resources. I was quickly (and somewhat inadvertently) able to generate a cat fractal:
cute ((cat)) with a bow, studio photo, soft lighting, 4k
This was my introduction to the sorts of artifacts I could expect to do battle with. Then I had an idea of how to use its ability to modify existing photos. After some finagling, I had a prompt ready, set it to replace the view outside the window, and left it running. When set to a very high level of detail and output resolution, it generated 92 images over about 6 hours. Of those, 22 seemed pretty good. Here is a comparison of my 2 favorites:
And a slightly different prompt where I had selected part of the window other than the glass:
a colorful photo of the circular wooden door to a hobbit hole in the middle of a forest with trees and (((bushes))), by Ismail Inceoglu, ((((shadows)))), ((((high contrast)))), dynamic shading, ((hdr)), detailed vegetation, digital painting, digital drawing, detailed painting, a detailed digital painting, gothic art, featured on deviantart
There were three sorts of artifacts or undesirable outputs that were frequent:
It focused too much on the “circular” part describing a hobbit hole door.
It added unsettling Hobbit-cryptids that brought to mind Loab.
The way I used the built-in inpainting to replace the outside meant both that the image was awkwardly separated into 3 largely independent areas and that those areas often tried to merge with the region around the window. This makes total sense for its actual use case of editing an image, but I’d tried, without success, to configure it to ignore the existing image. In retrospect, I could have used the mask I made to manually put the images behind the window with conventional editing software.
It’s wild to use image generation to generate this without being able to even imagine painting it myself. It’s definitely not the system doing all the work – you have to come up with the right kind of prompt, adjust generation parameters, and curate the results. But it’s a lot cheaper and easier than art school and practice, which I feel uncomfortable about, because this model was trained – in part, and without permission – on art from those who did go to art school.
I had figured it wouldn’t happen once Ghost Ship Games said they weren’t going to tackle it, but it turns out it’s viable to mod in! It’s in open beta currently, and there are rough edges throughout the experience, but the core of it – being in a huge cave and fighting bugs – is just as incredible as I had hoped it might be. It lets VR do what it’s good at – intensify already-good experiences. A far cry from the stereoscopic screenshots NVIDIA’s Ansel provided! I’m very pleased.
Here’s the problem – my neighbor’s HVAC closet is doing this, and it’s preventing me from sleeping:
Audacity spectrogram of the highlighted section, with the primary component around 55 Hz
I’ve submitted a maintenance request, sure, but more data more better, right? There are three states here: not buzzing, buzzing, and buzzing without the higher component. The primary component sits around 55 Hz, the higher one around 220 Hz. FFTs away!
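The analysis itself is simple enough in NumPy. Here’s a sketch of the kind of peak-picking involved, using a synthetic stand-in signal since the actual recording isn’t included here:

```python
import numpy as np

def tone_peaks(samples, rate, thresh=0.25):
    """Frequencies of spectral peaks above thresh x the strongest peak."""
    mag = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))
    freqs = np.fft.rfftfreq(len(samples), 1 / rate)
    floor = thresh * mag.max()
    return [freqs[i] for i in range(1, len(mag) - 1)
            if mag[i] > floor and mag[i] >= mag[i - 1] and mag[i] > mag[i + 1]]

# Synthetic stand-in for a recording of the buzz: a 55 Hz fundamental
# plus a weaker 220 Hz component, like the spectrogram shows.
rate = 8000
t = np.arange(rate * 2) / rate  # two seconds of samples
buzz = np.sin(2 * np.pi * 55 * t) + 0.5 * np.sin(2 * np.pi * 220 * t)
peaks = tone_peaks(buzz, rate)
print(peaks)
```

Classifying the three states is then a matter of checking which of the two peaks are present.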
I was hoping this would be the part where I link to the code, but boy howdy is it harder than I anticipated to get a microcontroller to do this.
Not for computers – for other electronics. I’d glossed over the “it can supply 600mA peak” part for my microcontroller board’s 3.3V regulator, and assumed I wouldn’t hit it. Then I spent a great deal of time trying to diagnose strange nondeterministic behavior. The OLED display would quickly go blank. The SCD40 CO2 sensor would quickly stop providing data. These problems all immediately stopped when I used an external power supply capable of higher amperage. Go figure.
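A habit I’ve since adopted: total up worst-case current draws against the regulator’s rating before blaming the code. The figures below are illustrative guesses, not measurements from my board – substitute datasheet worst-case numbers for your own parts:

```python
# Illustrative peak-current budget check; every mA figure is a placeholder.
regulator_peak_ma = 600

peak_draw_ma = {
    "microcontroller (radio active)": 300,
    "OLED display": 30,
    "SCD40 (during measurement)": 200,
}

total = sum(peak_draw_ma.values())
headroom = regulator_peak_ma - total
print(f"total {total} mA, headroom {headroom} mA")
assert headroom >= 0, "peak draw exceeds the regulator rating"
```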