December 2024 newsletter

Thanks for following along so far! We've been working on a lot of big projects in 2024 – many still under way (parallel evaluator, gadget design, RFID tag localization, video playback) – but we've gotten a decent amount done this year.

In 2025, we're looking forward to seeing more of these big projects come to fruition, then playing with the resulting new capabilities and applications.

(If you want to stop by next month, our next Folk open house will be in the evening on Wednesday, January 29, at our studio in East Williamsburg, Brooklyn.)

What we've been up to

Working on video support

  • Andrés: I worked on adding video support this month. Reading through ffmpeg API tutorials helped me get a better handle on how to load video dynamically. I have code in a video branch that can display frames from a test video file now, but there are issues with slow playback. In early January 2025 I'll debug those issues, and we'll finally have visual playback of video. (A sketch of the kind of decode loop involved is below.)
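
For context, the basic libavformat/libavcodec decode loop those tutorials walk through looks roughly like this – a sketch, not the actual branch code:

  /* Sketch: open a video file and pull decoded frames out of it. */
  #include <libavformat/avformat.h>
  #include <libavcodec/avcodec.h>

  int main(int argc, char **argv) {
      if (argc < 2) return 1;
      AVFormatContext *fmt = NULL;
      if (avformat_open_input(&fmt, argv[1], NULL, NULL) < 0) return 1;
      if (avformat_find_stream_info(fmt, NULL) < 0) return 1;

      /* Find the video stream and set up its decoder. */
      int vs = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
      if (vs < 0) return 1;
      const AVCodec *codec =
          avcodec_find_decoder(fmt->streams[vs]->codecpar->codec_id);
      AVCodecContext *ctx = avcodec_alloc_context3(codec);
      avcodec_parameters_to_context(ctx, fmt->streams[vs]->codecpar);
      if (avcodec_open2(ctx, codec, NULL) < 0) return 1;

      AVPacket *pkt = av_packet_alloc();
      AVFrame *frame = av_frame_alloc();
      while (av_read_frame(fmt, pkt) >= 0) {
          if (pkt->stream_index == vs && avcodec_send_packet(ctx, pkt) == 0) {
              while (avcodec_receive_frame(ctx, frame) == 0) {
                  /* frame->data now holds a decoded frame; the display
                     path (and pacing, to fix slow playback) goes here. */
              }
          }
          av_packet_unref(pkt);
      }
      av_frame_free(&frame); av_packet_free(&pkt);
      avcodec_free_context(&ctx); avformat_close_input(&fmt);
      return 0;
  }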

Handheld gadget

Omar: I've been working on a new revision of the handheld gadget. Goals:

  • Have a design that we can make more than one of; use the off-the-shelf AnyBeam projector, so people can just buy them (the current design requires an Ultimems dev model that isn't buyable outside Taiwan/Japan)
  • Include a battery: probably use one of the Waveshare UPS models, since only they can supply the 5V/5A needed for a Pi-5-class single-board computer + AnyBeam projector
  • Stretch goals for interactions I want to make: add a trigger button, add a built-in speaker, have 2 cameras so we can have more reliable perception (and maybe not even need tags for depth perception)

Stabilizing the camera

This is some vestigial work on the old-model gadget – last month I printed a revision that attaches the camera to the chassis instead of the front panel (in the hope that the camera would be more stable there, because the front panel flexes and gets hit and stuff). This month, I transferred all the guts of my gadget from the old revision to that new one:

[image: img_5497.jpeg]

The hope is that this will have a permanent, stable calibration, since the camera won't move relative to the projector.

[images: 20241231-071927.jpeg, 20241231-071939.jpeg]

It seems improved (and I feel a lot more comfortable handling it), but it's still not perfect. I think one issue is that the camera module is not actually that well attached to the PCB (you can just peel it off and wiggle it), and another issue is that the autofocus seems to let the focus float around (maybe autofocus is sensitive to orientation, especially pointing down? hmm). The new gadget revision will use different cameras anyway (see below), so I'm punting on this for now.

Trigger button

I've started designing the trigger button as well. I can reuse the outer grip, but I realized I need a whole new inner enclosure for the switches that I have.

After some sanding, the trigger button + inner grip fits inside the outer grip:

[image: img_6149.jpeg]

And I figured out the pins on the 3.5mm jack and socket, so I think I just need to test that my Orange Pi can actually pick up button presses now (and then design the next gadget chassis with a corresponding TRRS port to plug into).
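
Testing button presses on the Orange Pi could be as simple as polling a GPIO line with libgpiod – a sketch, assuming the v1 libgpiod API; the chip path and line offset here are placeholders, not the gadget's actual wiring:

  /* Sketch: poll a button on a GPIO line via libgpiod (v1 API). */
  #include <gpiod.h>
  #include <stdio.h>
  #include <unistd.h>

  int main(void) {
      struct gpiod_chip *chip = gpiod_chip_open("/dev/gpiochip0"); /* placeholder */
      if (!chip) return 1;
      struct gpiod_line *line = gpiod_chip_get_line(chip, 6); /* placeholder offset */
      if (!line || gpiod_line_request_input(line, "trigger-test") < 0) return 1;

      for (;;) {
          /* Many buttons are wired active-low: pressed reads 0. */
          printf("button: %d\n", gpiod_line_get_value(line));
          usleep(100 * 1000);
      }
  }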

New single-board computer

Anyway, the main thing right now is making this new overall design that we can make in bigger quantities. The big problem with the Raspberry Pi 5 is that it lacks support for USB-C video out, so it can't talk to (or power) the retail AnyBeam projector. So we need to do a redesign that uses a different single-board computer.

Radxa Rock 5B: nope

Last month, I was pretty optimistic about the Radxa Rock 5B. It supposedly supports USB-C video out, and it has the same pinout as the Raspberry Pi 5, so hats (speaker hat and UPS hat) should work. So I ordered one from China, it showed up after a week or two, and I tested it, and it… did not work; its USB-C port doesn't actually have video out, or at least it can't also power the projector, or something.

[image: img_5981.jpeg]

I went on their Discord, nobody had an answer, other posts online were mixed, and this seemed like a giant waste of time where I didn't know what was going on, so I gave up on it. Easier to use different hardware than to try to mess with the software (firmware, Linux kernel, device tree stuff, etc.).

(This surprised me, since the Orange Pi 5 worked fine with the projector in my tests last month, and the Rock 5B uses the same RK3588 chipset.)

Disappointing – I was hoping we could do this, the kind-of-cool stack of Waveshare UPS B hat + Rock 5B + Raspiaudio hat with built-in speaker:

[image: img_5895.jpeg]

Raspberry Pi Compute Module 5: probably not

The Raspberry Pi Compute Module 5 just came out. I think it would be cool (and would save space, save money, and make assembly easier) to make a PCB for the gadget (with onboard ports, speaker, video conversion, etc), and it's nice to use Raspberry Pi stuff since it has a good camera port and software support is good.

But (like the normal Pi 5) the Pi 5 CM lacks USB-C video out support, and I found people struggling with figuring out the conversion from its HDMI out to USB-C, so I decided this is above my pay grade for now.

Back to the Orange Pi 5

Anyway, I already had the Orange Pi 5 working with the projector last month, so why not just use that? (I already had an Orange Pi lying around from a couple of years ago.)

The problems with the Orange Pi were that it doesn't have a usable CSI camera & its pinout is weird (you can't fit it on top of a Waveshare UPS hat, for instance). But these are surmountable problems compared to the projector just not working on the Rock 5B or Raspberry Pi 5. I can use the standalone Waveshare 3S UPS module (which might be more appropriate anyway – it has a broken-out power switch and charging port), and (importantly – I didn't consider this possibility before) I can just use a small USB camera.

USB cameras

So I did a small dive into USB cameras. I had been sour on them because I assumed they were all unwieldy, big, webcam-shaped, but there are a lot of decent ones with USB output from ELP and Arducam that are more Pi-camera-shaped (small) and that you can just buy on Amazon or AliExpress.

I realized there is a cool-looking stereo camera with global shutter out there from ELP, and I've wanted to try both things (stereo, global shutter) anyway, so I ordered one and will try it on the gadget.

(I also spent a while digging through USB-A and USB-C cables and adapters to find a short cable with minimal plug footprint, 90-degree angle, etc, since the normal USB cables and plugs are obviously a lot bigger than I'd like to fit inside the chassis, in contrast to the Pi CSI camera's ribbon cable and connector.)

I seem to have all the parts now, so the next steps are system/software testing (does Folk run OK on the Orange Pi 5 these days? does the camera work? does the camera work with the USB 2.0 cables I have lying around?), and putting it in the new (somewhat bigger) chassis design I just printed that fits everything (battery + Orange Pi + projector):

[images: img_6236.jpeg, img_6241.jpeg]
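
As a first camera sanity check, something like this V4L2 capability/format query should tell us whether the Orange Pi sees the USB camera at all – a sketch, assuming the camera enumerates as /dev/video0:

  /* Sketch: ask the kernel what it thinks of the camera. */
  #include <fcntl.h>
  #include <stdio.h>
  #include <string.h>
  #include <sys/ioctl.h>
  #include <unistd.h>
  #include <linux/videodev2.h>

  int main(void) {
      int fd = open("/dev/video0", O_RDWR);
      if (fd < 0) { perror("open"); return 1; }

      struct v4l2_capability cap;
      if (ioctl(fd, VIDIOC_QUERYCAP, &cap) == 0)
          printf("card: %s, driver: %s\n", cap.card, cap.driver);

      struct v4l2_format fmt;
      memset(&fmt, 0, sizeof fmt);
      fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
      if (ioctl(fd, VIDIOC_G_FMT, &fmt) == 0)
          printf("%ux%u, fourcc %.4s\n", fmt.fmt.pix.width,
                 fmt.fmt.pix.height, (char *)&fmt.fmt.pix.pixelformat);

      close(fd);
      return 0;
  }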

New parallel evaluator

Omar: Continuing to work on the new, faster evaluator – hoping to create a pull request soon. The current holdup is blinkiness and performance issues, which I'm working on debugging.

Porting /calibrate

There's still a lot of functionality to port even once the evaluator is working. The big ones are calibration and the editor, I think.

Started porting /calibrate (which we'll certainly need if we want folk2 to replace folk1). I'm excited to run folk2 as the sole system on a gadget, and 3D calibration is obviously part of that. Calibration is also a really good end-to-end performance/latency test – you can see the latency right on the board when you're calibrating – so it'll be fun to use that as a metric and squeeze latency as hard as we can after it works.

But this is mostly on hold until we fix the blinking; then we'll get back to it.

Performance and memory leaks

For most of this month, I've been staring at increasingly elaborate Tracy views of folk2 (and adding more hooks to folk2 to report more info to Tracy, and fixing performance bottlenecks and behavioral bugs and leaks flagged by the Tracy view). Tracy is really, really amazing software.

Now that I've added enough hooks, I get these extremely rich views of all Folk programs/Whens executing (including tag recognition time), frames (when does the GPU frame run? how long does it take? what about the camera frame?), draw count (so we can see when stuff is blinking out and debug that), lock contention, thread behavior, stack profiles for performance analysis, etc.

[image: img_5888.jpeg]
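
For reference, here's roughly what one of those hooks looks like with Tracy's C API (tracy/TracyC.h) – a sketch, not folk2's actual instrumentation; evaluateWhen and frameLoop are hypothetical names:

  /* Sketch: a Tracy zone around a unit of work, plus a frame mark.
     These macros compile to no-ops unless TRACY_ENABLE is defined. */
  #include "tracy/TracyC.h"

  void evaluateWhen(void) {                 /* hypothetical function name */
      TracyCZoneN(zone, "evaluateWhen", 1); /* named zone, active */
      /* ... the work being profiled ... */
      TracyCZoneEnd(zone);
  }

  void frameLoop(void) {                    /* hypothetical function name */
      for (;;) {
          evaluateWhen();
          TracyCFrameMark;  /* frame boundary for the frame view */
      }
  }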

Some of the resulting improvements this month:

  • Fixed a ton of memory leaks (mostly freeing clauses and terms in clauses, and some transient objects in C); I think folk2 might be less leaky than folk1 now.
  • Saw that we were spending a lot of time waiting on the one big trie lock, so replaced the trie with an immutable path-copying trie (and an epoch-based reclamation scheme): readers never have to block, and writers only have to retry if someone else changed the trie while they were writing (like a transaction – see the sketch after this list)
  • Created (as a side effect of patching up the memory leaks) and then fixed a problem where we would leak all the statement slots, which eventually caused the whole system to blow up:
    • [image: img_5829.jpeg]
    • “We were not freeing aborted ref which caused us to run out of statement slots which caused statementNew to hang which caused the global workqueue to overflow.”
  • Fixed a horrible bug where my steal-half deque operation was wrong (so we were dropping lots of operations and the system was not behaving correctly); replaced it with the more standard steal-one
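
To make the trie change above concrete, here's a runnable miniature of the reader/writer pattern it describes: readers just load the current root and never block; writers build a new version and compare-and-swap it in, retrying on conflict. The "trie" is reduced to an immutable linked list to keep the sketch short, and the epoch-based reclamation of old versions is elided – this illustrates the pattern, not folk2's code:

  #include <stdatomic.h>
  #include <stdlib.h>

  typedef struct Node { const char *key; struct Node *next; } Node;

  static _Atomic(Node *) root;  /* the published, immutable version */

  Node *readSnapshot(void) {
      /* Readers never block: whatever root they load is a consistent version. */
      return atomic_load(&root);
  }

  void insert(const char *key) {
      Node *node = malloc(sizeof *node);
      node->key = key;
      Node *old;
      do {
          old = atomic_load(&root);
          node->next = old;  /* "path copy": the new version shares the old tail */
      } while (!atomic_compare_exchange_weak(&root, &old, node));
      /* The CAS fails only if another writer swapped the root meanwhile
         (the retried-transaction behavior); the displaced version must
         eventually be freed via epoch-based reclamation, elided here. */
  }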

I'm kind of addicted to using Tracy – it feels like it obsoletes a lot of the monitoring stuff I'd hacked into folk2, because you can just use Tracy to see everything (as long as you add the right hooks). No need for perf probes, or gperftools heap monitoring, or the trace log endpoint. The Tracy hooks are low-overhead (nanoseconds), networking is taken care of for you, and they automatically give you great visualizations that you can zoom and pan and see in context with all other info (which also reduces the need for boiling things down to summary statistics).

And using Tracy feels powerful – a UI made for experts, with high information density and a high ceiling.

[image: img_5900.jpeg]

RFID localization

Omar: I've spent a few days this month on the RFID localization system. The current task is to get networking to work (live reporting info from in-band and out-of-band radios to the PC).

Network thread scheduling

This is kind of tough, because we already don't have that much timing slack on the in-band radio (tens of microseconds) when we respond to the RFID tags to query their IDs. And networking means adding another thread that has to do more work – a TCP server thread in addition to the radio thread and the RFID protocol parser thread. With the network thread on, our hitrate starts to suffer.

So I might need to dig into the Linux scheduling, see what it's actually doing, maybe hard-code some real-time policy stuff, maybe get rid of spin-wait where possible so we can use that time to do networking. The networking is pretty low-priority (we can burn some milliseconds, there's no hard deadline) compared to decoding tag responses, but it still has to get done. I did look into the Linux real-time scheduler and tried turning it on, but it didn't really work.

I feel like it should work, though… like capping the intervals of the threads since you know that the radio thread should only be active for a few tens of microseconds at a time, and the protocol consumer thread should only be active until it runs out of samples, then sleep until new samples come in…

(Even before this, we've been kind of poking at Linux to get stuff to work, locking threads at high priority and putting them on specific CPUs, etc. If I turn that stuff off and rely on default scheduling behavior, we do get worse at talking to the tags, so it's doing something…)

[image: img_5973.jpeg]
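
For the curious, the "locking threads at high priority and putting them on specific CPUs" above boils down to calls like these – a sketch; the CPU and priority numbers would be made up for illustration, not our exact settings:

  #define _GNU_SOURCE
  #include <pthread.h>
  #include <sched.h>

  /* Pin a thread to one CPU and give it a real-time priority. */
  int pinRealtime(pthread_t t, int cpu, int prio) {
      cpu_set_t set;
      CPU_ZERO(&set);
      CPU_SET(cpu, &set);
      if (pthread_setaffinity_np(t, sizeof set, &set) != 0) return -1;

      /* SCHED_FIFO: the thread runs until it blocks or something
         higher-priority wakes; prio is 1..99 and needs root/CAP_SYS_NICE. */
      struct sched_param sp = { .sched_priority = prio };
      return pthread_setschedparam(t, SCHED_FIFO, &sp);
  }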

Rounds

We've also needed to figure out what data structures to send from the in-band radio and the out-of-band radio to the PC, along with the TCP protocol to transmit them (probably also need an interim way of visualizing those live on the PC). (Until now, we've only had the IB radio, and we just have it print when it gets a valid tag ID, and we haven't tried running it indefinitely.)

I made some new structs to represent rounds: mostly, we need to transmit the locations of bit boundaries from the decoder on the IB radio to the PC. Note that 'locations' is a more complicated concept than you might think; I think they need to be represented as various redundant offsets from various sync points (reset at the beginning, start of round, etc.), since the IB and OOB clocks may drift. I made little send/receive functions for those structs (since the data packets are variable-length, we can't just send the structs flat). These functions were really buggy, and I spent a lot of time debugging them.
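
Here's a sketch of the shape of those send functions – the field names are illustrative, not our actual wire format; the point is that the variable-length boundary array forces an explicit count on the wire instead of sending the struct flat:

  #include <stdint.h>
  #include <unistd.h>

  typedef struct {
      uint64_t roundStart;   /* offset from a sync point, in samples */
      uint32_t nBoundaries;
      uint64_t *boundaries;  /* bit-boundary offsets within the round */
  } Round;

  /* Keep calling write() until the whole buffer is out (TCP may send less). */
  static int writeAll(int fd, const void *buf, size_t n) {
      const char *p = buf;
      while (n > 0) {
          ssize_t k = write(fd, p, n);
          if (k <= 0) return -1;
          p += k; n -= (size_t)k;
      }
      return 0;
  }

  int sendRound(int fd, const Round *r) {
      /* The count goes first so the receiver knows how many offsets follow. */
      if (writeAll(fd, &r->roundStart, sizeof r->roundStart) < 0) return -1;
      if (writeAll(fd, &r->nBoundaries, sizeof r->nBoundaries) < 0) return -1;
      return writeAll(fd, r->boundaries,
                      r->nBoundaries * sizeof *r->boundaries);
  }

The receive side mirrors this: read the fixed fields, then allocate and read nBoundaries offsets. (This sketch also ignores endianness, which matters if the radio host and PC ever differ.)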

Other Folk system improvements

Friends and outreach

  • We had our monthly open house:
    • [image: img_9478.jpg]
    • [images: img_5834.jpeg, img_5838.jpeg]

What we'll be up to in January

  • Our next Folk open house is in the evening on Wednesday, January 29, at our studio in East Williamsburg, Brooklyn.
  • Finish new Folk gadget design, publish it, prototype some interactions with it
  • Stabilize RFID IB reporting and maybe start OOB & localization
  • Continue on video support
  • Hopefully fix blinking & make a pull request for folk2 evaluator
  • Kosmik work to do integrated phone→table interactions

Omar

Andrés
