And here I was hoping that this was local inference :)
However, it's really not that impressive for just a client.
The reasons people were buying separate Mac minis just to run OpenClaw were 1) security, since it was all vibe-coded and needs to be sandboxed, 2) relaying iMessage, and maybe 3) local inference, albeit pretty slowly. If you don't need to relay iMessage, a Raspberry Pi could host it as its own device. And if all you need is the pipe, an ESP32 works.
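For the "just the pipe" case, a transparent bridge is about all the ESP32 has to do. Here's a minimal sketch with the Arduino-ESP32 core that just shovels bytes between a TCP connection and the serial port; the Wi-Fi credentials and the gateway host/port are made-up placeholders, not anything OpenClaw actually exposes:

    // Transparent TCP <-> serial bridge: the ESP32 is only the pipe.
    #include <WiFi.h>

    const char* WIFI_SSID = "your-ssid";        // placeholder
    const char* WIFI_PASS = "your-password";    // placeholder
    const char* GATEWAY_HOST = "192.168.1.50";  // hypothetical box running the agent
    const uint16_t GATEWAY_PORT = 18789;        // hypothetical port

    WiFiClient client;

    void setup() {
      Serial.begin(115200);
      WiFi.begin(WIFI_SSID, WIFI_PASS);
      while (WiFi.status() != WL_CONNECTED) delay(250);
    }

    void loop() {
      // Reconnect if the link to the gateway drops.
      if (!client.connected()) {
        client.connect(GATEWAY_HOST, GATEWAY_PORT);
        delay(500);
        return;
      }
      // Forward bytes in both directions; nothing else happens on-device.
      while (client.available()) Serial.write(client.read());
      while (Serial.available()) client.write(Serial.read());
    }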
But I have 10-15 ESP32s just waiting for a useful project. Does HN have better suggestions?
a kid-pleaser at the very least