Homelab close-up. Structure is wood, front panels 3d printed. Labour of love.
Wife and kids have been away for a week, so this is what I'm working on to keep my mind busy.
18.02.2026 17:36 — 👍 1 🔁 0 💬 0 📌 0

Then why did he proceed to eat the entire quartet of French gold medalists?
18.02.2026 15:52 — 👍 3 🔁 0 💬 1 📌 0

Cats are cold-blooded killers of wildlife. They are, in fact, an invasive species on almost every continent. Please try to keep them indoors.

09.02.2026 18:23 — 👍 0 🔁 0 💬 1 📌 0

All I want is an AI that can tell me where I put my bloody pencil.

09.02.2026 14:02 — 👍 0 🔁 0 💬 0 📌 0

Spec-driven development gives me better code than when I write the code manually. By starting from the spec each time, I'm forced to think architecturally from the beginning, and edge cases are handled from the outset. I'm writing less and less code, but actually enjoying the process more.
20.01.2026 10:29 — 👍 4 🔁 0 💬 1 📌 0
Also, I've defaulted to Mistral initially as it's a good privacy option, but I'll be moving towards a hybrid setup based heavily on Ollama, using smaller models so it can run on small machines.
Hosted LLMs will be used, but sparingly, and possibly with some obfuscation.
I'm building a privacy-focussed AI stack for family use. It's based on Open WebUI - which is ludicrously cool - and will use Postgres with pgvector for memories.
Not much more than a skeleton at the moment: github.com/andyjessop/p...
I feel like the general point still stands. If a machine gives you a higher level of abstraction, it will likely make the old way obsolete.
15.01.2026 10:41 — 👍 2 🔁 0 💬 1 📌 0

But we do have machines that churn out textiles? People don't weave cotton by hand.
15.01.2026 10:40 — 👍 0 🔁 0 💬 2 📌 0
I believe it depends on how good the LLM is (or how good it gets). If it genuinely provides a new layer of abstraction, then there may be no need to ever touch code again. We all just become system designers.
I'm sure hole punchers also felt attachment to their craft.
A bash script launched as a service measures the battery level and reports it to mqtt. I have an automation in node-red (oh, I forgot I also installed node-red) that will listen to the battery status and turn off the plug if it gets over 50%, and turn on again when it gets under 40%.
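The plug automation is a simple hysteresis band. A minimal TypeScript sketch of the rule node-red applies (thresholds from the post; function and type names are mine):

```typescript
// Hysteresis controller for the smart plug: charge to 50%, drain to 40%.
// Keeping the battery cycling in a narrow band helps it last.
type PlugState = "on" | "off";

function nextPlugState(batteryPercent: number, current: PlugState): PlugState {
  if (batteryPercent > 50) return "off"; // above the ceiling: stop charging
  if (batteryPercent < 40) return "on";  // below the floor: resume charging
  return current;                        // inside the band: leave the plug alone
}
```

The band (rather than a single threshold) stops the plug from flapping on and off around one value.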
13.01.2026 16:36 — 👍 1 🔁 0 💬 0 📌 0

I think I've created the homelab version of a "useless machine". I got an old laptop, booted with Proxmox, installed Mosquitto, Telegraf, Grafana, and zigbee2mqtt, got myself a smart plug, and now I can automatically manage the battery level of the same laptop so that the battery lasts longer.

13.01.2026 16:34 — 👍 2 🔁 0 💬 1 📌 0

I'm quite remote, but luckily France have been exceptional at rolling out fibre and nearly every household has it now.

13.01.2026 16:32 — 👍 1 🔁 0 💬 0 📌 0

Finally managed to get rid of Starlink.

13.01.2026 15:01 — 👍 1 🔁 0 💬 1 📌 0

Oh christ lol

12.01.2026 15:53 — 👍 0 🔁 0 💬 0 📌 0

This is why I commit after every change when using agents.

12.01.2026 15:29 — 👍 2 🔁 0 💬 1 📌 0

Nice one, thanks!

12.01.2026 12:24 — 👍 0 🔁 0 💬 0 📌 0

Has anyone built git hosting with atproto yet?

12.01.2026 09:49 — 👍 1 🔁 0 💬 1 📌 0

All running on a 2015 MBP (16GB)
11.01.2026 16:18 — 👍 1 🔁 0 💬 0 📌 0
Proxmox
Mosquitto
zigbee2mqtt
Home assistant
Telegraf
InfluxDB
Grafana
Happy with this for now. Will monitor for a while and think about improvements later.
Just installed Proxmox on an old Macbook for my new home lab. How long do I have until my wife initiates divorce proceedings?
11.01.2026 12:56 — 👍 0 🔁 0 💬 0 📌 0

Me too - I was on Starlink until today!
08.01.2026 20:21 — 👍 2 🔁 0 💬 0 📌 0
Just got fibre to the house. It's ridiculous that I can get 3+ gbps here in remotest France. They've really done so well with the rollout.
France is 5th in the world for average Internet speeds. Chile is the only decently sized country above us.
Even if you don't want to vibe code it, I suggest talking it over with an LLM. It will help you crystallise what you want and what it will take.
06.01.2026 22:51 — 👍 1 🔁 0 💬 0 📌 0
The goal isn't raw performance (bundle size, etc). The goal is velocity and maintainability.
Most vibe-coded projects hit a wall when the code gets too complex for the LLM to understand. The framework is an architecture designed to keep the "context" clear, so LLMs can keep building correctly.
To make it practical, the framework uses a "Trace Middleware."
It logs every action and state change to a JSON file. If a test fails, the AI ingests this log file to replay the event timeline. Basically like a replay button for the LLM to see where it went wrong.
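The thread doesn't show the middleware itself, so here's a minimal Redux-style sketch of what a trace middleware could look like. The names, log shape, and in-memory store are assumptions, not the framework's actual API:

```typescript
// Minimal trace middleware sketch: record every action and the state it
// produced, so a failed run can be replayed entry by entry.
type Action = { type: string; payload?: unknown };
type TraceEntry = { action: Action; state: unknown; at: number };

const trace: TraceEntry[] = [];

// Standard Redux middleware signature: store => next => action.
const traceMiddleware =
  (store: { getState(): unknown }) =>
  (next: (a: Action) => unknown) =>
  (action: Action) => {
    const result = next(action);                              // reducers run first
    trace.push({ action, state: store.getState(), at: trace.length }); // then snapshot
    return result;
  };
```

In the real thing the `trace` array would presumably be flushed to the JSON file the post mentions; the point is that the entry is serialisable and ordered.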
There is also a strict rule: no business logic in useEffect.
React hooks can obfuscate logic inside the render loop, which is hard to trace for LLMs in complex apps. In the framework, logic lives in middleware. Data flow is linear. If something breaks, you (or the LLM) know exactly where it broke.
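A hedged sketch of the "effects in middleware" idea: each effect is keyed by the action type that triggers it and emits follow-up actions, so data flow stays linear. The action names and effect map are invented for illustration:

```typescript
// Effects live in middleware, keyed by action type, not inside components.
// Flow is linear: action in -> reducer -> optional effect -> action out.
type Action = { type: string; payload?: unknown };
type Dispatch = (a: Action) => void;

// Hypothetical effect map: one named effect per triggering action.
const effects: Record<string, (a: Action, dispatch: Dispatch) => void> = {
  "form/submitted": (a, dispatch) => {
    // Validate, then emit a follow-up action rather than mutating anything here.
    const ok = typeof a.payload === "string" && a.payload.length > 0;
    dispatch({ type: ok ? "form/accepted" : "form/rejected" });
  },
};

const effectMiddleware =
  (dispatch: Dispatch) => (next: Dispatch) => (action: Action) => {
    next(action);                               // reducers see the action first
    effects[action.type]?.(action, dispatch);   // then the effect, if one exists
  };
```

Because every step is an action, the trace log naturally captures the effect's output as well as its trigger.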
Testing is usually where AI builds fall apart. Selenium/Puppeteer are flaky and slow.
The framework creates a "Headless" environment. Since the Router is just Redux state, we can run full integration tests without a browser: fast execution, 100% coverage.
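If the router really is just Redux state, an integration test reduces to dispatch-and-assert. A sketch under that assumption (the reducer and action names are mine, not the framework's):

```typescript
// Router-as-state: navigation is an action, the current route is plain state.
// A routing "integration test" is then a function call, no browser required.
type RouteState = { path: string; params: Record<string, string> };
type RouteAction = {
  type: "route/navigate";
  path: string;
  params?: Record<string, string>;
};

function routeReducer(state: RouteState, action: RouteAction): RouteState {
  switch (action.type) {
    case "route/navigate":
      return { path: action.path, params: action.params ?? {} };
    default:
      return state;
  }
}
```

Compare this with driving a headless Chrome through the same navigation: no waits, no selectors, no flake.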
This structure creates a genuine feedback loop for Agents.
Usually, AI generates code and hopes it works. With Pi, because everything is deterministic, the AI can: dispatch an action > inspect the resulting state > read the error logs > fix its own code.
In the framework, Redux isn't just for data state - it's the whole application, e.g. routing, data, UX, UI, everything. All effects are triggered by actions.
This gives the AI a serialised, step-by-step audit trail of exactly what happened and when.
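Putting the loop together: a hypothetical harness that replays a list of actions, keeps the audit trail, and reports whether the final state matched expectations. Every name here is invented; it's a sketch of the feedback loop the thread describes, not the framework's API:

```typescript
// The agent loop as plain code: dispatch, inspect, and hand back the trail
// so the agent can see exactly where a run diverged.
type Action = { type: string };
type Step = { action: Action; state: unknown };

function runAndAudit(
  dispatchAll: (actions: Action[]) => Step[],    // deterministic replay function
  actions: Action[],
  expectFinal: (state: unknown) => boolean       // the agent's expectation
): { ok: boolean; trail: Step[] } {
  const trail = dispatchAll(actions);
  const final = trail[trail.length - 1]?.state;
  return { ok: expectFinal(final), trail };      // trail doubles as the error log
}
```

Because the trail is serialised and ordered, "read the error logs" is just a diff between the step the agent expected and the step that actually happened.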