The Private AI Lab
The Private AI Lab is a podcast where we explore the future of Artificial Intelligence behind the firewall. Hosted by Johan van Amersfoort, each episode invites industry experts, innovators, and thought leaders to discuss how Private AI is reshaping enterprises, technology, and society.
From data sovereignty to air-gapped deployments, from GPUs to governance – this podcast uncovers the real-world experiments, failures, and breakthroughs that define the era of Private AI.


🎙️ New episode every month.
🌐 More at Johan.ml
In this episode, Johan is joined by long-time colleague Sander Hardewijnen to pull back the curtain on Project Q9 — an ambitious internal project at ITQ that combines a Unitree Go2 Pro robotic dog, private AI, computer vision, and modern cloud-native development practices.
From gesture recognition trained on 30,000 hand images to a Skynet-obsessed dog posting on LinkedIn, this episode is a deep dive into what happens when you give great engineers a suitcase full of robot and say, "see where it goes."
The conversation also covers the state of open-source AI coding assistants (OpenClaw vs NemoClaw), the realities of vibe coding in a production context, and what partner platforms like Red Hat OpenShift AI and SUSE AI actually enable beyond conversational AI.
Sander's blog: https://harre.dev
Q9's LinkedIn page: https://www.linkedin.com/in/q9-the-dog-2206863b1/
Chapters
00:00 Welcome & Introduction
01:20 Icebreaker: Best AI Fail
02:12 NemoClaw vs OpenClaw: Security & Sandboxing
04:49 Running OpenClaw in an Isolated VLAN
05:32 OpenClaw as a Personal Assistant: Home Assistant, News & Efteling API
09:11 OpenClaw in the ITQ WhatsApp Group
11:10 Introducing Project Q9
13:22 Why Robotics + Cloud-Native + AI?
16:16 Technical Anatomy of Q9
18:30 Partner Platform Showcase: Broadcom, Red Hat & SUSE
19:20 Debunking the GPU Myth
23:05 Building the Gesture Recognition Model
25:00 Training Progression: Epochs, Accuracy & Landmarks
30:21 Hand Landmark Detection & the Gesture Pipeline
32:34 Crowd Reactions at KubeCon
33:57 Fine-Tuning vs Training From Scratch
36:16 Use Case 2: Q9's LLM-Powered LinkedIn Persona
40:41 Running LLMs on Partner Inference Platforms
42:26 What's Next for Q9?
43:44 Digital Twins in NVIDIA Omniverse + ROS2
45:10 Key Takeaways
48:53 Responsible Vibe Coding
49:58 Open-Sourcing Q9 — Coming Soon

