Meet MOBILE.NODE — my portable AI sidekick running entirely offline on a small tablet with Ubuntu Desktop.
In this demonstration, ADI listens with Whisper, thinks with Llama 3.2 1B, and talks back using Coqui TTS, turning an ordinary tablet into a field-ready AI companion.
Featured capabilities:
- Voice-activated commands with instant AI responses
- Natural speech output for immersive interaction
- Fast on-device inference with Llama 3.2 1B
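The listen, think, speak pipeline behind these capabilities can be sketched as a simple loop. The code below is my own illustration, not the project's actual implementation: each stage is stubbed, and in the real setup `transcribe()` would call Whisper, `generate()` would run Llama 3.2 1B through a local runtime, and `speak()` would synthesize audio with Coqui TTS, all on-device.

```python
# Hypothetical sketch of an offline voice-assistant loop.
# All three stages are stubs standing in for the real models:
#   transcribe() -> Whisper (speech-to-text)
#   generate()   -> Llama 3.2 1B (local LLM inference)
#   speak()      -> Coqui TTS (text-to-speech)

def transcribe(audio: bytes) -> str:
    """Speech-to-text stage. Stubbed; Whisper would go here."""
    return "what's the weather like?"

def generate(prompt: str) -> str:
    """LLM stage. Stubbed; Llama 3.2 1B would go here."""
    return f"You asked: {prompt}"

def speak(text: str) -> bytes:
    """Text-to-speech stage. Stubbed; Coqui TTS would go here."""
    return text.encode("utf-8")

def voice_loop_step(audio_in: bytes) -> bytes:
    """One pass through the listen -> think -> speak pipeline."""
    text = transcribe(audio_in)
    reply = generate(text)
    return speak(reply)

audio_out = voice_loop_step(b"\x00\x01")
print(audio_out.decode("utf-8"))
```

Keeping the stages as separate functions mirrors how the models can be swapped independently, e.g. trading the LLM for a larger model without touching the audio stages.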
This is part of my ongoing theLAB project, where I experiment with distributed AI nodes, edge computing, and creative human-AI interfaces.