Agent Visualizer
A component library for visualizing audio from LiveKit agents. It later evolved into Agents UI on shadcn with a larger team.
Context
In the early days of voice agents at LiveKit, we prototyped visualizations across different models and contexts. As more customers built voice experiences, we saw a need to make audio visualization easier for developers.
Goals
- Make it easy for developers to visualize audio from their agents
- Provide an expressive component that could easily be tailored to their brand
- Build a component representing various agent states out of the box
- Scope something easily implementable across all Client SDKs
Approach
Scoping with engineering — The first step was working with engineering to frame the problem. We aligned on a few directions worth exploring, with the understanding that rapid prototyping would help us converge.

Sandbox + API design — I built a sandbox environment that could simulate a range of agent states — connecting, thinking, speaking, idle — and started prototyping. This included designing a component API flexible enough to support styling, theming, and different brand contexts.
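To make the API-design step concrete, here is a minimal sketch of what such a component interface and state simulator might look like. The type and prop names are illustrative assumptions, not the shipped LiveKit interface.

```typescript
// Hypothetical component API sketch — names are illustrative, not the
// shipped interface.
type AgentState = "connecting" | "thinking" | "speaking" | "idle";

interface VisualizerProps {
  state: AgentState;                              // drives the animation pattern
  barCount?: number;                              // resolution of the visual
  colors?: { active: string; inactive: string };  // brand/theming hooks
}

// A sandbox can cycle through every state to exercise each animation path
// without a live agent connection.
function simulateStates(
  states: AgentState[],
  onState: (s: AgentState) => void,
): void {
  for (const s of states) onState(s);
}
```

Keeping styling hooks (`barCount`, `colors`) separate from behavior (`state`) is what lets one component serve many brand contexts.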
Learning the audio domain — A key challenge was making audio look good. I experimented with frequency passes, bucketing amplitude into ranges, and mapping those values into visuals. The translation from raw audio signals into a smooth, expressive UI required iteration and tuning.
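The bucketing step described above can be sketched as a pure function: average FFT magnitudes into a small number of buckets, one per bar. In a browser the input would typically come from something like the Web Audio API's `AnalyserNode.getByteFrequencyData`; here it is just an array of magnitudes in the 0–255 range. This is a simplified illustration of the technique, not the production code.

```typescript
// Bucket raw frequency magnitudes (0-255) into `barCount` normalized
// bar heights (0..1), averaging each slice of the spectrum.
function bucketAmplitudes(frequencies: number[], barCount: number): number[] {
  const bucketSize = Math.ceil(frequencies.length / barCount);
  const bars: number[] = [];
  for (let i = 0; i < barCount; i++) {
    const slice = frequencies.slice(i * bucketSize, (i + 1) * bucketSize);
    const avg = slice.length
      ? slice.reduce((sum, v) => sum + v, 0) / slice.length
      : 0;
    bars.push(avg / 255); // normalize for bar height
  }
  return bars;
}
```

In practice, smoothing each bar toward its target value across frames (rather than jumping to it) is what makes the motion feel expressive rather than jittery — that tuning was a large part of the iteration.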
Grid-based component system — Once the core translation worked, I built a scalable grid system where points could be toggled on/off at adjustable speeds, giving us a flexible foundation to represent nuanced states.
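A minimal sketch of how such a grid might be modeled, assuming a flat boolean array of cells; the class and method names are hypothetical.

```typescript
// Hypothetical grid model: a flat boolean array where individual points
// can be toggled on/off; an animation loop would call toggle() at a
// per-state speed to produce different patterns.
class DotGrid {
  private cells: boolean[];

  constructor(public rows: number, public cols: number) {
    this.cells = new Array(rows * cols).fill(false);
  }

  toggle(row: number, col: number): void {
    const i = row * this.cols + col;
    this.cells[i] = !this.cells[i];
  }

  isOn(row: number, col: number): boolean {
    return this.cells[row * this.cols + col];
  }

  activeCount(): number {
    return this.cells.filter(Boolean).length;
  }
}
```

Because every state (thinking, speaking, idle) reduces to a pattern of toggles plus a speed, one primitive can represent many nuanced behaviors.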
Testing and Refinement
Prototypes were shared across the company and tested in real environments. To validate further, I shipped a version of the visualizer to the LiveKit homepage. Through this exploration, we narrowed focus to the BarVisualizer — it struck the best balance of simplicity and expressiveness.
Impact
- Community adoption: Developers used the visualizers in prototypes, demos, and production agents.
- Brand recognition: The BarVisualizer became a recognizable part of LiveKit's agent experience.
- Internal velocity: Teams across marketing, sales, and solutions engineering could spin up polished voice demos quickly.
- Foundation for multimodal: The work laid groundwork for extending into richer multimodal applications beyond audio.
Phase 2
Building on the original thinking, we invested in evolving our agent components, which led to the development of Agents UI — integrating more visualizers and components to help developers build world-class AI experiences.
I played a smaller role in design and engineering this time around, so I want to give a shoutout to the team who led the work. Together with another design engineer on my team and our brand team, we launched Agents UI on shadcn.
- Product lead: Matt Herzog
- Engineering lead: Thomas Yuill
- Brand lead: Austin Holmes-Couillard