Ozzie_osman 5 hours ago

If the team is here, would love to understand how it compares to something like Amplitude's agent analytics (https://amplitude.com/ai-agents).

ttpost 5 hours ago | parent [-]

Yeah, this is a confusing one on wording. TLDR: Amplitude is analytics for your web/product data, Voker is analytics for your agent data.

We call Amplitude's feature an "AI Analyst". Essentially, Amplitude is layering an LLM copilot on top of their own product - so you don't have to click the buttons or write reports yourself to get insights.

We're an analytics platform built for tracking your agents. Different products solving different problems.

Not sure if this helps, but essentially Amplitude could use Voker to track how well their AI Analyst agent product is actually working!

adrianisbored 4 hours ago | parent [-]

I think the link above is off - they're thinking of Amplitude's not-yet-GA agent analytics, not their general analytics: https://amplitude.com/blog/agent-analytics

ttpost 3 hours ago | parent [-]

Thanks for clarifying - yes, this is a much closer analog to what we're building. That said, we haven't heard from anyone using it or tried it ourselves yet, so we can't speak to a quality comparison.

From what I can tell in the video, Amplitude still seems to be focusing on observability trace details (latency, tokens, etc.).

They don't seem to go as deep into (or at least don't highlight) the semantic data processing and detection we're doing (intents, corrections, resolutions), or into building higher-level classifications and insights from those signals. We're purpose-built for monitoring agent products, so we're striving to do more than visualizations - we intend to be best-in-category at the actual automated annotation and analysis of agent<>user interaction data.