simple10 4 hours ago
I'm not actually reading the jsonl files. Agents Observe just uses hooks and sends all hook data to the server (running as a Docker container by default). Basic flow:

1. Plugin registers hooks that call a dump pipe script, which sends hook event data to the API server
2. Server parses events and stores them in SQLite by session and agent ID; it mostly just stores data, with minimal processing
3. Dashboard UI uses websockets to get real-time events from the server
4. UI does most of the heavy lifting: parsing events, grouping by agent/sub-agent, extracting tool calls to dynamically create filters, etc.

It took a lot of iterations to keep things simple and performant. You can easily modify the app/client UI code to fully customize the dashboard. The API app/server is intentionally unopinionated about how events will be rendered. This was by design, to add support for other agent events soon.
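The dump-pipe step above can be sketched roughly like this. A minimal Python version, assuming the hook receives its event as JSON on stdin and the server listens on a local port; the endpoint URL and field names (`session_id`, `hook_event_name`) are illustrative assumptions, not the plugin's actual schema:

```python
import json
import urllib.request

SERVER_URL = "http://localhost:8080/events"  # hypothetical endpoint


def parse_hook_event(raw: str) -> dict:
    """Parse the JSON a hook writes to stdin and key it by session/agent.

    Field names here are assumptions for illustration.
    """
    event = json.loads(raw)
    return {
        "session_id": event.get("session_id", "unknown"),
        "hook": event.get("hook_event_name", "unknown"),
        "payload": event,  # server stores the raw event; the UI does the parsing
    }


def send_event(record: dict) -> None:
    """POST the event to the API server; errors are swallowed so
    observability never blocks or fails the agent."""
    req = urllib.request.Request(
        SERVER_URL,
        data=json.dumps(record).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    try:
        urllib.request.urlopen(req, timeout=2)
    except OSError:
        pass  # dashboard being down must not break the session
```

Keeping the server dumb (store by session/agent ID, render nothing) and pushing parsing into the UI is what makes it easy to bolt on other agents' event formats later.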
ivaivanova 4 hours ago
The hooks approach seems much cleaner for real-time. Did you run into any issues with the blocking hooks degrading performance before you switched to background?
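For context, one common way to keep a hook from blocking (a sketch of the general technique, not necessarily what Agents Observe does) is to hand the payload to a detached child process and return immediately, so the agent never waits on the HTTP round-trip; `send_event.py` here is a hypothetical helper that does the actual POST:

```python
import subprocess
import sys


def dispatch_in_background(payload: bytes, cmd: list[str]) -> subprocess.Popen:
    """Spawn a detached sender process and stream the event to it.

    The hook returns as soon as the bytes hit the pipe; the child
    does the slow network I/O on its own. POSIX-style detachment.
    """
    proc = subprocess.Popen(
        cmd,
        stdin=subprocess.PIPE,
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
        start_new_session=True,  # child survives the hook process exiting
    )
    proc.stdin.write(payload)
    proc.stdin.close()  # signal EOF so the child can finish reading
    return proc


# Typical call from a hook wrapper (sender script name is hypothetical):
# dispatch_in_background(raw_stdin_bytes, [sys.executable, "send_event.py"])
```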