The edge proxy at agents.osohq.cloud sits between AI agents and their LLM providers. It captures the full content of agent sessions, including prompts, completions, tool calls, and metadata, and feeds them into Oso for monitoring and alerting.
What it captures
When agent traffic flows through the edge proxy, Oso records:

- Prompts sent to the LLM
- Completions returned by the LLM
- Tool calls and their parameters
- Model metadata (which model was used, token counts)
- User and session identifiers
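To make the captured fields concrete, here is what a single session event might look like. This is an illustrative sketch only; the field names and structure are assumptions, not Oso's actual schema.

```python
# Hypothetical shape of one captured session event. Field names are
# illustrative, mapped from the bullet list above; not Oso's real schema.
event = {
    "session_id": "sess_123",             # session identifier
    "user_id": "alice@example.com",       # user identifier
    "model": "claude-sonnet-4",           # which model was used
    "prompt": "Summarize this diff",      # prompt sent to the LLM
    "completion": "The diff renames...",  # completion returned by the LLM
    "tool_calls": [                       # tool calls and their parameters
        {"name": "read_file", "parameters": {"path": "src/main.py"}},
    ],
    "usage": {"input_tokens": 812, "output_tokens": 94},  # token counts
}
```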
Supported agents
The edge proxy works with any agent that allows configuring a custom base URL for its LLM provider. Currently supported:

| Agent | Environment |
|---|---|
| Claude Code | CLI |
| Codex | CLI |
| Gemini CLI | CLI |
| Claude Desktop | Desktop |
| Cursor | Desktop |
| Antigravity | Desktop |
Agents with hardcoded LLM endpoints that don’t support custom base URLs cannot use the edge proxy. For those agents, consider using the browser extension or EDR integration for discovery.
Setup
For step-by-step configuration instructions, see the Quickstart. The quickstart walks through:

- Getting your Environment ID from the Oso UI
- Configuring your agent to route through agents.osohq.cloud
- Verifying sessions appear in your dashboard
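As a rough sketch of the routing step: CLI agents that honor a base-URL environment variable can be pointed at the edge proxy before launch. The variable name depends on the agent (Claude Code, for instance, reads `ANTHROPIC_BASE_URL`); whether the Environment ID goes in the URL, a header, or elsewhere is an assumption here, so follow the Quickstart for the exact form.

```python
import os

# Sketch: override the agent's provider base URL so traffic flows
# through the edge proxy. The variable name varies by agent; Claude
# Code reads ANTHROPIC_BASE_URL. Exact URL shape is an assumption.
env = dict(os.environ)
env["ANTHROPIC_BASE_URL"] = "https://agents.osohq.cloud"

# The agent would then be launched with this environment, e.g.:
# subprocess.run(["claude"], env=env)
```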
When to use the edge proxy
The edge proxy provides full session-level monitoring for agents that support custom endpoints, capturing every prompt, completion, and tool call. For agents that cannot be routed through the proxy, you can use EDR and the browser extension for discovery.

| Integration | Discovery | Session monitoring |
|---|---|---|
| Edge Proxy | Agents routed through proxy | Full session content |
| Browser Extension | Browser-based AI tools | Full session content for supported web apps |
| EDR | All agents installed on endpoints | No session content |
| Tailscale Aperture | Network-level agent traffic | Depends on configuration |
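The table above amounts to a simple decision rule, which the sketch below distills. The predicates are simplifications (an agent either supports a custom base URL or it doesn't, and is either browser-based or not); real deployments often combine several integrations, and Tailscale Aperture is omitted since its coverage depends on configuration.

```python
def pick_integration(custom_base_url: bool, browser_based: bool) -> str:
    """Rough decision rule distilled from the comparison table above."""
    if custom_base_url:
        return "Edge Proxy"         # full session content
    if browser_based:
        return "Browser Extension"  # full content for supported web apps
    return "EDR"                    # discovery only, no session content

print(pick_integration(True, False))   # → Edge Proxy
print(pick_integration(False, False))  # → EDR
```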