Complete data flow from AI query to live instrumentation
The unified command-line interface for all Coral interactions. Includes the Coral Ask terminal assistant and manages the MCP Proxy.
Built into the Coral CLI, this lightweight bridge translates between the Model Context Protocol (stdio) used by AI assistants and the gRPC protocol used by the Colony server.
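A minimal sketch of the stdio half of such a bridge, assuming MCP's newline-delimited JSON-RPC framing. The `forwardToColony` function is a placeholder for the real Buf Connect call into the Colony, not part of the actual CLI.

```go
// Hypothetical sketch of the stdio side of an MCP-to-gRPC bridge. It reads
// newline-delimited JSON-RPC messages from the AI assistant on stdin, hands
// each one to a placeholder forwarder, and writes the response to stdout.
package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"os"
)

// jsonRPCMessage is the minimal shape of an MCP request/response envelope.
type jsonRPCMessage struct {
	JSONRPC string          `json:"jsonrpc"`
	ID      json.RawMessage `json:"id,omitempty"`
	Method  string          `json:"method,omitempty"`
	Params  json.RawMessage `json:"params,omitempty"`
	Result  json.RawMessage `json:"result,omitempty"`
}

// forwardToColony stands in for the real Buf Connect gRPC call to the Colony.
func forwardToColony(req jsonRPCMessage) jsonRPCMessage {
	return jsonRPCMessage{
		JSONRPC: "2.0",
		ID:      req.ID,
		Result:  json.RawMessage(`{"status":"stubbed"}`),
	}
}

func main() {
	in := bufio.NewScanner(os.Stdin)
	out := bufio.NewWriter(os.Stdout)
	defer out.Flush()

	for in.Scan() {
		var req jsonRPCMessage
		if err := json.Unmarshal(in.Bytes(), &req); err != nil {
			fmt.Fprintln(os.Stderr, "skipping malformed message:", err)
			continue
		}
		resp := forwardToColony(req)
		enc, _ := json.Marshal(resp)
		out.Write(append(enc, '\n'))
		out.Flush()
	}
}
```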
Central coordinator with MCP server, DuckDB storage, and AI orchestration. Provides the unified interface for all AI assistants to query distributed observability data.
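A sketch of how the Colony might answer an assistant query from its DuckDB store, using the open-source go-duckdb driver. The `service_summaries` table and its columns are assumptions for illustration, not the real schema.

```go
// Hypothetical sketch: answering "which services had the highest error rate
// in the last hour?" from a Colony-side DuckDB summary table. The schema is
// an illustrative assumption.
package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/marcboeker/go-duckdb" // registers the "duckdb" driver
)

func main() {
	db, err := sql.Open("duckdb", "colony.db")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	rows, err := db.Query(`
		SELECT service, max(error_rate) AS peak_error_rate
		FROM service_summaries
		WHERE window_end > now() - INTERVAL 1 HOUR
		GROUP BY service
		ORDER BY peak_error_rate DESC
		LIMIT 5`)
	if err != nil {
		log.Fatal(err)
	}
	defer rows.Close()

	for rows.Next() {
		var service string
		var rate float64
		if err := rows.Scan(&service, &rate); err != nil {
			log.Fatal(err)
		}
		fmt.Printf("%s\t%.2f%%\n", service, rate*100)
	}
}
```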
Global registry that coordinates WireGuard key exchange and NAT traversal, enabling secure agent-to-colony connections across any network.
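A sketch of what an agent's side of that key exchange could look like: generate a WireGuard keypair with the wgtypes package and post only the public key to the registry. The endpoint URL, agent identifier, and JSON payload are assumptions, not the real Coral API.

```go
// Hypothetical sketch: an agent generates a WireGuard keypair and registers
// its public key with the Discovery Service. The URL and payload shape are
// illustrative assumptions.
package main

import (
	"bytes"
	"encoding/json"
	"log"
	"net/http"

	"golang.zx2c4.com/wireguard/wgctrl/wgtypes"
)

func main() {
	// The private key never leaves the agent; only the public key is shared.
	priv, err := wgtypes.GeneratePrivateKey()
	if err != nil {
		log.Fatal(err)
	}

	payload, _ := json.Marshal(map[string]string{
		"agent_id":   "checkout-7f9c", // illustrative identifier
		"public_key": priv.PublicKey().String(),
	})

	resp, err := http.Post("https://discovery.example.com/v1/register",
		"application/json", bytes.NewReader(payload))
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()
	log.Println("registered, status:", resp.Status)
}
```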
Local observers that use eBPF, OTLP, and shell commands to gather telemetry. Run as a sidecar or node agent (DaemonSet) to monitor services, storing what they collect in local DuckDB.
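A sketch of the shell-command path of such an agent: run a diagnostic command on a schedule and persist its output into a local DuckDB table. The `diagnostics` table, the interval, and the chosen command are assumptions; the eBPF and OTLP collectors would feed the same local store.

```go
// Hypothetical sketch: one collector loop of an agent running a shell
// diagnostic and storing the result in local DuckDB. Table name, columns,
// and the command itself are illustrative assumptions.
package main

import (
	"database/sql"
	"log"
	"os/exec"
	"time"

	_ "github.com/marcboeker/go-duckdb" // registers the "duckdb" driver
)

func main() {
	db, err := sql.Open("duckdb", "coral-agent.db")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	if _, err := db.Exec(`CREATE TABLE IF NOT EXISTS diagnostics (
		ts TIMESTAMP, source VARCHAR, output VARCHAR)`); err != nil {
		log.Fatal(err)
	}

	// Example shell collector: capture open TCP sockets once a minute.
	for {
		out, err := exec.Command("ss", "-tnp").CombinedOutput()
		if err != nil {
			log.Println("collector error:", err)
		}
		if _, err := db.Exec(
			`INSERT INTO diagnostics VALUES (now(), 'shell:ss', ?)`,
			string(out)); err != nil {
			log.Println("insert error:", err)
		}
		time.Sleep(time.Minute)
	}
}
```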
Advanced features such as live probes and runtime instrumentation, enabling on-demand eBPF uprobes for deep debugging without redeploying.
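A sketch of how an on-demand uprobe can be attached with the cilium/ebpf library. The compiled BPF object, program name, target binary path, and symbol below are all assumptions standing in for whatever the agent actually ships.

```go
// Hypothetical sketch: attach a uprobe to a symbol in a running service's
// binary without redeploying it. The BPF object (probe.o), program name,
// target path, and symbol are illustrative assumptions.
package main

import (
	"log"
	"os"
	"os/signal"

	"github.com/cilium/ebpf"
	"github.com/cilium/ebpf/link"
)

func main() {
	// Load a pre-compiled BPF program (e.g. one that records call latency).
	coll, err := ebpf.LoadCollection("probe.o")
	if err != nil {
		log.Fatal(err)
	}
	defer coll.Close()

	prog := coll.Programs["trace_handle_request"]
	if prog == nil {
		log.Fatal("program not found in collection")
	}

	// Open the target binary and attach the probe to an exported symbol.
	ex, err := link.OpenExecutable("/usr/local/bin/checkout-service")
	if err != nil {
		log.Fatal(err)
	}
	up, err := ex.Uprobe("main.handleRequest", prog, nil)
	if err != nil {
		log.Fatal(err)
	}
	defer up.Close() // detaching removes the instrumentation cleanly

	// Keep the probe attached until interrupted.
	stop := make(chan os.Signal, 1)
	signal.Notify(stop, os.Interrupt)
	<-stop
}
```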
Colony exposes a standard MCP server that works with any AI assistant: Claude Desktop, VS Code, Cursor, or custom applications. The coral mcp proxy translates between the MCP protocol (stdio) and Buf Connect gRPC for type-safe communication.
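Because the Connect protocol also answers plain HTTP/JSON POSTs for unary calls, the gRPC-facing half of the bridge can be sketched without the generated client. The service path, method name, port, and request body below are assumptions, not Colony's actual API; a real client would use the generated, type-safe Connect stubs.

```go
// Hypothetical sketch of the gRPC-facing half of the proxy: a unary Connect
// call made as an HTTP POST with a JSON body. The path
// /coral.colony.v1.ColonyService/Query and the request fields are
// illustrative assumptions.
package main

import (
	"bytes"
	"fmt"
	"io"
	"log"
	"net/http"
)

func main() {
	body := bytes.NewBufferString(`{"question": "which services saw 5xx spikes in the last hour?"}`)

	req, err := http.NewRequest(http.MethodPost,
		"http://localhost:8080/coral.colony.v1.ColonyService/Query", body)
	if err != nil {
		log.Fatal(err)
	}
	// Connect's unary protocol accepts JSON when the content type says so.
	req.Header.Set("Content-Type", "application/json")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	out, _ := io.ReadAll(resp.Body)
	fmt.Println(resp.Status, string(out))
}
```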
All components connect via an encrypted WireGuard mesh coordinated by the Discovery Service. Works across any network boundary with no VPN configuration, firewall rules, or per-environment tooling required.
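A sketch of the other half of the mesh setup: once the Discovery Service hands back a peer's public key and endpoint, the local WireGuard device can be configured with the wgctrl package. The device name, mesh address, endpoint, and keepalive value are assumptions for illustration.

```go
// Hypothetical sketch: add a Colony peer (learned from the Discovery Service)
// to the local WireGuard device. Device name, addresses, and keepalive are
// illustrative assumptions; the peer's public key is passed as an argument.
package main

import (
	"log"
	"net"
	"os"
	"time"

	"golang.zx2c4.com/wireguard/wgctrl"
	"golang.zx2c4.com/wireguard/wgctrl/wgtypes"
)

func main() {
	client, err := wgctrl.New()
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Public key handed back by the Discovery Service.
	peerKey, err := wgtypes.ParseKey(os.Args[1])
	if err != nil {
		log.Fatal(err)
	}

	_, allowed, _ := net.ParseCIDR("10.42.0.1/32") // Colony's mesh address
	keepalive := 25 * time.Second                  // keeps NAT mappings alive

	cfg := wgtypes.Config{
		Peers: []wgtypes.PeerConfig{{
			PublicKey:                   peerKey,
			Endpoint:                    &net.UDPAddr{IP: net.ParseIP("203.0.113.7"), Port: 51820},
			AllowedIPs:                  []net.IPNet{*allowed},
			PersistentKeepaliveInterval: &keepalive,
		}},
	}

	if err := client.ConfigureDevice("coral0", cfg); err != nil {
		log.Fatal(err)
	}
	log.Println("peer added to coral0")
}
```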
Agents run alongside each service (as a sidecar or node agent) with local DuckDB storage for recent data (a ~6-hour rolling window). The Colony polls agents for summaries, which reduces network overhead while keeping detailed telemetry quickly queryable at the source.
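A sketch of that summary-pull pattern, with the Colony periodically fetching compact summaries from each agent over the mesh instead of streaming raw telemetry. The agent addresses, the /v1/summary path, and the interval are assumptions.

```go
// Hypothetical sketch: the Colony polls each agent's summary endpoint on a
// fixed interval. Addresses, path, and interval are illustrative assumptions.
package main

import (
	"io"
	"log"
	"net/http"
	"time"
)

func main() {
	agents := []string{
		"http://10.42.0.11:9090", // mesh addresses of two example agents
		"http://10.42.0.12:9090",
	}

	ticker := time.NewTicker(30 * time.Second)
	defer ticker.Stop()

	for range ticker.C {
		for _, addr := range agents {
			resp, err := http.Get(addr + "/v1/summary")
			if err != nil {
				log.Println("poll failed:", addr, err)
				continue
			}
			body, _ := io.ReadAll(resp.Body)
			resp.Body.Close()
			// In the real system the summary would be upserted into the
			// Colony's DuckDB store for assistant queries.
			log.Printf("summary from %s: %d bytes", addr, len(body))
		}
	}
}
```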
Start with Level 0 (zero-config eBPF), then optionally add Level 1 (OTLP for OpenTelemetry-instrumented apps), Level 2 (shell/exec diagnostics), and Level 3 (SDK Live Probes), where app owners integrate a language-specific SDK that opens a runtime bridge to the agent, enabling live probes and deep runtime control. A Level 1 setup is sketched below.
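For Level 1, an already-instrumented OpenTelemetry app only needs to point its OTLP exporter at the local agent. The sketch below uses the standard Go OTLP gRPC exporter; that the agent listens on localhost:4317 (the default OTLP gRPC port) is an assumption.

```go
// Sketch of Level 1 integration: an OpenTelemetry-instrumented Go app exports
// traces over OTLP/gRPC to the local agent. The localhost:4317 endpoint is an
// assumption about where the agent listens.
package main

import (
	"context"
	"log"

	"go.opentelemetry.io/otel"
	"go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracegrpc"
	sdktrace "go.opentelemetry.io/otel/sdk/trace"
)

func main() {
	ctx := context.Background()

	exp, err := otlptracegrpc.New(ctx,
		otlptracegrpc.WithEndpoint("localhost:4317"),
		otlptracegrpc.WithInsecure(),
	)
	if err != nil {
		log.Fatal(err)
	}

	tp := sdktrace.NewTracerProvider(sdktrace.WithBatcher(exp))
	defer tp.Shutdown(ctx)
	otel.SetTracerProvider(tp)

	// Any spans the app creates now flow to the local agent.
	_, span := otel.Tracer("checkout").Start(ctx, "demo-span")
	span.End()
}
```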