Agent Tracing
Gain deep visibility into your agent workflows and LLM interactions with OpenTelemetry-based tracing.
What is Agent Tracing?
Agent Tracing in Qualifire provides a detailed, end-to-end view of your AI agent’s operations. By leveraging the open standard of OpenTelemetry (OTLP), you can track every step of a complex workflow, from the initial prompt to the final output, including all intermediate LLM calls, tool usage, and decision-making processes.
This powerful observability feature allows you to:
- Debug complex agent behaviors: Pinpoint the exact source of errors, latency, or unexpected outputs.
- Analyze performance: Identify bottlenecks and optimize the performance of your agents.
- Monitor costs: Track token usage and cost for each step in a workflow.
- Ensure reliability: Understand how your agent chains and tool integrations are functioning in production.
How It Works
Qualifire’s tracing is built on the foundations of OpenTelemetry, an open-source observability framework.
Core Concepts: Traces and Spans
- Trace: Represents an entire end-to-end workflow or transaction. For example, a single user request to your chatbot would constitute one trace.
- Span: Represents a single operation or unit of work within a trace. A trace is composed of one or more spans. For example, an LLM call, a database query, or a tool execution would each be a span.
- Span Events: Timestamped records attached to a span, such as an exception or a log message, that provide additional context about what happened during the operation.
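The relationship between these concepts can be sketched in plain Python. The classes below are illustrative only, not part of the Qualifire SDK:

```python
import time
from dataclasses import dataclass, field

@dataclass
class Span:
    """One unit of work: an LLM call, a tool execution, a DB query."""
    name: str
    start: float = field(default_factory=time.time)
    events: list = field(default_factory=list)    # timestamped span events
    children: list = field(default_factory=list)  # nested child spans

    def add_event(self, message: str) -> None:
        self.events.append((time.time(), message))

@dataclass
class Trace:
    """One end-to-end workflow, e.g. a single chatbot request."""
    root: Span

# A single user request becomes one trace containing several spans.
root = Span("handle_user_request")
llm_call = Span("openai.chat.completions")
llm_call.add_event("prompt sent")
root.children.append(llm_call)
root.children.append(Span("tool:search"))
trace = Trace(root)
```

A real tracer also records end timestamps, status codes, and attributes on each span; this sketch only shows the parent/child nesting and span events.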
OpenTelemetry (OTLP) Endpoint
Qualifire exposes an OTLP-compatible endpoint at /telemetry/traces. This means you can use any OpenTelemetry-compliant client or SDK to send trace data directly to our platform, allowing for seamless integration with your existing observability setup.
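If you prefer to point an existing OpenTelemetry setup at this endpoint instead of using the Qualifire SDK, the configuration would look roughly like the sketch below. The base URL and the Authorization header format are assumptions; check your dashboard for the actual host and authentication scheme:

```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Base URL and auth header are placeholders -- substitute your own values.
exporter = OTLPSpanExporter(
    endpoint="https://<your-qualifire-host>/telemetry/traces",
    headers={"Authorization": "Bearer <YOUR_API_KEY>"},
)

provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)
```

`BatchSpanProcessor` buffers finished spans and exports them in the background, which is the usual choice for production; `SimpleSpanProcessor` exports each span immediately and is handy while debugging.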
Integrating Your Application
To get started, you can use our Python SDK, which simplifies the process of instrumenting your application and sending trace data to Qualifire.
1. Installation
First, install the Qualifire client SDK:
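Assuming the SDK is published under the qualifire package name mentioned below, installation with pip would be:

```shell
pip install qualifire
```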
2. Configuration
Next, configure the SDK in your application’s entrypoint. This will automatically instrument popular libraries like OpenAI and LangChain to send traces to Qualifire.
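The exact configuration call depends on the SDK version; the entrypoint and parameter names below are hypothetical, so consult the SDK reference for the real ones:

```python
import qualifire  # assumes the SDK is installed

# Hypothetical initialization call -- verify the actual entrypoint and
# parameter names against the Qualifire SDK documentation.
qualifire.init(api_key="YOUR_QUALIFIRE_API_KEY")
```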
The qualifire package automatically detects and instruments supported libraries such as OpenAI, LangChain, and Anthropic upon configuration.
Visualizing Traces
Once your application is instrumented, traces will appear in the Qualifire dashboard. Our UI provides a rich visualization of your traces, including:
- A hierarchical tree view of all spans within a trace.
- Detailed summaries of performance, cost, and governance metrics.
- In-depth analytics for each span, including attributes, events, and linked model invocations.