## Pre-Requisites
- Create an account on Qualifire
- Get your API key and webhook URL from the Qualifire dashboard
## Quick Start
Use just two lines of code to instantly log your responses across all providers with Qualifire.

## Using with LiteLLM Proxy
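For the proxy, the callback is enabled in the proxy's `config.yaml`. The sketch below is a minimal example; the callback name `qualifire` and the model entry are assumptions for illustration — confirm the exact callback identifier in the Qualifire dashboard or the LiteLLM integration docs.

```yaml
# config.yaml — minimal sketch, not a verified configuration.
# "qualifire" as the callback name is an assumption; check the
# Qualifire dashboard / LiteLLM docs for the exact identifier.
model_list:
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o

litellm_settings:
  success_callback: ["qualifire"]
```

Start the proxy with this file (e.g. `litellm --config config.yaml`) and every successful LLM call routed through it will be forwarded to Qualifire.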
## Environment Variables
Both environment variables are required. Get your API key and webhook URL from the Qualifire dashboard.
| Variable | Description |
|---|---|
| `QUALIFIRE_API_KEY` | Your Qualifire API key for authentication |
| `QUALIFIRE_WEBHOOK_URL` | The Qualifire webhook endpoint URL from your dashboard |
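Both variables are read from the environment, so they can be set in your shell before starting your application or proxy. The values below are placeholders — substitute the real key and webhook URL from your Qualifire dashboard.

```shell
# Placeholder values; replace with the real ones from your dashboard.
export QUALIFIRE_API_KEY="your-qualifire-api-key"
export QUALIFIRE_WEBHOOK_URL="your-qualifire-webhook-url"
```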
## What Gets Logged?
### Request Data
Request messages, parameters, and model configuration sent to the LLM provider.
### Response Data
Response content, metadata, finish reason, and any tool calls returned by the model.
### Usage & Performance
Token usage statistics, latency metrics, cost data, and model information. The full LiteLLM Standard Logging Payload is sent on each successful LLM API call.
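As a rough illustration of the kind of record sent on each call — the field names below are assumptions for illustration, not the actual LiteLLM Standard Logging Payload schema; consult the LiteLLM reference for the real field names:

```python
# Illustrative sketch only. Field names are assumptions, not the exact
# LiteLLM Standard Logging Payload schema.
example_payload = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello"}],
    "response": {"choices": [{"finish_reason": "stop"}]},
    "usage": {"prompt_tokens": 5, "completion_tokens": 9, "total_tokens": 14},
    "response_cost": 0.00021,   # cost data per call
    "latency_ms": 840,          # latency metrics
}

total = example_payload["usage"]["total_tokens"]
```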
With this data in Qualifire, you can:
- Run evaluations to detect hallucinations, toxicity, and policy violations
- Set up guardrails to block or modify responses in real time
- View traces across your entire AI pipeline
- Track performance and quality metrics over time