## Prerequisites

- Create an account on Qualifire
- Get your API key and webhook URL from the Qualifire dashboard
## Quick Start

Use just 2 lines of code to instantly log your responses across all providers with Qualifire.

### Using with LiteLLM Proxy
#### 1. Setup config.yaml

Configure the LiteLLM proxy with the Qualifire eval callback:
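A minimal `config.yaml` sketch: the model entry is illustrative (swap in any provider you use), and the callback is assumed to be registered under the name `qualifire`:

```yaml
model_list:
  - model_name: gpt-4o                 # illustrative model; use your own provider
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY

litellm_settings:
  callbacks: ["qualifire"]             # enable the Qualifire eval callback
```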
#### 2. Start the proxy
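A sketch of the launch step, assuming `litellm` is installed and the Qualifire environment variables (listed under Environment Variables) are already exported:

```shell
# Requires QUALIFIRE_API_KEY and QUALIFIRE_WEBHOOK_URL in the environment.
# By default the proxy listens on http://0.0.0.0:4000.
litellm --config config.yaml
```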
#### 3. Test it!
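A request to verify logging end to end; the port and the `sk-1234` master key are assumptions (common LiteLLM proxy defaults), and the model name must match an entry in your `config.yaml`:

```shell
curl -X POST http://0.0.0.0:4000/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-1234" \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello, what can you do?"}]
  }'
```

Once the call succeeds, the logged request should appear in your Qualifire dashboard.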
## Environment Variables

| Variable | Description |
|---|---|
| `QUALIFIRE_API_KEY` | Your Qualifire API key for authentication |
| `QUALIFIRE_WEBHOOK_URL` | The Qualifire webhook endpoint URL from your dashboard |
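Both variables can be exported in the shell before starting the proxy; the values below are placeholders, and the real ones come from your Qualifire dashboard:

```shell
export QUALIFIRE_API_KEY="qf-your-api-key"                    # placeholder
export QUALIFIRE_WEBHOOK_URL="https://your-webhook-url"       # placeholder
```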
## What Gets Logged?

The LiteLLM Standard Logging Payload is sent to your Qualifire endpoint on each successful LLM API call. This includes:

- Request messages and parameters
- Response content and metadata
- Token usage statistics
- Latency metrics
- Model information
- Cost data
With your responses logged in Qualifire, you can:

- Run evaluations to detect hallucinations, toxicity, and policy violations
- Set up guardrails to block or modify responses in real-time
- View traces across your entire AI pipeline
- Track performance and quality metrics over time