The Vercel AI SDK is a powerful toolkit for building AI-powered applications. Qualifire integrates seamlessly with the Vercel AI SDK by acting as a proxy layer, enabling you to add guardrails, evaluations, and observability to your AI applications.
Create an account and generate an API key

Log into Qualifire or create an account. Once you have an account, you can generate an API key.

Set environment variables

Set the following environment variables in your project:

```bash
QUALIFIRE_API_KEY=<your Qualifire API key>
QUALIFIRE_BASE_URL=https://proxy.qualifire.ai/api/providers/openai
```

The base URL shown here is for OpenAI; if you use Anthropic or Google, point QUALIFIRE_BASE_URL at the matching provider path instead.
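Because a missing key silently produces an empty `X-Qualifire-API-Key` header, it can help to fail fast at startup. Below is a minimal, illustrative sketch — the `requireEnv` helper is our own, not part of the Qualifire or Vercel AI SDKs; in practice you would pass `process.env`:

```typescript
// Illustrative helper: throw immediately if a required variable is missing.
function requireEnv(
  env: Record<string, string | undefined>,
  name: string
): string {
  const value = env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Example with an explicit env object; in a real app, pass process.env.
const env = {
  QUALIFIRE_API_KEY: "qf-example-key",
  QUALIFIRE_BASE_URL: "https://proxy.qualifire.ai/api/providers/openai",
};
const apiKey = requireEnv(env, "QUALIFIRE_API_KEY");
const baseUrl = requireEnv(env, "QUALIFIRE_BASE_URL");
```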
Install the Vercel AI SDK

Install the Vercel AI SDK and your preferred provider SDK:

```bash
# OpenAI
npm install ai @ai-sdk/openai zod

# Anthropic
npm install ai @ai-sdk/anthropic zod

# Google
npm install ai @ai-sdk/google zod
```
Configure the provider client with Qualifire

Configure your AI provider to route requests through Qualifire by setting the baseURL and adding the Qualifire API key header:

OpenAI:

```typescript
import { createOpenAI } from "@ai-sdk/openai";

const openaiClient = createOpenAI({
  baseURL:
    process.env.QUALIFIRE_BASE_URL ||
    "https://proxy.qualifire.ai/api/providers/openai",
  headers: {
    "X-Qualifire-API-Key": process.env.QUALIFIRE_API_KEY || "",
  },
});

// Use with any model
const model = openaiClient("gpt-4o");
```

Anthropic:

```typescript
import { createAnthropic } from "@ai-sdk/anthropic";

const anthropicClient = createAnthropic({
  baseURL:
    process.env.QUALIFIRE_BASE_URL ||
    "https://proxy.qualifire.ai/api/providers/anthropic",
  headers: {
    "X-Qualifire-API-Key": process.env.QUALIFIRE_API_KEY || "",
  },
});

// Use with any model
const model = anthropicClient("claude-sonnet-4-20250514");
```

Google:

```typescript
import { createGoogleGenerativeAI } from "@ai-sdk/google";

const googleClient = createGoogleGenerativeAI({
  baseURL:
    process.env.QUALIFIRE_BASE_URL ||
    "https://proxy.qualifire.ai/api/providers/google",
  headers: {
    "X-Qualifire-API-Key": process.env.QUALIFIRE_API_KEY || "",
  },
});

// Use with any model
const model = googleClient("gemini-2.0-flash");
```
Usage Examples
Basic Text Generation

OpenAI:

```typescript
import { generateText } from "ai";
import { createOpenAI } from "@ai-sdk/openai";

const openaiClient = createOpenAI({
  baseURL: "https://proxy.qualifire.ai/api/providers/openai",
  headers: {
    "X-Qualifire-API-Key": process.env.QUALIFIRE_API_KEY || "",
  },
});

const { text } = await generateText({
  model: openaiClient("gpt-4o"),
  prompt: "What is the capital of France?",
});

console.log(text);
```

Anthropic:

```typescript
import { generateText } from "ai";
import { createAnthropic } from "@ai-sdk/anthropic";

const anthropicClient = createAnthropic({
  baseURL: "https://proxy.qualifire.ai/api/providers/anthropic",
  headers: {
    "X-Qualifire-API-Key": process.env.QUALIFIRE_API_KEY || "",
  },
});

const { text } = await generateText({
  model: anthropicClient("claude-sonnet-4-20250514"),
  prompt: "What is the capital of France?",
});

console.log(text);
```

Google:

```typescript
import { generateText } from "ai";
import { createGoogleGenerativeAI } from "@ai-sdk/google";

const googleClient = createGoogleGenerativeAI({
  baseURL: "https://proxy.qualifire.ai/api/providers/google",
  headers: {
    "X-Qualifire-API-Key": process.env.QUALIFIRE_API_KEY || "",
  },
});

const { text } = await generateText({
  model: googleClient("gemini-2.0-flash"),
  prompt: "What is the capital of France?",
});

console.log(text);
```
Streaming Responses

OpenAI:

```typescript
import { streamText } from "ai";
import { createOpenAI } from "@ai-sdk/openai";

const openaiClient = createOpenAI({
  baseURL: "https://proxy.qualifire.ai/api/providers/openai",
  headers: {
    "X-Qualifire-API-Key": process.env.QUALIFIRE_API_KEY || "",
  },
});

const result = streamText({
  model: openaiClient("gpt-4o"),
  prompt: "Write a short poem about coding.",
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```

Anthropic:

```typescript
import { streamText } from "ai";
import { createAnthropic } from "@ai-sdk/anthropic";

const anthropicClient = createAnthropic({
  baseURL: "https://proxy.qualifire.ai/api/providers/anthropic",
  headers: {
    "X-Qualifire-API-Key": process.env.QUALIFIRE_API_KEY || "",
  },
});

const result = streamText({
  model: anthropicClient("claude-sonnet-4-20250514"),
  prompt: "Write a short poem about coding.",
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```

Google:

```typescript
import { streamText } from "ai";
import { createGoogleGenerativeAI } from "@ai-sdk/google";

const googleClient = createGoogleGenerativeAI({
  baseURL: "https://proxy.qualifire.ai/api/providers/google",
  headers: {
    "X-Qualifire-API-Key": process.env.QUALIFIRE_API_KEY || "",
  },
});

const result = streamText({
  model: googleClient("gemini-2.0-flash"),
  prompt: "Write a short poem about coding.",
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```
Here’s a complete example of building an AI agent with tools using the Vercel AI SDK and Qualifire:
OpenAI:

```typescript
import { tool, UIMessage, stepCountIs, createAgentUIStreamResponse } from "ai";
import { Experimental_Agent as Agent } from "ai";
import { createOpenAI } from "@ai-sdk/openai";
import { z } from "zod";

export const maxDuration = 30;

export async function POST(req: Request) {
  const { messages }: { messages: UIMessage[] } = await req.json();

  const openaiClient = createOpenAI({
    baseURL:
      process.env.QUALIFIRE_BASE_URL ||
      "https://proxy.qualifire.ai/api/providers/openai",
    headers: {
      "X-Qualifire-API-Key": process.env.QUALIFIRE_API_KEY || "",
    },
  });

  const myAgent = new Agent({
    model: openaiClient("gpt-4o"),
    instructions: `You are a helpful assistant that can look up weather information.`,
    stopWhen: stepCountIs(5),
    tools: {
      getWeather: tool({
        description: "Get the current weather for a location",
        inputSchema: z.object({
          city: z.string().describe("The city to get weather for"),
        }),
        execute: async ({ city }) => {
          // Simulated weather data
          return { temperature: 72, conditions: "sunny", city };
        },
      }),
    },
    toolChoice: "auto",
  });

  return await createAgentUIStreamResponse({
    agent: myAgent,
    uiMessages: messages,
  });
}
```

Anthropic:

```typescript
import { tool, UIMessage, stepCountIs, createAgentUIStreamResponse } from "ai";
import { Experimental_Agent as Agent } from "ai";
import { createAnthropic } from "@ai-sdk/anthropic";
import { z } from "zod";

export const maxDuration = 30;

export async function POST(req: Request) {
  const { messages }: { messages: UIMessage[] } = await req.json();

  const anthropicClient = createAnthropic({
    baseURL:
      process.env.QUALIFIRE_BASE_URL ||
      "https://proxy.qualifire.ai/api/providers/anthropic",
    headers: {
      "X-Qualifire-API-Key": process.env.QUALIFIRE_API_KEY || "",
    },
  });

  const myAgent = new Agent({
    model: anthropicClient("claude-sonnet-4-20250514"),
    instructions: `You are a helpful assistant that can look up weather information.`,
    stopWhen: stepCountIs(5),
    tools: {
      getWeather: tool({
        description: "Get the current weather for a location",
        inputSchema: z.object({
          city: z.string().describe("The city to get weather for"),
        }),
        execute: async ({ city }) => {
          // Simulated weather data
          return { temperature: 72, conditions: "sunny", city };
        },
      }),
    },
    toolChoice: "auto",
  });

  return await createAgentUIStreamResponse({
    agent: myAgent,
    uiMessages: messages,
  });
}
```

Google:

```typescript
import { tool, UIMessage, stepCountIs, createAgentUIStreamResponse } from "ai";
import { Experimental_Agent as Agent } from "ai";
import { createGoogleGenerativeAI } from "@ai-sdk/google";
import { z } from "zod";

export const maxDuration = 30;

export async function POST(req: Request) {
  const { messages }: { messages: UIMessage[] } = await req.json();

  const googleClient = createGoogleGenerativeAI({
    baseURL:
      process.env.QUALIFIRE_BASE_URL ||
      "https://proxy.qualifire.ai/api/providers/google",
    headers: {
      "X-Qualifire-API-Key": process.env.QUALIFIRE_API_KEY || "",
    },
  });

  const myAgent = new Agent({
    model: googleClient("gemini-2.0-flash"),
    instructions: `You are a helpful assistant that can look up weather information.`,
    stopWhen: stepCountIs(5),
    tools: {
      getWeather: tool({
        description: "Get the current weather for a location",
        inputSchema: z.object({
          city: z.string().describe("The city to get weather for"),
        }),
        execute: async ({ city }) => {
          // Simulated weather data
          return { temperature: 72, conditions: "sunny", city };
        },
      }),
    },
    toolChoice: "auto",
  });

  return await createAgentUIStreamResponse({
    agent: myAgent,
    uiMessages: messages,
  });
}
```
Key Configuration
The key to integrating Qualifire with the Vercel AI SDK is configuring your provider client with:
- `baseURL`: point to the Qualifire proxy endpoint for your provider
- `headers`: include the `X-Qualifire-API-Key` header with your API key
| Provider | Base URL |
|---|---|
| OpenAI | https://proxy.qualifire.ai/api/providers/openai |
| Anthropic | https://proxy.qualifire.ai/api/providers/anthropic |
| Google | https://proxy.qualifire.ai/api/providers/google |
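Since the three configurations differ only in the final path segment of the base URL, you can derive the proxy settings from a single small helper. This is a sketch of our own, not part of the Qualifire or Vercel AI SDKs; the result is the options object you pass to `createOpenAI`, `createAnthropic`, or `createGoogleGenerativeAI`:

```typescript
type QualifireProvider = "openai" | "anthropic" | "google";

// Build the Qualifire proxy settings (baseURL + auth header) for a provider.
function qualifireConfig(provider: QualifireProvider, apiKey: string) {
  return {
    baseURL: `https://proxy.qualifire.ai/api/providers/${provider}`,
    headers: { "X-Qualifire-API-Key": apiKey },
  };
}

const cfg = qualifireConfig("anthropic", "qf-example-key");
```

For example, `createAnthropic(qualifireConfig("anthropic", process.env.QUALIFIRE_API_KEY || ""))` is equivalent to the Anthropic configuration shown earlier.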
All of your AI requests are now routed through Qualifire, enabling guardrails, evaluations, tracing, and other observability features without any further changes to your application logic.
Next Steps