What is Vercel AI SDK?
The Vercel AI SDK is a TypeScript toolkit for building AI-powered user interfaces. It works with React, Next.js, Svelte, and Vue, and handles the hard parts of AI in the browser: streaming responses token by token, managing chat state, calling tools, and rendering structured output.
If you're building a chatbot, an AI writing assistant, or any web app that talks to an LLM, this is probably the best developer experience available right now. It has hooks like useChat and useCompletion that handle the streaming plumbing, plus server-side utilities like generateText and streamText for your API routes.
The SDK uses a provider pattern. You create an OpenAI provider with createOpenAI, pass it your EUrouter base URL, and every model call from that provider goes through European infrastructure. Your React components don't change at all.
Quick to integrate
A few lines of code is all it takes. Swap your base URL and you are routed through EUrouter.
import { createOpenAI } from "@ai-sdk/openai";
import { generateText } from "ai";

const eurouter = createOpenAI({
  apiKey: "eur-...",
  baseURL: "https://api.eurouter.ai/v1",
});

const { text } = await generateText({
  model: eurouter("gpt-4o"),
  prompt: "Explain GDPR",
});
console.log(text);

Get started in minutes
Follow these steps to connect your application to EUrouter.
Install the AI SDK
Install the Vercel AI SDK and the OpenAI provider.
npm install ai @ai-sdk/openai

Create the provider
Use createOpenAI to configure EUrouter as the base URL.
import { createOpenAI } from "@ai-sdk/openai";

const eurouter = createOpenAI({
  apiKey: process.env.EUROUTER_API_KEY,
  baseURL: "https://api.eurouter.ai/v1",
});

Build your AI feature
Use generateText, streamText, or the useChat hook. All calls flow through EUrouter.
import { streamText } from "ai";

const result = streamText({
  model: eurouter("gpt-4o"),
  prompt: "Write a summary of EU AI regulations",
});

Why use Vercel AI SDK with EUrouter
Streaming UI with EU compliance
The AI SDK is built around streaming. Tokens arrive one at a time and your React components update in real time. With EUrouter, those streams flow through EU infrastructure end to end. The user experience is identical, but the data path stays within European jurisdiction.
- useChat and useCompletion hooks work with no modifications
- Server-sent events stream through EUrouter with the same low latency
- React Server Components and server actions are both supported
- Compatible with edge runtimes for globally distributed Next.js apps
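In practice, the points above boil down to a thin server route that useChat talks to. Here is a sketch for a Next.js App Router project; the route path (app/api/chat/route.ts) and the v4-style APIs (useChat's default endpoint, toDataStreamResponse) are assumptions, so check them against your AI SDK version:

```typescript
// app/api/chat/route.ts
// Sketch: the default endpoint that useChat posts messages to.
import { createOpenAI } from "@ai-sdk/openai";
import { streamText } from "ai";

const eurouter = createOpenAI({
  apiKey: process.env.EUROUTER_API_KEY,
  baseURL: "https://api.eurouter.ai/v1",
});

export async function POST(req: Request) {
  const { messages } = await req.json();

  // streamText returns immediately; tokens stream as they arrive.
  const result = streamText({
    model: eurouter("gpt-4o"),
    messages,
  });

  // Streams server-sent events in the format useChat consumes,
  // so the client component needs no changes.
  return result.toDataStreamResponse();
}
```

The client side is unchanged: a component calling useChat with no arguments will post to /api/chat and render the stream as it arrives.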
Multi-provider for Next.js
A typical Next.js app might use GPT-4o for a chatbot, Claude for long-form writing, and a fast model for autocomplete suggestions. With EUrouter, you create one provider and switch models by passing a different model string. No separate API keys, no extra provider instances.
- One createOpenAI call gives you access to all models through EUrouter
- Switch models per route, per feature, or per user preference
- Unified error handling: every provider returns the same response format
- Smart routing can automatically select the best model for each request
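One way to organize per-feature model selection is a small lookup that feeds a single provider instance. The model IDs below are illustrative placeholders, not a statement of EUrouter's catalog:

```typescript
// Hypothetical per-feature model map. One provider instance serves all
// of them, because every model sits behind the same OpenAI-compatible API.
type Feature = "chat" | "writing" | "autocomplete";

const MODEL_FOR: Record<Feature, string> = {
  chat: "gpt-4o",                 // illustrative IDs only
  writing: "claude-3.5-sonnet",
  autocomplete: "gpt-4o-mini",
};

function modelFor(feature: Feature): string {
  return MODEL_FOR[feature];
}
```

A call site then reads `model: eurouter(modelFor("writing"))`, and changing which model backs a feature is a one-line edit to the map.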
Tool calling in Europe
The AI SDK has strong support for tool calling: you define tools with Zod schemas, the model decides when to call them, and the SDK handles the back-and-forth. With EUrouter, the model orchestration happens in Europe while your tool execution stays in your own infrastructure.
- AI SDK tool definitions with Zod schemas work without any changes
- Multi-step tool calling (model calls tool, gets result, calls another) is fully supported
- Tool execution happens on your servers; only the LLM reasoning goes through EUrouter
- The underlying model provider is abstracted away, so you can switch models freely