
LiteLLM with EU data residency.

Set api_base to EUrouter in your litellm.completion() calls, and every model request routes through Europe. Works in both direct mode and proxy mode.

GDPR compliant
Unified API
EU infrastructure
Visit LiteLLM

What is LiteLLM?

LiteLLM is a Python library that gives you a single interface for calling 100+ LLM providers. Instead of learning each provider's SDK (OpenAI, Anthropic, Cohere, Bedrock, etc.), you use litellm.completion() with a consistent API. It handles the translation between formats behind the scenes.
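To make that "translation between formats" concrete, here is a stdlib-only sketch of the idea: providers return responses in different shapes, and one adapter layer normalizes them into a single format. This is illustrative only, not LiteLLM's actual internals.

```python
def openai_style_response(text: str) -> dict:
    # OpenAI-style payloads put text under choices[0].message.content.
    return {"choices": [{"message": {"role": "assistant", "content": text}}]}

def anthropic_style_response(text: str) -> dict:
    # Anthropic-style payloads put text under "content" blocks.
    return {"content": [{"type": "text", "text": text}]}

def normalize(provider: str, raw: dict) -> str:
    """Extract the assistant text regardless of provider format."""
    if provider == "openai":
        return raw["choices"][0]["message"]["content"]
    if provider == "anthropic":
        return raw["content"][0]["text"]
    raise ValueError(f"unknown provider: {provider}")

print(normalize("openai", openai_style_response("Hello")))     # Hello
print(normalize("anthropic", anthropic_style_response("Hello")))  # Hello
```

Your application only ever sees the normalized shape, which is why swapping providers doesn't ripple through your code.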

LiteLLM also works as a proxy server. You run it locally or on your infrastructure, and it exposes an OpenAI-compatible endpoint that routes to whatever provider you've configured. This is useful for teams that want a single endpoint for all their model access.

When you point LiteLLM at EUrouter, you're adding EU data residency to that unified interface. Set api_base to EUrouter, and every call through LiteLLM goes through European infrastructure. If you're already using LiteLLM's proxy mode, you configure EUrouter as the upstream endpoint and get compliance for free.

Quick to integrate

A few lines of code is all it takes. Swap your base URL and you are routed through EUrouter.

```python
import litellm

response = litellm.completion(
    model="openai/gpt-4o",
    messages=[{"role": "user", "content": "Hello"}],
    api_key="eur-...",
    api_base="https://api.eurouter.ai/v1",
)
print(response.choices[0].message.content)
```

Get started in minutes

Follow these steps to connect your application to EUrouter.

1. Get your EUrouter API key

Sign up and create an API key from your dashboard.

Sign up for free
2. Install LiteLLM

Install the LiteLLM Python package.

```shell
pip install litellm
```
3. Configure EUrouter as the base

Set the api_base to EUrouter when calling litellm.completion.

```python
import litellm

response = litellm.completion(
    model="openai/gpt-4o",
    messages=[{"role": "user", "content": "Hello"}],
    api_key="eur-...",
    api_base="https://api.eurouter.ai/v1",
)
```
4. Use any model

Switch models by changing the model parameter. EUrouter routes everything through EU infrastructure.

```python
# Use Mistral
response = litellm.completion(
    model="openai/mistral-large-latest",
    messages=[{"role": "user", "content": "Hello"}],
    api_key="eur-...",
    api_base="https://api.eurouter.ai/v1",
)
```

Why use LiteLLM with EUrouter

Unified interface, EU routing

LiteLLM already solves the "every provider has a different API" problem. EUrouter solves the "our data can't leave Europe" problem. Together, you get one function call for all models, with every request going through EU infrastructure. Your application code stays clean and your compliance team stays happy.

  • Every model available through EUrouter works through LiteLLM's unified API
  • One API key, one response format, one billing dashboard
  • Consistent error handling across all providers
  • EU-based routing for every call regardless of which model you choose
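The "consistent error handling" point can be sketched in plain Python: failures from different providers get mapped onto one exception hierarchy, so calling code handles a single set of error types. The class names below are hypothetical and chosen for illustration; LiteLLM itself maps provider errors to OpenAI-style exceptions.

```python
# Hypothetical unified exception hierarchy -- illustrates the pattern,
# not LiteLLM's actual classes.
class UnifiedLLMError(Exception):
    """Base class for all provider errors."""

class AuthError(UnifiedLLMError):
    """401/403 from any provider."""

class RateLimitError(UnifiedLLMError):
    """429 from any provider."""

STATUS_TO_ERROR = {401: AuthError, 403: AuthError, 429: RateLimitError}

def raise_unified(status: int, body: str) -> None:
    """Map a provider HTTP status onto the unified hierarchy."""
    exc_type = STATUS_TO_ERROR.get(status)
    if exc_type is not None:
        raise exc_type(body)
    if status >= 400:
        raise UnifiedLLMError(f"HTTP {status}: {body}")

# Calling code handles one exception type, whatever the provider:
try:
    raise_unified(429, "slow down")
except RateLimitError as exc:
    print(f"rate limited: {exc}")  # rate limited: slow down
```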

Fallback chains in Europe

LiteLLM lets you define fallback chains: if model A fails, try model B, then C. With EUrouter, every model in that chain routes through EU infrastructure. So even during failover, your data never leaves Europe. You effectively get two layers of redundancy: LiteLLM's fallback logic and EUrouter's own provider failover.

  • Every model in the fallback chain stays EU-compliant
  • EUrouter adds its own failover layer on top of LiteLLM's
  • Two layers of redundancy for mission-critical workloads
  • No data leaves European jurisdiction, even during provider outages
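LiteLLM exposes fallbacks through its Router configuration (and a fallbacks argument on completion calls). The stdlib-only sketch below illustrates the control flow of a fallback chain; it is not LiteLLM's implementation, and the stub callable stands in for real litellm.completion calls against EUrouter.

```python
from typing import Callable, Sequence

def complete_with_fallbacks(models: Sequence[str],
                            call: Callable[[str], str]) -> str:
    """Try each model in order and return the first successful response."""
    errors: list[tuple[str, Exception]] = []
    for model in models:
        try:
            return call(model)
        except Exception as exc:
            errors.append((model, exc))
    raise RuntimeError(f"all models failed: {errors}")

# Stub standing in for a real litellm.completion call:
def flaky_call(model: str) -> str:
    if model == "openai/gpt-4o":
        raise TimeoutError("upstream timeout")
    return f"response from {model}"

print(complete_with_fallbacks(
    ["openai/gpt-4o", "openai/mistral-large-latest"], flaky_call))
# -> response from openai/mistral-large-latest
```

Because every model name in the chain resolves through the same EUrouter api_base, failover never routes a request outside Europe.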

Proxy mode with EU compliance

LiteLLM's proxy mode gives you a local OpenAI-compatible endpoint. Configure it to forward to EUrouter, and every application in your stack talks to a local URL while the actual inference happens in Europe. Useful for organizations that want to centralize their LLM access.

  • Run LiteLLM as a local proxy that forwards to EUrouter's EU servers
  • Local request logging and caching reduce latency and add observability
  • Centralized model configuration: change models in one place, not in every app
  • All upstream model calls go through EUrouter's European infrastructure
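For reference, a minimal proxy setup might look like the following config.yaml. This is a sketch based on LiteLLM's documented model_list format; adjust model names and the environment variable to your setup.

```yaml
model_list:
  - model_name: gpt-4o                      # name your apps will request
    litellm_params:
      model: openai/gpt-4o                  # EUrouter is OpenAI-compatible
      api_base: https://api.eurouter.ai/v1
      api_key: os.environ/EUROUTER_API_KEY  # read from the environment
```

Start the proxy with `litellm --config config.yaml` (it listens on port 4000 by default), and every application can point its OpenAI client at the local endpoint while inference runs through EUrouter.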

Integrate AI without GDPR risk.

You need AI that won’t create compliance headaches. Your data stays in the EU, GDPR is enforced by default, and every request is routed for the best balance of cost, latency, and uptime, reducing risk while improving performance.

Get Your API Key
GDPR by default
EU data residency
Smart routing
Claim €15 free credits · 300 left
Sign up free →