LangChain with EU data residency.

Set the base URL on ChatOpenAI to EUrouter and your existing chains, agents, and retrievers route through European infrastructure. No code changes beyond that.

GDPR compliant
Drop-in replacement
EU infrastructure
Visit LangChain

What is LangChain?

LangChain is a framework for building applications powered by language models. It gives you building blocks for chains (sequences of LLM calls), agents (LLMs that decide which tools to use), retrieval-augmented generation, and memory. It's one of the most widely used AI frameworks, with both Python and JavaScript versions.

Under the hood, LangChain's OpenAI integration uses the standard OpenAI client. That means you can point it at EUrouter by setting the base_url parameter on ChatOpenAI. Your chains, agents, retrievers, and tools all keep working. The only difference is that every LLM call now routes through European servers.

Quick to integrate

A few lines of code are all it takes: swap your base URL and every request routes through EUrouter.

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="gpt-4o",
    api_key="eur-...",
    base_url="https://api.eurouter.ai/v1",
)

response = llm.invoke("Explain GDPR Article 44")
print(response.content)

Get started in minutes

Follow these steps to connect your application to EUrouter.

1

Get your EUrouter API key

Sign up and create an API key from your dashboard.

Sign up for free
2

Install LangChain

Install the LangChain OpenAI integration package.

pip install langchain-openai
3

Set the base URL

Point the ChatOpenAI client to EUrouter by setting base_url.

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="gpt-4o",
    api_key="eur-...",
    base_url="https://api.eurouter.ai/v1",
)
4

Run your chain

Use LangChain as usual. Requests automatically route through EU infrastructure.

response = llm.invoke("Explain GDPR Article 44")
print(response.content)

Why use LangChain with EUrouter

Drop-in replacement

LangChain's ChatOpenAI class wraps the standard OpenAI client. Change base_url and api_key, and everything else stays the same: your chain definitions, agent configs, tool integrations, and output parsers all keep working unmodified.

  • Chains, agents, retrievers, and output parsers all work without modification
  • Streaming, function calling, and tool use are fully supported
  • Works with ChatOpenAI and any OpenAI-compatible LLM class
  • LangGraph workflows route through EUrouter the same way

Multi-model agents

LangChain makes it easy to assign different models to different parts of your pipeline. With EUrouter, you can use GPT-4o for reasoning, Claude for writing, and Mistral for quick classification, all through a single API key and all routed through Europe.

  • Assign different models per chain step or agent role
  • No separate provider configuration or API key management
  • EUrouter's smart routing can also pick the optimal model automatically
  • Compare model performance across your pipeline from one dashboard

EU-compliant RAG pipelines

If you're building RAG with LangChain, your LLM calls are often the compliance gap. Your documents and vector store might be hosted in Europe, but the inference call goes to the US. EUrouter closes that gap by keeping LLM and embedding requests within EU infrastructure.

  • LLM calls for generation and summarization route through EU servers
  • Embeddings API is also available through EUrouter for document indexing
  • Works with any LangChain retriever (Chroma, Pinecone, pgvector, etc.)
  • Keeps your full RAG pipeline within European jurisdiction

Integrate AI without GDPR risk.

You need AI that won't create compliance headaches. Your data stays in the EU, GDPR compliance is the default, and every request is routed for the best balance of cost, latency, and uptime, reducing risk while improving performance.

Get Your API Key
GDPR by default
EU data residency
Smart routing
Claim €15 free credits · 300 left
Sign up free →