Drop in — OpenAI and MCP
Two integration paths — an OpenAI-compatible endpoint for apps already on the OpenAI SDK, and an MCP server that exposes Moonborn personas as Model Context Protocol resources to IDEs and agent frameworks.
Two ways in
If you already have an OpenAI-based app, swapping to Moonborn is a one-line change. If you build with Claude or agent frameworks that speak Model Context Protocol, expose Moonborn personas as MCP resources.
Both paths are first-class. Pick the one that matches your existing wiring.
Drop in — OpenAI-compatible
Moonborn hosts an OpenAI-compatible chat/completions endpoint. The model name is the persona ID; everything else (streaming, tool use, function calling) passes through.
```ts
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: process.env.MOONBORN_API_KEY,
  baseURL: 'https://api.moonborn.co/v1/openai',
});

const response = await client.chat.completions.create({
  model: 'persona://persona_mert_abc123',
  messages: [{ role: 'user', content: 'What drives you?' }],
});

console.log(response.choices[0].message.content);
```

What changes:

- Base URL — `https://api.moonborn.co/v1/openai`.
- API key env var — `MOONBORN_API_KEY`.
- Model field — `persona://<persona_id>` instead of `gpt-4`.
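The model-field convention is mechanical enough to wrap in a tiny helper. A sketch using only the `persona://<persona_id>` naming above — `toModel` and `fromModel` are illustrative names, not part of any SDK:

```ts
// Build and parse the string Moonborn expects in the `model` field.
// The `persona://` prefix is the only Moonborn-specific convention here.
const PREFIX = 'persona://';

function toModel(personaId: string): string {
  return `${PREFIX}${personaId}`;
}

function fromModel(model: string): string | null {
  return model.startsWith(PREFIX) ? model.slice(PREFIX.length) : null;
}

console.log(toModel('persona_mert_abc123'));          // persona://persona_mert_abc123
console.log(fromModel('persona://persona_mert_abc123')); // persona_mert_abc123
console.log(fromModel('gpt-4'));                      // null — not a persona model
```

`fromModel` returning `null` for plain model names makes it easy to route persona traffic to Moonborn while leaving other calls on your existing OpenAI client.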
What stays:

- Streaming and non-streaming both supported.
- Tool calling and function calling pass through.
- `GET /v1/models` lists your workspace personas as models.
- Rate-limit headers, token counts, and error shapes match the OpenAI conventions.
Moonborn-specific metadata (drift score, layer attribution) rides on x-moonborn-* response headers — safely ignored by OpenAI clients.
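Collecting that metadata client-side is one filter over the response headers. A sketch — the specific header names used below (`x-moonborn-drift-score`, `x-moonborn-layer`) are illustrative examples, not a documented list:

```ts
// Gather every x-moonborn-* header from a response into a plain object.
// Works on the Fetch API Headers object exposed by `response.headers`.
function moonbornMeta(headers: Headers): Record<string, string> {
  const meta: Record<string, string> = {};
  headers.forEach((value, key) => {
    if (key.toLowerCase().startsWith('x-moonborn-')) {
      meta[key.toLowerCase()] = value;
    }
  });
  return meta;
}

// Hypothetical headers, exactly the kind an OpenAI client silently ignores:
const headers = new Headers({
  'content-type': 'application/json',
  'x-moonborn-drift-score': '0.12', // illustrative header name
  'x-moonborn-layer': 'voice',      // illustrative header name
});
console.log(moonbornMeta(headers));
// { 'x-moonborn-drift-score': '0.12', 'x-moonborn-layer': 'voice' }
```

Because the metadata rides on headers rather than in the body, existing OpenAI-typed response handling needs no changes at all.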
What's not compatible: image generation, embeddings, audio. Those aren't Moonborn's domain — keep your OpenAI client side-by-side for them. Read more in OpenAI-compatible.
MCP — personas as resources
Moonborn exposes a Model Context Protocol server at https://api.moonborn.co/v1/mcp. Each persona is an MCP resource (system prompt, voice fingerprint, DNA available to the client). A chat tool initiates a persona-scoped chat session.
Typical client config (Claude for VS Code, Cursor, JetBrains, or an MCP-compatible agent host):
```json
{
  "mcpServers": {
    "moonborn": {
      "transport": "https",
      "url": "https://api.moonborn.co/v1/mcp",
      "headers": { "Authorization": "Bearer ${MOONBORN_API_KEY}" }
    }
  }
}
```

What the MCP server is for:
- IDE integrations — Claude in VS Code, Cursor, or JetBrains pulling persona context for code review, brand-voice writing, or fictional character roleplay.
- Agent framework hosts — LangChain, LlamaIndex, Anthropic Managed Agents that speak MCP and want a persona resource pool.
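The `${MOONBORN_API_KEY}` placeholder in the config above is expanded by the MCP host from its environment before the header is sent. If you are wiring a custom host, the expansion is a one-liner — a sketch of what the host does, not part of any Moonborn API:

```ts
// Expand ${VAR} placeholders in a config string from a supplied env map,
// the way an MCP host resolves the Authorization header before connecting.
function expandEnv(value: string, env: Record<string, string | undefined>): string {
  return value.replace(/\$\{(\w+)\}/g, (_match, name) => env[name] ?? '');
}

console.log(expandEnv('Bearer ${MOONBORN_API_KEY}', { MOONBORN_API_KEY: 'mb_live_xxx' }));
// Bearer mb_live_xxx
```

Unset variables expand to an empty string here; a production host would more likely fail fast and surface a configuration error instead.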
What the MCP server is not: a general LLM router, a full agent host, or a replacement for the chat completions endpoint. It serves personas to MCP clients. That's the scope.
When to use which
| If you have… | Use |
|---|---|
| An existing OpenAI-based app | OpenAI-compatible endpoint |
| Claude / MCP-compatible IDE or agent framework | MCP server |
| Custom HTTP client, no OpenAI footprint | Native REST + SDKs (Quickstart) |
Tier
OpenAI-compatible: Free tier and up. MCP server: Team tier and up.
Next
- Read OpenAI-compatible for the full integration guide.
- Browse client libraries in SDKs.
- Find endpoint details in API reference.