
MCP Configuration

The ContextKit MCP server can be configured for any MCP-compatible client. You can run it locally (stdio or self-hosted HTTP) or connect to the hosted endpoint at api.runcontext.dev/mcp.

| Mode | Transport | Use case |
| --- | --- | --- |
| Local stdio | stdin/stdout | Single-user, local development |
| Local HTTP | Streamable HTTP | Self-hosted, shared team server |
| Hosted | Streamable HTTP | Managed cloud, no server to run |

For the hosted endpoint, replace {org} with your organization slug and set CONTEXTKIT_API_KEY to your API key (available from the ContextKit dashboard).


Claude Code reads MCP configuration from .claude/mcp.json.

.claude/mcp.json
{
  "mcpServers": {
    "contextkit": {
      "command": "npx",
      "args": ["@runcontext/cli", "serve", "--stdio"],
      "cwd": "/path/to/your/project"
    }
  }
}
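To connect to the hosted endpoint instead of running a local process, some clients (including recent versions of Claude Code) accept a remote server entry. A hedged sketch — the type, url, and headers fields and the ${CONTEXTKIT_API_KEY} expansion are assumptions to verify against your client's documentation, and {org} is your organization slug:

```json
{
  "mcpServers": {
    "contextkit": {
      "type": "http",
      "url": "https://api.runcontext.dev/mcp/{org}",
      "headers": {
        "Authorization": "Bearer ${CONTEXTKIT_API_KEY}"
      }
    }
  }
}
```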

Cursor reads MCP configuration from .cursor/mcp.json.

.cursor/mcp.json
{
  "mcpServers": {
    "contextkit": {
      "command": "npx",
      "args": ["@runcontext/cli", "serve", "--stdio"],
      "cwd": "/path/to/your/project"
    }
  }
}

VS Code supports MCP servers through its built-in chat and Copilot agent features. Configuration lives in .vscode/mcp.json, which uses a top-level servers key rather than mcpServers.

.vscode/mcp.json
{
  "servers": {
    "contextkit": {
      "command": "npx",
      "args": ["@runcontext/cli", "serve", "--stdio"],
      "cwd": "${workspaceFolder}"
    }
  }
}

Windsurf reads MCP configuration from ~/.codeium/windsurf/mcp_config.json.

~/.codeium/windsurf/mcp_config.json
{
"mcpServers": {
"contextkit": {
"command": "npx",
"args": ["@runcontext/cli", "serve", "--stdio"],
"cwd": "/path/to/your/project"
}
}
}

OpenAI Codex CLI reads MCP configuration from an mcp.json file or from inline configuration.

mcp.json
{
  "mcpServers": {
    "contextkit": {
      "command": "npx",
      "args": ["@runcontext/cli", "serve", "--stdio"],
      "cwd": "/path/to/your/project"
    }
  }
}

Any MCP-compatible client can connect using either transport.

Run the MCP server as a subprocess:

npx @runcontext/cli serve --stdio

The server communicates over stdin/stdout using the MCP JSON-RPC protocol.
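Over the stdio transport, each JSON-RPC message is a single line of JSON terminated by a newline. A minimal sketch in Python of the initialize request a client writes to the server's stdin (the clientInfo values are illustrative):

```python
import json

# Build the first message an MCP client sends: the initialize request.
# Over stdio, each JSON-RPC message is one newline-terminated line of JSON.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "my-client", "version": "1.0.0"},
    },
}

# This is the exact byte sequence written to the server's stdin.
line = json.dumps(initialize) + "\n"
print(line, end="")
```

After writing this line, the client reads the server's initialize response from stdout and sends a notifications/initialized message before issuing tool calls.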

Start the server in HTTP mode:

npx @runcontext/cli serve --http --port 3000

Then point your client at:

http://localhost:3000/mcp

Send MCP JSON-RPC requests to the hosted endpoint:

POST https://api.runcontext.dev/mcp/{org}
Authorization: Bearer <your-api-key>
Content-Type: application/json

The hosted endpoint uses Streamable HTTP transport. Example initialize request:

{
"jsonrpc": "2.0",
"id": 1,
"method": "initialize",
"params": {
"protocolVersion": "2025-03-26",
"capabilities": {},
"clientInfo": {
"name": "my-client",
"version": "1.0.0"
}
}
}
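Putting the pieces together, the hosted request above can be assembled with Python's standard-library urllib. A sketch with a placeholder organization slug and API key; the request object is constructed but not actually sent:

```python
import json
import urllib.request

org = "acme"            # placeholder organization slug
api_key = "sk-example"  # placeholder; read CONTEXTKIT_API_KEY in practice

# The initialize request body, matching the example above.
body = json.dumps({
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "my-client", "version": "1.0.0"},
    },
}).encode()

req = urllib.request.Request(
    f"https://api.runcontext.dev/mcp/{org}",
    data=body,
    headers={
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req) would send it; omitted here.
print(req.full_url, req.get_method())
```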

Once configured, your AI tool should discover the ContextKit resources and tools automatically. You can verify by asking the agent:

“What MCP resources are available?”

It should list the context:// resources. You can also test a tool call:

“Search my semantic layer for revenue metrics.”

The agent should call context_search and return results from your context directory.

The serve command accepts the following flags:

| Flag | Description | Default |
| --- | --- | --- |
| --http | Use HTTP transport (default is stdio) | Off |
| --port <number> | Port for the HTTP server | 3000 |
| --host <address> | Host address to bind | 0.0.0.0 |
| --context-dir <path> | Path to context files | ./context |
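For example, combining the flags above to run a self-hosted team server on a non-default port with a custom context directory (the port and path are illustrative):

```
npx @runcontext/cli serve --http --port 8080 --host 0.0.0.0 --context-dir ./context
```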