Overview
Generate Model Context Protocol servers from your OpenAPI specification
The Model Context Protocol is an open, JSON-RPC-based standard that defines how applications provide context and actions to large language models (LLMs). It acts as a universal adapter to external data (like Google Drive, Git repos, or databases) and capabilities (like sending emails or running SQL queries).
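Concretely, an MCP client invokes a server tool by sending a JSON-RPC request along these lines (the tool name and arguments shown are illustrative, not part of any specific server):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_docs",
    "arguments": { "query": "how do I list users?" }
  }
}
```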
Stainless automatically generates MCP servers that use a code tool architecture, enabling your API to work seamlessly with Claude Desktop, Cursor, and other MCP clients.
Configuration
If you don’t have a Stainless project yet, create one. The MCP server is a subpackage of your TypeScript SDK, so choose TypeScript as your first language.
Once you have a project, click “Add SDKs” and choose “MCP Server”. This updates your Stainless config:
```yaml
targets:
  typescript:
    package_name: my-company-sdk
    production_repo: my-org/my-sdk-typescript
    edition: typescript.2025-10-10
    options:
      mcp_server:
        package_name: my-company-mcp
```

The MCP server is generated as a subpackage within your TypeScript SDK at packages/mcp-server.
For a complete list of configuration options, see the TypeScript target reference.
Stainless-generated MCP servers include two tools: a code execution tool and a docs search tool. This architecture is more accurate and token-efficient than architectures that expose one tool per API method: a small number of tools takes up less space in an LLM’s context window, and a single code tool call can perform multiple operations of arbitrary complexity.
Code execution tool
Your MCP server exposes a tool that accepts TypeScript code. The code provided by the LLM runs against your Stainless SDK. Stainless runs the code in a sandboxed environment, so you don’t need to configure your own sandbox.
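For instance, the model might submit a snippet like the following. This is a hedged sketch: real submissions run against your generated SDK, so a minimal stub stands in here for a hypothetical `users` resource to keep the example self-contained.

```typescript
// Sketch of code an LLM might send to the code execution tool.
// In practice this would use your generated SDK; the stub client
// below only illustrates the shape of the logic.
type User = { id: string; active: boolean };

const client = {
  users: {
    async list(): Promise<User[]> {
      return [
        { id: "u1", active: true },
        { id: "u2", active: false },
      ];
    },
    async archive(_id: string): Promise<void> {},
  },
};

// Several dependent operations in a single tool call:
async function archiveInactive(): Promise<number> {
  const users = await client.users.list();
  const inactive = users.filter((u) => !u.active);
  for (const u of inactive) await client.users.archive(u.id);
  return inactive.length;
}

archiveInactive().then((n) => console.log(`archived ${n} user(s)`));
```

Because the model writes ordinary code, it can filter, loop, and chain calls in one round trip instead of making one tool call per API operation.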
Docs search tool
The MCP server provides a docs search tool to look up documentation for how to use your Stainless-generated SDKs. LLMs use this tool to answer questions about SDK usage and write code to interact with your API.
The Stainless API serves the documentation in Markdown format optimized for LLM consumption, based on your project’s latest OpenAPI spec and Stainless config. This is a public API endpoint enabled for all projects by default. You can change this setting in the SDK Studio under Release > Setup OpenAPI publishing > Expose experimental SDK docs search API.
Editions
MCP servers use the TypeScript SDK editions. See the TypeScript target for edition information.
Installation
Different desktop clients enable MCP servers in different ways:
MCPB files: Some clients support MCPB files for quick one-click installation.
Deeplinks: IDEs like Cursor, VS Code, and Claude Code allow adding MCP servers via deeplinks. If your MCP server is published to npm, you’ll find installation buttons in your README.md file (located in packages/mcp-server).
mcpServers.json: For clients that use this format, add the following to your configuration:
```json
{
  "mcpServers": {
    "my_org_api": {
      "command": "npx",
      "args": ["-y", "my-org-mcp"],
      "env": {
        "MY_API_KEY": "your-api-key"
      }
    }
  }
}
```

Publishing
Publish to GitHub
For every release, Stainless creates an MCPB file that allows users to install your MCP server locally with one click. Provide a link to this file or include it with your installer.
Publish to npm
To publish your MCP server to npm, set up a production repo and publish your SDK by creating a release.
The MCP server is published at the same time as your TypeScript SDK and with the same version, but in a separate npm package. By default, the package name is <your-npm-package>-mcp, but you can customize this in the target options.
Once published, run it from the command line using npx:
```sh
# Pull and run the latest version
npx -y my-org-name-mcp@latest

# Run a specific version
npx -y my-org-name-mcp@1.2.3

# With environment variables
export MY_API_KEY=your-api-key
npx -y my-org-name-mcp@latest
```

Publish Docker images
For easier distribution and deployment, you can publish Docker images for your MCP server. See Docker publishing for detailed setup instructions.
Remote deployment
The examples above run the MCP server locally on the user’s computer. For web apps like Claude.ai or agentic workflows like LangChain, you can deploy a remote MCP server instead. See Remote deployment for details on:
- Stainless-hosted servers
- Self-hosting options
- Authentication (header-based, OAuth)
- Cloudflare Workers for APIs without OAuth support
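As an aside, clients that only speak stdio can often still reach a remote server by proxying through the community `mcp-remote` npm package. A sketch of that pattern, with a placeholder server URL:

```json
{
  "mcpServers": {
    "my_org_api": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://mcp.example.com/mcp"]
    }
  }
}
```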
Next steps
For more details, see our blog post on generating MCP servers from OpenAPI specs.