Convert OpenAPI Specs to MCP Servers

Converting OpenAPI specs to MCP servers allows AI models to interact with existing APIs through a standardized protocol. This article explains the conversion process, common challenges, and practical solutions based on our experience.


Model Context Protocol from OpenAPI Specs

Model Context Protocol (MCP) is a specification that defines how AI models interact with external tools. It standardizes how tools are described and executed in a machine-readable format. At runtime, an AI agent uses MCP to discover available tools and how to call them.

OpenAPI and MCP both use JSON Schema to describe data structures, but they serve different purposes:

  • OpenAPI: Describes REST APIs for documentation, SDKs, and testing

  • MCP: Makes endpoints callable by AI models in a tool-based format

Converting an OpenAPI spec to MCP means transforming REST endpoints into "tools" that AI models can discover and use. Each tool maps to an OpenAPI operation, with inputs and outputs described in a self-contained schema.

Here's a simple example of how an OpenAPI endpoint transforms into an MCP tool:

# OpenAPI Path
/weather/{city}:
  get:
    summary: Get weather for a city
    parameters:
      - name: city
        in: path
        required: true
        schema:
          type: string

# Becomes MCP Tool
name: getWeatherForCity
description: Get weather for a city
args:
  - name: city
    type: string
    required: true
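
In MCP's actual tool format, those arguments are combined into a single JSON Schema called inputSchema. Here is a minimal sketch of what the converted tool definition might look like, shown as a TypeScript object literal (the getWeatherForCity name carries over from the example above):

// Sketch of the converted tool as an MCP tool definition.
// MCP tools describe their arguments with a single JSON Schema "inputSchema".
const getWeatherForCityTool = {
  name: "getWeatherForCity",
  description: "Get weather for a city",
  inputSchema: {
    type: "object",
    properties: {
      city: { type: "string", description: "City to fetch weather for" },
    },
    required: ["city"],
  },
};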

OpenAPI endpoints don't perfectly map to tools

While both MCP tools and OpenAPI specs use JSON Schema, you can't just copy them over as-is. You have to combine the request body, path parameters, query parameters, and header parameters all into one schema, and handle any naming collisions automatically.

For example, an OpenAPI endpoint might have:

  • A path parameter called id

  • A query parameter also called id

  • A body parameter with an id field

In MCP, these become a single input schema, requiring parameter renaming to avoid conflicts. Most conversion tools handle this by adding prefixes or suffixes to create unique parameter names.
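
As an illustration, a converter might merge every parameter location into one flat schema and prefix colliding names. This is only a sketch with hypothetical helper types (body fields are treated as flat parameters for brevity), not any particular tool's implementation:

type ParamLocation = "path" | "query" | "header" | "body";

interface SourceParam {
  name: string;
  location: ParamLocation;
  schema: Record<string, unknown>;
  required?: boolean;
}

// Merge parameters from all locations into one input schema,
// prefixing names that collide (e.g. "id" -> "path_id" and "query_id").
function buildInputSchema(params: SourceParam[]) {
  const properties: Record<string, unknown> = {};
  const required: string[] = [];

  for (const p of params) {
    const collision = params.some(
      (other) => other !== p && other.name === p.name
    );
    const key = collision ? `${p.location}_${p.name}` : p.name;
    properties[key] = p.schema;
    if (p.required) required.push(key);
  }

  return { type: "object", properties, required };
}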

Another challenge is that OpenAPI operations often include multiple response types based on status codes. MCP tools typically map to the success response (200-level status codes), leaving error handling to the server implementation.

Handle $refs and recursive references

OpenAPI schemas use $ref to point to reusable schema components defined elsewhere in the document. MCP tool schemas, however, must be completely self-contained: they cannot reference anything outside themselves.

This requires "dereferencing" - a process of resolving all references and inlining them into a single schema. The challenge comes with recursive references, where Schema A references Schema B, which references Schema A again.

For example, a comment schema that allows nested replies:

Comment:
  type: object
  properties:
    text:
      type: string
    replies:
      type: array
      items:
        $ref: '#/components/schemas/Comment'  # Recursive!

Solutions include:

  • Depth limiting: Stop resolving after a certain number of levels

  • Reference replacement: Replace recursive references with simplified versions

  • Schema flattening: Create non-recursive equivalents that preserve the essential structure

Most OpenAPI-to-MCP conversion tools implement one of these strategies automatically; a sketch of the depth-limiting approach follows.
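
This is a minimal sketch of depth limiting, assuming every $ref is a local '#/components/schemas/...' pointer: references are inlined until a depth budget runs out, after which they are replaced with a permissive placeholder so the result stays self-contained.

// Inline local $refs up to maxDepth; beyond that, fall back to a
// permissive placeholder so the schema stays self-contained.
function dereference(
  schema: any,
  components: Record<string, any>,
  maxDepth = 3
): any {
  if (maxDepth < 0) {
    return { type: "object", description: "Truncated recursive schema" };
  }
  if (typeof schema !== "object" || schema === null) return schema;
  if (Array.isArray(schema)) {
    return schema.map((item) => dereference(item, components, maxDepth));
  }

  if (typeof schema.$ref === "string") {
    const name = schema.$ref.replace("#/components/schemas/", "");
    return dereference(components[name], components, maxDepth - 1);
  }

  const out: Record<string, any> = {};
  for (const [key, value] of Object.entries(schema)) {
    out[key] = dereference(value, components, maxDepth);
  }
  return out;
}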

Many APIs have too many endpoints to be imported at once

MCP currently works by having the client load all of an MCP server's tool schemas into the model's context at once. The LLM then chooses which tool to call based on your request.

This approach works well for APIs with a reasonable number of endpoints. However, many enterprise APIs have hundreds of endpoints, which creates several problems:

  • Context limits: LLMs have fixed context windows that can't hold all tool definitions

  • Selection complexity: Too many similar tools make it harder for the AI to choose correctly

  • Performance impact: Loading large schemas slows down response times

For example, an e-commerce API might have 50+ endpoints just for product management, and hundreds more for orders, customers, and inventory.

Handling large APIs dynamically

Selecting each tool you want to expose can be fine if you have a targeted task, but you might not know exactly which tools you want out of hundreds in a large API.

When converting large OpenAPI specs to MCP, consider these approaches:

  • Functional grouping: Convert only logically related endpoints (e.g., all user management endpoints)

  • On-demand loading: Implement a system that loads tool definitions as needed rather than all at once

  • Tool aggregation: Combine multiple fine-grained endpoints into fewer, more powerful tools

For example, instead of separate tools for getUser, updateUser, and deleteUser, you might create a single manageUser tool that handles all user operations based on an action parameter.
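
A hypothetical manageUser tool of that kind might expose an action enum plus the union of fields the underlying endpoints need. The names here are illustrative only:

// One aggregated tool replacing getUser / updateUser / deleteUser.
// The handler would dispatch to the right REST call based on "action".
const manageUserTool = {
  name: "manageUser",
  description: "Get, update, or delete a user by ID",
  inputSchema: {
    type: "object",
    properties: {
      action: { type: "string", enum: ["get", "update", "delete"] },
      userId: { type: "string" },
      // Only used when action is "update".
      updates: {
        type: "object",
        properties: {
          name: { type: "string" },
          email: { type: "string" },
        },
      },
    },
    required: ["action", "userId"],
  },
};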

Strategies like these help manage complexity while still making a large API accessible to AI agents.

MCP clients have different schema limitations

One of the harder issues in practice is dealing with different clients. Claude Desktop, unsurprisingly, handles the MCP protocol well, but JSON Schemas are interpreted differently by the various AI models and MCP clients in use today, and each has its own limitations.

Some common compatibility issues include:

  • Schema size limits: Some clients reject schemas above certain sizes

  • Feature support: Not all clients support advanced JSON Schema features like oneOf or allOf

  • Validation strictness: Some clients strictly validate schemas while others are more forgiving

  • Description handling: Clients vary in how they use schema descriptions to guide tool selection

For maximum compatibility, it's best to:

  • Keep schemas as simple as possible

  • Avoid advanced JSON Schema features such as oneOf or allOf where possible (see the sketch after this list)

  • Test with multiple MCP clients before deployment
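
To illustrate the second point, a converter can downgrade unsupported combinators before publishing a tool. The sketch below is a deliberately lossy simplification, not what every tool does: it keeps the first variant of oneOf/anyOf and shallow-merges allOf members.

// Lossy simplification for clients that reject oneOf/anyOf/allOf:
// keep the first variant of oneOf/anyOf, shallow-merge allOf members.
function simplifyForClient(schema: any): any {
  if (typeof schema !== "object" || schema === null) return schema;
  if (Array.isArray(schema)) return schema.map(simplifyForClient);

  if (Array.isArray(schema.oneOf)) return simplifyForClient(schema.oneOf[0]);
  if (Array.isArray(schema.anyOf)) return simplifyForClient(schema.anyOf[0]);
  if (Array.isArray(schema.allOf)) {
    return simplifyForClient(Object.assign({}, ...schema.allOf));
  }

  const out: Record<string, any> = {};
  for (const [key, value] of Object.entries(schema)) {
    out[key] = simplifyForClient(value);
  }
  return out;
}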

Step-by-step conversion process

Converting an OpenAPI spec to an MCP server involves several discrete steps:

1. Prepare your OpenAPI specification

Start with a valid, well-documented OpenAPI specification. Ensure it includes clear descriptions for operations and parameters, as these help AI models understand when and how to use each tool.

For best results:

  • Add detailed descriptions to operations and parameters

  • Use consistent naming patterns

  • Resolve any validation errors in the OpenAPI spec

2. Choose a conversion tool

Several tools can convert OpenAPI specs to MCP servers, including the open-source openapi-mcp-generator CLI used in the example below and hosted platforms such as Stainless, which generates MCP servers alongside SDKs.

Each tool has different features and output formats, so choose one that matches your requirements and technology stack.

3. Configure the conversion

Most conversion tools need configuration for:

  • Base URL for the API

  • Authentication method

  • How to handle references

  • Schema transformation rules

For example, with OpenAPI MCP Generator:

openapi-mcp-generator --input api.yaml --output mcp-server --base-url
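
Whichever tool you use, that configuration typically ends up baked into a small request template per tool: the server fills the path template from the tool arguments, attaches auth, and forwards the call. A rough sketch of the idea, with hypothetical config fields and the standard fetch API:

interface ServerConfig {
  baseUrl: string;        // e.g. "https://api.example.com" (hypothetical)
  apiKeyHeader?: string;  // header name used for API-key auth
  apiKey?: string;
}

interface ToolCall {
  method: string;         // HTTP method from the OpenAPI operation
  pathTemplate: string;   // e.g. "/weather/{city}"
  args: Record<string, string>;
}

// Fill the path template from the tool arguments and forward the request.
async function forwardToolCall(call: ToolCall, config: ServerConfig) {
  const path = call.pathTemplate.replace(
    /\{(\w+)\}/g,
    (_, name) => encodeURIComponent(call.args[name] ?? "")
  );
  const headers: Record<string, string> = {};
  if (config.apiKeyHeader && config.apiKey) {
    headers[config.apiKeyHeader] = config.apiKey;
  }
  const response = await fetch(`${config.baseUrl}${path}`, {
    method: call.method,
    headers,
  });
  return response.json();
}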

4. Generate and test the MCP server

After generating the MCP server, test it with both direct requests and through an AI agent. Verify that:

  • All expected tools are available

  • Tool schemas correctly represent the API operations

  • Authentication works properly

  • Responses match the original API

Testing with an actual AI agent helps identify issues that might not be apparent from inspecting the schemas directly.
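
One low-tech check is to compare the tools the server advertises against the operations in the spec. MCP clients discover tools through a JSON-RPC tools/list request; the sketch below shows that request shape and a simple comparison (variable names are hypothetical):

// Shape of the JSON-RPC request an MCP client sends to discover tools.
const listToolsRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/list",
};

// Given the tools the server reported and the operationIds from the
// OpenAPI spec, flag anything that didn't make it through conversion.
function findMissingTools(
  reportedTools: { name: string }[],
  expectedOperationIds: string[]
): string[] {
  const names = new Set(reportedTools.map((t) => t.name));
  return expectedOperationIds.filter((id) => !names.has(id));
}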

Securing your MCP server

Security considerations when converting OpenAPI to MCP include:

  • Authentication mapping: Ensure OpenAPI security schemes map correctly to MCP

  • Authorization scopes: Preserve any OAuth scopes or permission requirements

  • Rate limiting: Implement limits to prevent abuse through AI agents

  • Sensitive data: Be careful about exposing sensitive operations to AI models

Most conversion tools automatically map OpenAPI security schemes to MCP security requirements. For example, an API key in OpenAPI becomes a header parameter in the MCP tool's request template.
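
As a concrete illustration of that mapping, here is a minimal sketch that turns an OpenAPI apiKey security scheme into the header attached to outgoing requests. The secret itself comes from the environment so it is never exposed to the model through the tool schema; the API_KEY variable name is an assumption.

interface ApiKeyScheme {
  type: "apiKey";
  in: "header" | "query";
  name: string; // e.g. "X-API-Key"
}

// Build auth headers for outbound requests from an OpenAPI apiKey scheme.
// The key is read from the environment, not from tool arguments.
function authHeadersFor(scheme: ApiKeyScheme): Record<string, string> {
  if (scheme.in !== "header") return {};
  const key = process.env.API_KEY ?? "";
  return key ? { [scheme.name]: key } : {};
}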

Maintaining synchronization

APIs evolve over time, so keeping your MCP server in sync with the OpenAPI spec is important. Options include:

  • CI/CD integration: Regenerate the MCP server whenever the OpenAPI spec changes

  • Version tracking: Maintain separate MCP servers for different API versions

  • Change detection: Implement logic to detect and handle breaking changes
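
A crude version of change detection just watches for any change to the spec file and triggers regeneration; real breaking-change detection needs a semantic diff of the two specs. A sketch with placeholder file paths:

import { createHash } from "node:crypto";
import { existsSync, readFileSync, writeFileSync } from "node:fs";

// Regenerate the MCP server whenever the spec's content hash changes.
// Paths and the regeneration step are placeholders for your own pipeline.
function specChanged(specPath: string, hashPath: string): boolean {
  const currentHash = createHash("sha256")
    .update(readFileSync(specPath))
    .digest("hex");
  const previousHash = existsSync(hashPath)
    ? readFileSync(hashPath, "utf8")
    : "";
  if (currentHash === previousHash) return false;
  writeFileSync(hashPath, currentHash);
  return true;
}

if (specChanged("api.yaml", ".spec-hash")) {
  console.log("Spec changed: regenerate the MCP server here.");
}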

At Stainless, we've integrated MCP server generation into our SDK pipeline, ensuring that when an API changes, both the SDKs and MCP server update automatically.

Next steps for converting OpenAPI specs to MCP servers

Converting OpenAPI specs to MCP servers bridges the gap between traditional APIs and AI agents. While the process involves some challenges—particularly around schema compatibility, reference resolution, and handling large APIs—the tools and techniques described here make it increasingly straightforward.

As MCP adoption grows, we expect to see more standardization in how tools handle these conversions, making it even easier to expose existing APIs to AI agents.

Ready to streamline your API-to-MCP conversion process? Stainless offers automated generation and management of MCP servers from OpenAPI specs, alongside our SDK generation platform.

FAQs about converting OpenAPI specs to MCP servers

How do I handle authentication when converting from OpenAPI to MCP?

Authentication methods from OpenAPI (like API keys or OAuth) are mapped to equivalent mechanisms in MCP, typically through request templates that include the necessary headers or parameters for each tool call.

Can I convert only specific endpoints from my OpenAPI specification to MCP?

Yes, most conversion tools allow selecting specific paths or operations from your OpenAPI specification, which is useful for large APIs or when you only want to expose certain functionality to AI agents.

How do I test my MCP server after converting from OpenAPI?

Test your MCP server by connecting it to an AI agent that supports MCP (like Claude or GPT), then asking the agent to perform tasks that require using the tools. You can also test directly by sending requests to the MCP server's endpoints.

What happens if my OpenAPI specification changes after I've created an MCP server?

When your OpenAPI specification changes, you'll need to regenerate your MCP server to reflect those changes. Many teams automate this process through CI/CD pipelines that detect changes and trigger regeneration.
