Sep 26, 2025

CJ Quines
MCP servers generated by Stainless now include a docs search tool that lets LLMs query up-to-date documentation for your Stainless-generated SDKs. The documentation is served as Markdown optimized for model consumption, generated from your latest OpenAPI spec and Stainless Config.
See the docs for more details.
Jul 14, 2025

David Ackerman
Previously, large API responses could be a problem for MCP servers: they would eat up the LLM's context window with irrelevant data when a small subset might be sufficient to answer the user's query. Manually excluding some response data in the MCP server can alleviate that problem, but it also makes the excluded data inaccessible to the LLM.
To solve this, our MCP servers now provide an additional `jq_filter` parameter on each tool call, which lets the LLM use jq syntax to transform the API response based on the response schema.
This means the LLM can dynamically select only the information it needs to answer the user's query, without restricting the data that is accessible via the MCP server.
This functionality is automatically available in all MCP servers now.
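For illustration, a tool call using this parameter might look like the following (the tool name `list_products`, its `limit` argument, and the response shape are hypothetical; `jq_filter` is the only field the feature itself defines):

```json
{
  "name": "list_products",
  "arguments": {
    "limit": 50,
    "jq_filter": ".data[] | {id, name}"
  }
}
```

Here the filter keeps only the `id` and `name` of each item in the hypothetical `data` array, so the rest of the response never reaches the model's context window.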
Jun 13, 2025

David Ackerman
You can now deploy your MCP servers as Cloudflare Workers, making them accessible to web-based AI interfaces like claude.ai.
When you enable `generate_cloudflare_worker: true` in your MCP server configuration, Stainless generates a complete Cloudflare Worker template alongside your MCP server. This worker implements OAuth authentication and serves your API tools remotely, eliminating the need for local installations or API key management in client configs.
The generated worker includes:
- OAuth consent screen with customizable form fields (text, password, select dropdowns)
- Support for both the SSE and Streamable HTTP protocols
- One-click deployment via GitHub integration
This opens up MCP servers to a broader audience beyond desktop developers, enabling seamless integration with web-based AI tools while maintaining security through proper OAuth flows.
Take a look at our documentation to learn more about deployment options and customization.
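As a sketch of where the option lives, a Stainless config enabling it might look like this (the exact nesting is an assumption based on other MCP server options; check the documentation for the authoritative schema):

```yaml
targets:
  typescript:
    options:
      mcp_server:
        generate_cloudflare_worker: true
```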
Jun 6, 2025

Sam El-Borai
We now support building and publishing Docker images for your MCP server.
```yaml
targets:
  typescript:
    options:
      mcp_server:
        publish:
          docker: "my-dockerhub-org/my-image-mcp"
```
See the MCP docs for options and details regarding the publishing process.
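Once the image is published, a client can run the server via Docker instead of a local install. As a sketch, a Claude Desktop-style MCP client config might reference the image like this (the server name `my-api` is a placeholder, and your API will likely need credentials passed as environment variables):

```json
{
  "mcpServers": {
    "my-api": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "my-dockerhub-org/my-image-mcp"]
    }
  }
}
```

The `-i` flag keeps stdin open, which the stdio MCP transport requires.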