Token-efficient MCP servers
This talk covers building token-efficient MCP servers to allow LLMs to access large web APIs without exceeding context limits, featuring technical details and live demos.
I’m building an open standard for converting any web API into an MCP server: https://www.open-mcp.org/
Some web APIs have hundreds of endpoints with large request bodies, adding up to millions of tokens. I’m exploring ways to deal with this so we can give LLMs access to large APIs without eating up all the context.
OpenMCP standardizes token-efficient conversion of web APIs into MCP servers, so LLMs can access large APIs without exhausting their context.
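One common way to keep a huge API within context limits is to avoid registering every endpoint as its own tool and instead expose a few generic discovery tools, letting the model search for, inspect, and call endpoints lazily. The sketch below illustrates that pattern; the tool names, the toy spec, and the stubbed call are all hypothetical and are not claimed to match OpenMCP's actual design.

```python
from typing import Any

# Hypothetical slice of a large OpenAPI spec. A real spec may have
# hundreds of paths whose schemas alone total millions of tokens.
TOY_SPEC: dict[str, dict[str, Any]] = {
    "GET /users/{id}": {
        "summary": "Fetch a single user by id",
        "schema": {"id": "string"},
    },
    "POST /users": {
        "summary": "Create a user",
        "schema": {"name": "string", "email": "string"},
    },
    "GET /orders": {
        "summary": "List orders, filterable by status",
        "schema": {"status": "string"},
    },
}

def search_endpoints(query: str) -> list[str]:
    """Tool 1: return only endpoint names matching the query.

    Cheap for the model: it reads a handful of one-line summaries
    instead of the whole spec.
    """
    q = query.lower()
    return [
        name for name, meta in TOY_SPEC.items()
        if q in name.lower() or q in meta["summary"].lower()
    ]

def describe_endpoint(name: str) -> dict[str, Any]:
    """Tool 2: expand the full schema for one endpoint, on demand."""
    return TOY_SPEC[name]

def call_endpoint(name: str, args: dict[str, Any]) -> dict[str, Any]:
    """Tool 3: invoke the endpoint (stubbed here; a real server
    would validate against the schema and make the HTTP request)."""
    unknown = set(args) - set(TOY_SPEC[name]["schema"])
    if unknown:
        raise ValueError(f"unexpected arguments: {unknown}")
    return {"endpoint": name, "args": args, "status": "ok"}

# The model pays tokens only for what it actually uses:
print(search_endpoints("user"))          # short list, not the full spec
print(describe_endpoint("POST /users"))  # one schema, fetched on demand
```

With this shape, context cost scales with the number of endpoints the model actually touches in a session rather than with the size of the API surface.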
Related projects
Simplify (cool) AI Integrations with MCP
London
Learn how to build and host a Model Context Protocol hub on Cloudflare Durable Objects, enabling cheap, customizable…
How to make an MCP server
Miami
Learn what MCP is, how to deploy your own MCP server, and see a live demo of a…
Implementing an MCP Server for AWS
San Francisco
Demonstrating an MCP server that consolidates thousands of AWS APIs into unified tool calls, detailing the mapping structure…
How to Build Your Own MCP Server
Boston
Learn step‑by‑step how to build a Model Context Protocol server, test it within Cursor, and configure Cursor rules…
Optimizing MCP Servers is Weird
Amsterdam
Learn to test and optimize MCP server tools programmatically using eval sets and golden prompts, enabling recursive improvement…
Making MCP Auth Easy
San Francisco
Learn how to configure WorkOS AuthKit for MCP, covering dynamic client registration, consent flow, and enterprise integrations like…