Have you ever come across a situation where you wanted your chatbot to use a tool and then answer—as if it could think, act, and then speak like a human assistant? Sounds complicated, right?
Well, not anymore.
With MCP (Model Context Protocol), you can seamlessly connect your Large Language Model (LLM) to external tools, APIs, and services. This gives your chatbot or intelligent agent the ability to act autonomously—whether it's querying a database, calling a weather API, or triggering a workflow—before generating a response. And the best part? You can transform any FastAPI app into a fully operational MCP server with minimal effort using FastAPI-MCP.
In this tutorial, we'll guide you through how to take a basic FastAPI web app and supercharge it into an MCP-compatible tool server that your LLM can interact with. Whether you're building AI agents, tool-using bots, or interactive APIs, FastAPI-MCP makes integration seamless.
🚀 What is FastAPI-MCP?
FastAPI-MCP is a lightweight Python library that turns any FastAPI app into a server that speaks the Model Context Protocol—an emerging standard that allows language models to interact with external tools and environments in a safe, structured, and context-aware manner.
MCP was introduced by Anthropic (the company behind Claude) to let language models "think before speaking" by interacting with registered tools; editors like Cursor have since adopted it. Think of it as a standardized layer between your LLM and the real world.
With `fastapi-mcp`, you can:

- Wrap your existing FastAPI endpoints into MCP-compatible tools
- Enable introspection so LLMs can understand how to call your endpoints
- Provide authentication, descriptions, and schemas that make it easy for LLMs to use
🌐 What APIs Can Be Converted into MCP Using FastAPI-MCP?
In principle, any API built with FastAPI can be converted into an MCP tool. This includes:
- REST endpoints (GET/POST/PUT/etc.)
- Microservices (e.g., user auth, database lookup, weather API)
- External service wrappers (e.g., Stripe payments, GitHub integrations)
- Internal dev tools (e.g., code analysis, CI/CD triggers)
- Utility functions (e.g., calculator, document parser)
If your endpoint returns JSON and is structured with FastAPI’s type hints, it can likely be MCP-wrapped with minimal changes.
🛠️ Hands-on: Using FastAPI-MCP
Let’s walk through how to turn a simple FastAPI app into an MCP server.
Step 1: Install the Library
You’ll also need FastAPI and a server like Uvicorn if you haven’t installed them yet:
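A typical install looks like this (`fastapi-mcp` is the package name on PyPI):

```shell
pip install fastapi-mcp

# FastAPI and Uvicorn, if you don't have them yet
pip install fastapi "uvicorn[standard]"
```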
Step 2: Create a Simple FastAPI App
Here's a basic app with a `/weather` endpoint:
Step 3: Add MCP Integration
Wrap your app with the `FastApiMCP` class from `fastapi-mcp`:
Now, your app is MCP-enabled! The LLM will be able to discover, understand, and invoke the `/weather` endpoint as a tool.
🧑‍💻 Configuring FastAPI
For best results, make sure your FastAPI endpoints are:
- Typed using Pydantic models
- Properly documented with `description` and `summary` (used by LLMs for context)
- Named clearly (endpoint names should match their function)
Also, use POST requests for most tool interfaces, as they can carry structured inputs.
🧠 Configuring Cursor to Use MCP
If you're using Cursor, the AI-powered coding editor by Anysphere, you can point it to your local MCP server:
- Launch your app with Uvicorn.
- In Cursor, add your MCP tool via the agent configuration settings. Enter your tool's URL (by default, `http://localhost:8000/mcp`).
- Your LLM agent can now see and use the tool you've built—just like a plugin!
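Concretely, assuming your app lives in `main.py` and fastapi-mcp's default mount path:

```shell
# Start the FastAPI app (module name "main" is illustrative)
uvicorn main:app --reload --port 8000

# The MCP endpoint is then served at:
#   http://localhost:8000/mcp
```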
✅ Testing the MCP Server
To verify your tool is discoverable by an LLM, open the MCP endpoint (by default, `http://localhost:8000/mcp`).
This endpoint is generated automatically by FastAPI-MCP from your app's OpenAPI schema, and it is what MCP clients read to understand your server's capabilities.
You can also test your endpoint directly using `curl`, Postman, or a simple Python client:
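For example, against the `/weather` endpoint from earlier (the URL assumes Uvicorn's default port and a running server):

```shell
curl "http://localhost:8000/weather?city=Paris"
```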
🏁 Wrapping Up
With `fastapi-mcp`, you're just a few lines of code away from giving your FastAPI app superpowers. Whether you're building developer tools, AI agents, or chatbot extensions, MCP opens up a new frontier in tool-augmented intelligence.
By enabling structured, callable tools, you're letting your LLM act before it speaks—and that’s the future of AI interaction.
Ready to build your first LLM tool? Try converting one of your existing APIs with FastAPI-MCP and see the magic unfold.