How to Convert Any FastAPI App into an MCP Server Using FastAPI-MCP

Have you ever wanted your chatbot to use a tool before it answers, so it can think, act, and then speak like a human assistant? Sounds complicated, right?

Well, not anymore.

With MCP (Model Context Protocol), you can seamlessly connect your Large Language Model (LLM) to external tools, APIs, and services. This gives your chatbot or intelligent agent the ability to act autonomously—whether it's querying a database, calling a weather API, or triggering a workflow—before generating a response. And the best part? You can transform any FastAPI app into a fully operational MCP server with minimal effort using FastAPI-MCP.

In this tutorial, we'll guide you through how to take a basic FastAPI web app and supercharge it into an MCP-compatible tool server that your LLM can interact with. Whether you're building AI agents, tool-using bots, or interactive APIs, FastAPI-MCP makes integration seamless.


🚀 What is FastAPI-MCP?

FastAPI-MCP is a lightweight Python library that turns any FastAPI app into a server that speaks the Model Context Protocol—an emerging standard that allows language models to interact with external tools and environments in a safe, structured, and context-aware manner.

MCP was introduced by Anthropic as an open standard that lets language models "think before speaking" by interacting with registered tools. Think of it as a standardized layer between your LLM and the real world.

With fastapi-mcp, you can:

  • Wrap your existing FastAPI endpoints into MCP-compatible tools

  • Enable introspection so LLMs can understand how to call your endpoints

  • Provide authentication, descriptions, and schemas that make your endpoints easy for LLMs to call


🌐 What APIs Can Be Converted into MCP Using FastAPI-MCP?

In principle, any API built with FastAPI can be converted into an MCP tool. This includes:

  • REST endpoints (GET/POST/PUT/etc.)

  • Microservices (e.g., user auth, database lookup, weather API)

  • External service wrappers (e.g., Stripe payments, GitHub integrations)

  • Internal dev tools (e.g., code analysis, CI/CD triggers)

  • Utility functions (e.g., calculator, document parser)

If your endpoint returns JSON and is structured with FastAPI’s type hints, it can likely be MCP-wrapped with minimal changes.
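
For instance, here is a minimal sketch of the kind of endpoint that qualifies: a hypothetical user-lookup service with a typed path parameter and a Pydantic response model. The route, model, and data below are illustrative only.

python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class User(BaseModel):
    id: int
    name: str
    email: str

# A typed path parameter plus a response model give FastAPI-MCP
# everything it needs to expose this route as a fully described tool.
@app.get("/users/{user_id}", response_model=User, operation_id="get_user")
def get_user(user_id: int):
    # Dummy data; a real service would query a database here.
    return User(id=user_id, name="Ada Lovelace", email="ada@example.com")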


🛠️ Hands-on: Using FastAPI-MCP

Let’s walk through how to turn a simple FastAPI app into an MCP server.

Step 1: Install the Library

bash
pip install fastapi-mcp

You’ll also need FastAPI and a server like Uvicorn if you haven’t installed them yet:

bash
pip install fastapi uvicorn

Step 2: Create a Simple FastAPI App

Here's a basic app with a /weather endpoint:

python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class WeatherRequest(BaseModel):
    city: str

@app.post("/weather")
def get_weather(req: WeatherRequest):
    # Dummy implementation
    return {"city": req.city, "temperature": "25°C", "status": "Sunny"}

Step 3: Add MCP Integration

Wrap your app with FastAPI-MCP by creating a FastApiMCP instance and mounting it on your app:

python
from fastapi_mcp import FastApiMCP

# Create an MCP server from the existing FastAPI app and mount it.
# By default it is served from the /mcp path on the same app.
mcp = FastApiMCP(
    app,
    name="Weather API",
    description="A simple weather API that tells you the current weather for a given city.",
)
mcp.mount()

Now, your app is MCP-enabled! The MCP server is served from /mcp on the same app, and an LLM will be able to discover, understand, and invoke the /weather endpoint as a tool.


🧑‍💻 Configuring FastAPI

For best results, make sure your FastAPI endpoints are:

  • Typed using Pydantic models

  • Properly documented with description and summary (used by LLMs for context)

  • Named clearly (endpoint names should match their function)

Also, use POST requests for most tool interfaces, as they can carry structured inputs.
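
Putting these guidelines together, a tool-friendly endpoint might look like the sketch below. The currency-conversion route, its operation_id, and the fixed rate are all illustrative; FastAPI-MCP derives tool names from each route's operation_id, so explicit, readable IDs are worth the extra keystrokes.

python
from fastapi import FastAPI
from pydantic import BaseModel, Field

app = FastAPI()

class CurrencyRequest(BaseModel):
    # Field descriptions flow into the JSON schema the LLM reads.
    amount: float = Field(description="Amount in the source currency")
    source: str = Field(description="ISO 4217 code, e.g. 'USD'")
    target: str = Field(description="ISO 4217 code, e.g. 'EUR'")

@app.post(
    "/convert",
    operation_id="convert_currency",
    summary="Convert an amount between two currencies",
    description="Converts an amount using a fixed demo exchange rate.",
)
def convert_currency(req: CurrencyRequest):
    # Dummy conversion; a real tool would fetch live rates.
    return {"amount": round(req.amount * 0.9, 2), "currency": req.target}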


🧠 Configuring Cursor to Use MCP

If you're using Cursor, the AI-powered coding editor by Anysphere, you can point it to your local MCP server:

  1. Launch your app:

    bash
    uvicorn app:app --reload
  2. In Cursor, add your MCP server in the MCP settings (or in an mcp.json config file, as shown after this list). Enter your server’s URL (e.g., http://localhost:8000/mcp).

  3. Your LLM agent can now see and use the tool you've built—just like a plugin!
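
If you prefer configuring this by hand, Cursor reads MCP servers from an mcp.json file (for example, a project-level .cursor/mcp.json). A minimal entry for the weather server above might look like the following; the "weather-api" key is just an arbitrary label:

json
{
  "mcpServers": {
    "weather-api": {
      "url": "http://localhost:8000/mcp"
    }
  }
}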


✅ Testing the MCP Server

To verify your MCP server is up, check the path where FastAPI-MCP mounted it (/mcp by default):

bash
http://localhost:8000/mcp

This endpoint is generated automatically by FastAPI-MCP. It speaks the MCP protocol, and it is what an MCP client connects to in order to read your server’s tool names, descriptions, and input schemas.

You can also test your endpoint directly using curl, Postman, or a simple Python client:

python
import requests

# Call the underlying REST endpoint directly, outside of MCP.
res = requests.post("http://localhost:8000/weather", json={"city": "Berlin"})
print(res.json())
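
To exercise the MCP side itself, you can connect with the official mcp Python SDK (pip install mcp) and list the tools the server advertises. This is a sketch assuming the default /mcp mount path and an SSE transport; depending on your fastapi-mcp version, the endpoint may instead use streamable HTTP, so adjust accordingly.

python
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

async def main():
    # Connect to the endpoint FastAPI-MCP mounts at /mcp by default.
    async with sse_client("http://localhost:8000/mcp") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            # Each tool corresponds to one of your FastAPI endpoints.
            print([tool.name for tool in tools.tools])

asyncio.run(main())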

🏁 Conclusion

With fastapi-mcp, you're just a few lines of code away from giving your FastAPI app superpowers. Whether you're building developer tools, AI agents, or chatbot extensions, MCP opens up a new frontier in tool-augmented intelligence.

By enabling structured, callable tools, you're letting your LLM act before it speaks—and that’s the future of AI interaction.

Ready to build your first LLM tool? Try converting one of your existing APIs with FastAPI-MCP and see the magic unfold.


By: vijAI Robotics Desk