fast-mcp-http-server

A minimal Model Context Protocol (MCP) HTTP server built with fastmcp. It exposes a simple tool hello(name: str) -> str and runs an HTTP server on 127.0.0.1:8080.

Requirements

  • Python 3.12+ (see .python-version and pyproject.toml)
  • uv (Python package/dependency manager)
    • Install (macOS/Linux):
      • curl: curl -LsSf https://astral.sh/uv/install.sh | sh
      • Homebrew: brew install uv
    • If you're on Windows or need another install method, follow the instructions in uv's documentation.
  • ngrok (for exposing the server to the internet)
    • Install (macOS/Linux):
      • Homebrew: brew install ngrok
    • If you're on another platform or cannot use Homebrew, download it from ngrok.com
  • To use the server in StackAI, you will need a StackAI account. You can sign up for a free account at StackAI.

Install dependencies (with uv)

  • Install dependencies (creates/uses a virtual env automatically):
    uv sync
    

Run

  • Start the MCP HTTP server:
    uv run main.py
    
  • The server listens on:
    • Host: 127.0.0.1
    • Port: 8080 (you can change this in main.py if you want; see the sketch below)
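
A minimal sketch of what main.py can look like with fastmcp is shown below. The server name and the exact run options here are assumptions; check the actual file before relying on them.

    # Illustrative sketch of main.py; adjust names and options to match the real file.
    from fastmcp import FastMCP

    mcp = FastMCP("fast-mcp-http-server")

    @mcp.tool
    def hello(name: str) -> str:
        return f"Hello, {name}!"

    if __name__ == "__main__":
        # Serve over HTTP on 127.0.0.1:8080; change the port here if needed.
        # Depending on your fastmcp version, the transport may be called
        # "http" or "streamable-http".
        mcp.run(transport="http", host="127.0.0.1", port=8080)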

Expose to Web (using ngrok)

To make your MCP HTTP server accessible from the internet, you can use ngrok to create a secure tunnel:

Install ngrok

  • macOS (with Homebrew):
    brew install ngrok
    
  • Other platforms: Download from ngrok.com

Follow the steps in ngrok's documentation to set up ngrok with your account (you will need to create one if you don't have one yet).

Run with ngrok

  1. Start your MCP server in one terminal:

    uv run main.py
    
  2. Create the ngrok tunnel in another terminal (note that you must specify the port your MCP server is running on, 8080 in this case):

    ngrok http http://localhost:8080
    

    or in case you have a static domain in ngrok:

    ngrok http --url=your-ngrok-domain.ngrok-free.app 8080
    
  3. Check you can reach your MCP server by accessing the /mcp endpoint of the URL provided by ngrok (e.g., http://your-ngrok-domain.ngrok-free.app/mcp).

    Don't worry if you see an error like "message": "Not Acceptable: Client must accept text/event-stream"; this is expected when the endpoint is opened in a browser instead of an MCP client. For a proper check with an MCP client, see the sketch below.
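
    To check the tunnel with an actual MCP client rather than a browser, here is a minimal sketch using fastmcp's bundled Client (the Client API is assumed from recent fastmcp releases; replace the URL placeholder with your own ngrok domain):

    # Sketch: list the tools exposed through the ngrok tunnel.
    import asyncio
    from fastmcp import Client

    async def main():
        # Placeholder URL; use the one ngrok printed for you.
        async with Client("https://your-ngrok-domain.ngrok-free.app/mcp") as client:
            tools = await client.list_tools()
            print("Available tools:", [tool.name for tool in tools])

    asyncio.run(main())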

Security Notes

  • Free tier limitations: ngrok free tier URLs change on restart
  • Authentication: Consider adding authentication to your MCP server for production use
  • HTTPS: ngrok provides HTTPS automatically for secure connections

Use at StackAI

If you want to use this server in a StackAI workflow, follow these steps:

  1. Add a Call MCP Server action to your StackAI workflow.
  2. Set up the connection to your MCP server. Create a connection in the action's settings and use the URL provided by ngrok (e.g., http://your-ngrok-domain.ngrok-free.app/mcp; don't forget to append the /mcp path). Name the connection whatever you prefer.
  3. Run your workflow. With no further configuration, the action lists the server's available tools.

To call a specific tool, select it in the Configurations section of the action's settings sidebar.

Project Structure

  • main.py: Defines the FastMCP server, the hello tool, and starts the HTTP server.
  • pyproject.toml: Project metadata and dependencies (fastmcp).
  • uv.lock: Locked dependency set for reproducible installs.
  • .python-version: Target Python version (3.12).

Notes

  • Example tool:
    @mcp.tool
    def hello(name: str) -> str:
        return f"Hello, {name}!"
  • Connect from an MCP-compatible client using HTTP transport; as noted above, the MCP endpoint is served at the /mcp path (http://127.0.0.1:8080/mcp locally). See the example below.
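
A minimal sketch of calling the hello tool with fastmcp's bundled Client (the Client class and its methods are assumed from recent fastmcp releases):

    import asyncio
    from fastmcp import Client

    async def main():
        async with Client("http://127.0.0.1:8080/mcp") as client:
            result = await client.call_tool("hello", {"name": "World"})
            print(result)  # expected to contain "Hello, World!"

    asyncio.run(main())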
