🤖 Modern Async Python Framework for AI APIs with Native MCP Support
nzrApi is a powerful, production-ready Python framework specifically designed for building AI-powered APIs. It combines the best of modern web frameworks with specialized features for AI model integration, making it the perfect choice for developers who want to build scalable AI services with minimal complexity.
- 🤖 Native AI Model Integration - First-class support for multiple AI providers and custom models
- 🔌 Model Context Protocol (MCP) - Built-in MCP implementation for seamless n8n integration
- ⚡ High Performance - Async/await throughout with ASGI compliance
- 🔄 Context Management - Persistent conversation contexts with automatic cleanup
- 🛡️ Production Ready - Rate limiting, authentication, monitoring, and error handling
- 🗄️ Database Integration - Async SQLAlchemy with automatic migrations
- 🎨 DRF-Inspired Serializers - Familiar, powerful data validation and transformation
- 🚀 Auto-Generation - CLI tools for rapid project scaffolding
- 🐳 Cloud Native - Docker support with production configurations
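The context-management feature above implies a store that persists conversation histories and expires stale ones. nzrApi's actual implementation is not shown here; the idea can be sketched in plain Python (the `ContextStore` class and its `ttl_seconds` parameter are illustrative, not nzrApi API):

```python
import time


class ContextStore:
    """Toy conversation-context store with TTL-based cleanup (illustrative only)."""

    def __init__(self, ttl_seconds: float = 3600.0):
        self.ttl = ttl_seconds
        self._contexts: dict[str, tuple[float, list]] = {}

    def append(self, context_id: str, message: dict) -> None:
        # Store the message and refresh the context's last-used timestamp
        _, history = self._contexts.get(context_id, (0.0, []))
        history.append(message)
        self._contexts[context_id] = (time.monotonic(), history)

    def get(self, context_id: str) -> list:
        self.cleanup()
        entry = self._contexts.get(context_id)
        return entry[1] if entry else []

    def cleanup(self) -> None:
        # Drop every context not touched within the TTL window
        now = time.monotonic()
        stale = [cid for cid, (ts, _) in self._contexts.items() if now - ts > self.ttl]
        for cid in stale:
            del self._contexts[cid]


store = ContextStore(ttl_seconds=0.05)
store.append("conv-1", {"role": "user", "content": "hi"})
history = store.get("conv-1")
```

A real framework would run the cleanup on a background task rather than on every read, but the TTL bookkeeping is the same.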
```bash
pip install nzrapi
```

Create `main.py`:

```python
from nzrapi import NzrApiApp

app = NzrApiApp()  # Zero config! Just works ✨

@app.get("/")
async def hello():
    return {"message": "Hello, World!"}

@app.get("/greet/{name}")
async def greet(name: str):
    return {"message": f"Hello, {name}!"}

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, port=8000)
```

Run it:

```bash
python main.py
# Visit http://localhost:8000/docs for interactive API documentation
```

What you get for free in v1.0.2:
- ✅ Zero Configuration - No boilerplate required
- ✅ Automatic CORS - Enabled by default in debug mode
- ✅ Error Handling - Production-ready error middleware
- ✅ Request Logging - Track all API calls automatically
- ✅ OpenAPI Docs - Interactive documentation at `/docs`
- ✅ Type Validation - Automatic with Python type hints
- ✅ Auto Response Wrapping - Dict/list returns automatically converted to JSON
- ✅ Optional Database - No overhead unless you configure one
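The auto-response-wrapping bullet means a handler can return a plain dict or list and the framework serializes it for you. Conceptually that is just a type check plus `json.dumps`; here is a stdlib-only sketch of the idea (not nzrApi's actual code):

```python
import json


def wrap_response(result):
    """Mimic auto-wrapping: plain dicts/lists become JSON bodies, others pass through."""
    if isinstance(result, (dict, list)):
        body = json.dumps(result)
        return {
            "status": 200,
            "headers": {"content-type": "application/json"},
            "body": body,
        }
    # Anything else is assumed to already be a full response object
    return result


wrapped = wrap_response({"message": "Hello, World!"})
```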
```bash
# Create a new minimal project (single file)
nzrapi new my-api --template minimal

# Or create a full MCP-enabled project
nzrapi new my-ai-api --template mcp_server

# Navigate to the project
cd my-api

# Run the development server
python main.py
```

Your API is now running at http://localhost:8000! 🎉
nzrApi makes it incredibly easy to work with AI models:
```python
from nzrapi.ai.models import AIModel

class MyCustomModel(AIModel):
    async def load_model(self):
        # Load your model (PyTorch, HuggingFace, OpenAI, etc.)
        self.model = load_my_model()
        self.is_loaded = True

    async def predict(self, payload, context=None):
        # Make predictions with optional context
        result = self.model.generate(payload["prompt"])
        return {"response": result}

# Register and use
app.ai_registry.register_model_class("custom", MyCustomModel)
await app.ai_registry.add_model("my_model", "custom", config={...})
```

Supported model providers:

- ✅ OpenAI (GPT-3.5, GPT-4, etc.)
- ✅ Anthropic (Claude models)
- ✅ HuggingFace (Transformers, Inference API)
- ✅ Custom Models (PyTorch, TensorFlow, etc.)
- ✅ Mock Models (for development and testing)
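A mock model is handy in tests because it needs no weights or API keys. Without installing anything, the `AIModel` contract shown above (async `load_model` / `predict`) can be imitated by a plain class; `MockEchoModel` is a hypothetical name for illustration, not part of nzrApi:

```python
import asyncio


class MockEchoModel:
    """Stands in for an AIModel during tests: echoes the prompt back."""

    def __init__(self):
        self.is_loaded = False

    async def load_model(self):
        # Nothing to load; just flip the flag like a real model would
        self.is_loaded = True

    async def predict(self, payload, context=None):
        prompt = payload["prompt"]
        note = f" (context: {context})" if context else ""
        return {"response": f"echo: {prompt}{note}"}


async def main():
    model = MockEchoModel()
    await model.load_model()
    assert model.is_loaded
    return await model.predict({"prompt": "ping"})


result = asyncio.run(main())
```

Registering such a class under a name like `"mock"` lets the rest of the pipeline run unchanged in CI.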
nzrApi implements the Model Context Protocol for stateful AI interactions:
```python
# MCP-compliant endpoint
@router.post("/mcp/{model_name}/predict")
async def mcp_predict(request, model_name: str):
    # Automatic context management
    mcp_request = MCPRequest(**(await request.json()))

    # Retrieve conversation context
    context = await get_context(mcp_request.context_id)

    # Make prediction with context
    model = request.app.ai_registry.get_model(model_name)
    result = await model.predict(mcp_request.payload, context)

    # Return an MCP-compliant response
    return MCPResponse(
        request_id=mcp_request.request_id,
        context_id=mcp_request.context_id,
        result=result,
    )
```

nzrApi provides robust data validation:
```python
from nzrapi.serializers import BaseSerializer, CharField, FloatField
from starlette.responses import JSONResponse  # nzrApi is built on Starlette

class ChatRequestSerializer(BaseSerializer):
    message = CharField(max_length=1000)
    user_id = CharField(required=False)
    temperature = FloatField(min_value=0.0, max_value=2.0, default=0.7)

    def validate(self, data):
        # Custom validation logic
        return data

# Use in endpoints
@router.post("/chat")
async def chat(request):
    data = await request.json()
    serializer = ChatRequestSerializer(data=data)
    if serializer.is_valid():
        validated_data = serializer.validated_data
        # Process with confidence...
    else:
        return JSONResponse(serializer.errors, status_code=422)
```

Built-in async database support with SQLAlchemy:
```python
from datetime import datetime

from sqlalchemy import Column, DateTime, Integer, String, Text

from nzrapi.db import Base

class ConversationHistory(Base):
    __tablename__ = "conversations"

    id = Column(Integer, primary_key=True)
    user_id = Column(String(255), index=True)
    message = Column(Text)
    response = Column(Text)
    created_at = Column(DateTime, default=datetime.utcnow)

# Use in endpoints
@router.post("/chat")
async def chat(request):
    async with request.app.get_db_session() as session:
        # Save the conversation
        conversation = ConversationHistory(
            user_id=user_id,
            message=message,
            response=response,
        )
        session.add(conversation)
        await session.commit()
```

Zero configuration needed! CORS, error handling, and logging are enabled by default:
```python
from nzrapi import NzrApiApp

# Development - allows all origins automatically
app = NzrApiApp(debug=True)

# Production - specify allowed origins
app = NzrApiApp(
    cors_origins=["https://app.n8n.cloud", "https://yourdomain.com"]
)
```

Rate limiting:

```python
from nzrapi.middleware import RateLimitMiddleware

app.add_middleware(
    RateLimitMiddleware,
    calls_per_minute=60,
    calls_per_hour=1000,
)
```

Simplified password hashing (v1.0.2+):
```python
from nzrapi import create_password_hash, check_password_hash

# Hash a password (salting is automatic!)
password_hash = create_password_hash("user_password")

# Verify a password
is_valid = check_password_hash("user_password", password_hash)
```

JWT authentication:
```python
from nzrapi import JWTBearer, create_access_token, Depends

jwt_auth = JWTBearer(secret_key="your-secret", algorithm="HS256")

@router.get("/protected")
async def protected_route(payload: dict = Depends(jwt_auth)):
    return {"user": payload.get("sub"), "message": "Access granted"}
```

nzrApi includes powerful CLI tools for development:
```bash
# Create a new project (v1.0.2+ supports the minimal template)
nzrapi new my-project --template minimal     # Single-file template
nzrapi new my-project --template mcp_server  # Full MCP server
nzrapi new my-project --template api_server  # Standard API server

# Run the development server
nzrapi run --reload --port 8000

# Database migrations
nzrapi migrate -m "Add user table"
nzrapi migrate --upgrade

# Model management
nzrapi models --list
nzrapi models --add openai_gpt4 --type openai

# Project info
nzrapi info
```

Perfect for n8n workflows with built-in MCP support:
```json
{
  "nodes": [{
    "name": "AI Chat",
    "type": "n8n-nodes-base.httpRequest",
    "parameters": {
      "url": "http://your-api.com/api/v1/mcp/gpt4/predict",
      "method": "POST",
      "body": {
        "context_id": "{{ $json.session_id }}",
        "payload": {
          "message": "{{ $json.user_input }}"
        }
      }
    }
  }]
}
```

Built-in monitoring capabilities:
```
# Health checks
GET /health
GET /api/v1/models/{name}/health

# Metrics
GET /metrics
GET /api/v1/stats

# Usage analytics
GET /api/v1/usage/models
GET /api/v1/conversations/{context_id}
```

Production-ready containers:
```dockerfile
FROM python:3.11-slim

COPY requirements.txt .
RUN pip install -r requirements.txt

COPY . .
EXPOSE 8000

CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]
```

```bash
# Build and run
docker build -t my-ai-api .
docker run -p 8000:8000 my-ai-api

# Or use docker-compose
docker-compose up -d
```

Check out our comprehensive examples (all updated for v1.0.2):
- Hello World - Minimal 10-line API (new in v1.0.2!)
- Basic API - Simple AI API with chat functionality
- Typed API - Advanced type validation and serialization
- Security Demo - All authentication methods (JWT, API keys, OAuth2)
- WebSocket Chat - Real-time WebSocket connections
- PostgreSQL API - Full database integration with async SQLAlchemy
- AI Chatbot - Full-featured chatbot with personality
- Dependency Injection - Advanced DI patterns
- n8n Integration - Complete n8n workflow examples
- Quick Start Guide
- API Reference
- AI Model Integration
- MCP Specification
- Deployment Guide
- Migration Guide (v1.0.1 → v1.0.2) - Upgrade instructions
- Changelog - Version history and release notes
- Framework Improvements - Detailed technical improvements
We welcome contributions! Please see our Contributing Guide for details.
```bash
# Development setup
git clone https://github.com/nzrapi/nzrapi.git
cd nzrapi
pip install -e ".[dev]"

# Run tests
pytest

# Run linting
black .
isort .
flake8
```

This project is licensed under the MIT License - see the LICENSE file for details.
- Built on the excellent Starlette foundations
- Designed for seamless n8n integration
- Community-driven development
- Homepage: https://nzrapi.dev
- Documentation: https://nzrapi.readthedocs.io
- PyPI: https://pypi.org/project/nzrapi/
- GitHub: https://github.com/nzrapi/nzrapi
- Discord: https://discord.gg/nzrapi
Zero-Config Framework - 85% Faster Setup!

- ✨ Automatic Middleware - CORS, error handling, and logging enabled by default
- ✨ Optional Request Parameter - Handlers that don't use `request` don't need to declare it
- ✨ Auto Response Wrapping - Dict/list returns automatically wrapped in JSONResponse
- ✨ Simplified Security - `create_password_hash()` / `check_password_hash()` (no manual salt needed!)
- ✨ Smart Database - Optional `database_url` (no overhead when unused)
- ✨ Minimal Template - New single-file template for rapid prototyping
- ✨ Hello World Example - 10-line working API in the examples directory
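The "Optional Request Parameter" item suggests the framework inspects a handler's signature and only passes `request` when the handler declares it. That trick is easy to sketch with the stdlib `inspect` module (illustrative only, not nzrApi's internals):

```python
import asyncio
import inspect


async def call_handler(handler, request):
    """Pass `request` to the handler only if its signature declares it."""
    params = inspect.signature(handler).parameters
    if "request" in params:
        return await handler(request=request)
    return await handler()


async def hello():                 # no request argument needed
    return {"message": "Hello"}


async def whoami(request):         # declares request, so it receives one
    return {"client": request["client"]}


async def main():
    fake_request = {"client": "127.0.0.1"}  # stand-in for a real Request object
    a = await call_handler(hello, fake_request)
    b = await call_handler(whoami, fake_request)
    return a, b


a, b = asyncio.run(main())
```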
See CHANGELOG.md for full release notes and MIGRATION_GUIDE.md for upgrade instructions.
Built with ❤️ for the AI community
nzrApi Framework - Making AI APIs Simple and Powerful