nzrApi Framework

🤖 Modern Async Python Framework for AI APIs with Native MCP Support

PyPI version Python Support License: MIT Build Status Coverage

Documentation | Examples | n8n Integration | Contributing


✨ What is nzrApi?

nzrApi is a powerful, production-ready Python framework specifically designed for building AI-powered APIs. It combines the best of modern web frameworks with specialized features for AI model integration, making it the perfect choice for developers who want to build scalable AI services with minimal complexity.

🎯 Key Features

  • 🤖 Native AI Model Integration - First-class support for multiple AI providers and custom models
  • 🔄 Model Context Protocol (MCP) - Built-in MCP implementation for seamless n8n integration
  • ⚡ High Performance - Async/await throughout with ASGI compliance
  • 📊 Context Management - Persistent conversation contexts with automatic cleanup
  • 🛡️ Production Ready - Rate limiting, authentication, monitoring, and error handling
  • 🗄️ Database Integration - SQLAlchemy async with automatic migrations
  • 🎨 DRF-Inspired Serializers - Familiar, powerful data validation and transformation
  • 🚀 Auto-Generation - CLI tools for rapid project scaffolding
  • 🐳 Cloud Native - Docker support with production configurations

🚀 Quick Start

Installation

pip install nzrapi

Hello World Example (10 Lines!)

Create main.py:

from nzrapi import NzrApiApp

app = NzrApiApp()  # Zero config! Just works ✨

@app.get("/")
async def hello():
    return {"message": "Hello, World!"}

@app.get("/greet/{name}")
async def greet(name: str):
    return {"message": f"Hello, {name}!"}

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, port=8000)

Run it:

python main.py
# Visit http://localhost:8000/docs for interactive API documentation

What you get for free in v1.0.2:

  • ✅ Zero Configuration - No boilerplate required
  • ✅ Automatic CORS - Enabled by default in debug mode
  • ✅ Error Handling - Production-ready error middleware
  • ✅ Request Logging - Track all API calls automatically
  • ✅ OpenAPI Docs - Interactive documentation at /docs
  • ✅ Type Validation - Automatic with Python type hints
  • ✅ Auto Response Wrapping - Dict/list automatically converted to JSON (see the sketch below)
  • ✅ Optional Database - No overhead unless you configure one
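
A quick sketch of the auto response wrapping: handlers can return plain dicts or lists and nzrApi converts them to JSON responses, so no explicit response object is needed. The route paths below are just examples.

# Plain Python data structures are wrapped into JSON responses automatically
@app.get("/tags")
async def tags():
    return ["ai", "mcp", "n8n"]  # served as a JSON array

@app.get("/status")
async def status():
    return {"ok": True, "version": "1.0.2"}  # served as a JSON object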

Create Your First Project

# Create a new minimal project (single file)
nzrapi new my-api --template minimal

# Or create a full MCP-enabled project
nzrapi new my-ai-api --template mcp_server

# Navigate to project
cd my-api

# Run the development server
python main.py

Your API is now running at http://localhost:8000! 🎉

🤖 AI Model Integration

nzrApi makes it incredibly easy to work with AI models:

from nzrapi.ai.models import AIModel

class MyCustomModel(AIModel):
    async def load_model(self):
        # Load your model (PyTorch, HuggingFace, OpenAI, etc.)
        self.model = load_my_model()
        self.is_loaded = True
    
    async def predict(self, payload, context=None):
        # Make predictions with optional context
        result = self.model.generate(payload["prompt"])
        return {"response": result}

# Register and use (add_model is a coroutine, so call it from async
# startup code rather than at module import time)
app.ai_registry.register_model_class("custom", MyCustomModel)
await app.ai_registry.add_model("my_model", "custom", config={...})
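
Once registered, the model can be served from a regular endpoint. A minimal sketch, assuming an @app.post decorator that mirrors the @app.get shown earlier; the registry lookup and predict call follow the same pattern as the MCP example below.

# @app.post is assumed to exist alongside @app.get
@app.post("/my-model/predict")
async def predict(request):
    payload = await request.json()
    # "my_model" is the name passed to add_model above
    model = request.app.ai_registry.get_model("my_model")
    result = await model.predict(payload)
    return result  # dict returns are auto-wrapped into a JSON response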

Supported AI Providers

  • ✅ OpenAI (GPT-3.5, GPT-4, etc.)
  • ✅ Anthropic (Claude models)
  • ✅ HuggingFace (Transformers, Inference API)
  • ✅ Custom Models (PyTorch, TensorFlow, etc.)
  • ✅ Mock Models (for development and testing)

🔄 Model Context Protocol (MCP)

nzrApi implements the Model Context Protocol for stateful AI interactions:

# MCP-compliant endpoint (MCPRequest, MCPResponse, get_context, and router
# come from your project's MCP setup, e.g. the mcp_server template)
@router.post("/mcp/{model_name}/predict")
async def mcp_predict(request, model_name: str):
    # Automatic context management
    mcp_request = MCPRequest(**(await request.json()))
    
    # Retrieve conversation context
    context = await get_context(mcp_request.context_id)
    
    # Make prediction with context
    model = request.app.ai_registry.get_model(model_name)
    result = await model.predict(mcp_request.payload, context)
    
    # Return MCP-compliant response
    return MCPResponse(
        request_id=mcp_request.request_id,
        context_id=mcp_request.context_id,
        result=result
    )
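
For reference, a request body for the endpoint above might look like the dict below. The field names are taken from the handler (request_id, context_id, payload) and are illustrative rather than a complete schema.

# Illustrative MCP request body (fields inferred from the handler above)
mcp_request_body = {
    "request_id": "req-123",
    "context_id": "conversation-42",  # reuse the same id to keep conversation state
    "payload": {"prompt": "Summarize our discussion so far"},
}
# The MCPResponse echoes request_id and context_id and carries the model output in `result`.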

🎨 Powerful Serializers

nzrApi provides robust data validation:

from nzrapi.serializers import BaseSerializer, CharField, FloatField, IntegerField
from starlette.responses import JSONResponse  # nzrApi builds on Starlette (see Acknowledgments)

class ChatRequestSerializer(BaseSerializer):
    message = CharField(max_length=1000)
    user_id = CharField(required=False)
    temperature = FloatField(min_value=0.0, max_value=2.0, default=0.7)
    
    def validate(self, data):
        # Custom validation logic
        return data

# Use in endpoints
@router.post("/chat")
async def chat(request):
    data = await request.json()
    serializer = ChatRequestSerializer(data=data)
    
    if serializer.is_valid():
        validated_data = serializer.validated_data
        # Process with confidence...
    else:
        return JSONResponse(serializer.errors, status_code=422)

πŸ—„οΈ Database Integration

Built-in async database support with SQLAlchemy:

from datetime import datetime

from nzrapi.db import Base
from sqlalchemy import Column, DateTime, Integer, String, Text

class ConversationHistory(Base):
    __tablename__ = "conversations"
    
    id = Column(Integer, primary_key=True)
    user_id = Column(String(255), index=True)
    message = Column(Text)
    response = Column(Text)
    created_at = Column(DateTime, default=datetime.utcnow)

# Use in endpoints
@router.post("/chat")
async def chat(request):
    # user_id, message, and response are assumed to come from the request body
    # and from your model call, respectively
    async with request.app.get_db_session() as session:
        # Save conversation
        conversation = ConversationHistory(
            user_id=user_id,
            message=message,
            response=response
        )
        session.add(conversation)
        await session.commit()
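
The database layer is opt-in: with no configuration there is no overhead. A minimal sketch of enabling it, assuming database_url is passed to the app constructor as described in the v1.0.2 notes; the connection string is only an example.

from nzrapi import NzrApiApp

# Assumption: database_url is accepted by the constructor (per the
# "Optional database_url" feature); replace the DSN with your own.
app = NzrApiApp(database_url="postgresql+asyncpg://user:password@localhost/mydb")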

πŸ›‘οΈ Production Features

Automatic Middleware (v1.0.2+)

Zero configuration needed! CORS, error handling, and logging are enabled by default:

from nzrapi import NzrApiApp

# Development - allows all origins automatically
app = NzrApiApp(debug=True)

# Production - specify allowed origins
app = NzrApiApp(
    cors_origins=["https://app.n8n.cloud", "https://yourdomain.com"]
)

Rate Limiting

from nzrapi.middleware import RateLimitMiddleware

app.add_middleware(
    RateLimitMiddleware,
    calls_per_minute=60,
    calls_per_hour=1000
)

Authentication & Security

Simplified password hashing (v1.0.2+):

from nzrapi import create_password_hash, check_password_hash

# Hash password (salt is automatic!)
password_hash = create_password_hash("user_password")

# Verify password
is_valid = check_password_hash("user_password", password_hash)

JWT Authentication:

from nzrapi import JWTBearer, create_access_token, Depends

jwt_auth = JWTBearer(secret_key="your-secret", algorithm="HS256")

@router.get("/protected")
async def protected_route(payload: dict = Depends(jwt_auth)):
    return {"user": payload.get("sub"), "message": "Access granted"}
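
To issue tokens in the first place, the create_access_token helper imported above can be used at a login endpoint. The sketch below assumes it accepts a claims dict plus the signing secret; check the nzrApi API reference for the exact signature before relying on it.

# Hypothetical login route; create_access_token's signature is assumed here
@app.post("/login")
async def login(request):
    body = await request.json()
    # ... verify credentials, e.g. with check_password_hash(...) ...
    token = create_access_token({"sub": body["username"]}, secret_key="your-secret")
    return {"access_token": token, "token_type": "bearer"}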

🔧 CLI Tools

nzrApi includes powerful CLI tools for development:

# Create new project (v1.0.2+ supports minimal template)
nzrapi new my-project --template minimal      # Single-file template
nzrapi new my-project --template mcp_server   # Full MCP server
nzrapi new my-project --template api_server   # Standard API server

# Run development server
nzrapi run --reload --port 8000

# Database migrations
nzrapi migrate -m "Add user table"
nzrapi migrate --upgrade

# Model management
nzrapi models --list
nzrapi models --add openai_gpt4 --type openai

# Project info
nzrapi info

🌐 n8n Integration

Perfect for n8n workflows with built-in MCP support:

{
  "nodes": [{
    "name": "AI Chat",
    "type": "n8n-nodes-base.httpRequest",
    "parameters": {
      "url": "http://your-api.com/api/v1/mcp/gpt4/predict",
      "method": "POST",
      "body": {
        "context_id": "{{ $json.session_id }}",
        "payload": {
          "message": "{{ $json.user_input }}"
        }
      }
    }
  }]
}

📊 Monitoring & Observability

Built-in monitoring capabilities:

# Health checks
GET /health
GET /api/v1/models/{name}/health

# Metrics
GET /metrics
GET /api/v1/stats

# Usage analytics
GET /api/v1/usage/models
GET /api/v1/conversations/{context_id}
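
A quick way to poke these endpoints from Python while developing, assuming the server is running locally on port 8000 and exposes /health as listed above:

import json
from urllib.request import urlopen

# Simple liveness check against the local dev server
with urlopen("http://localhost:8000/health") as resp:
    print(json.loads(resp.read()))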

🐳 Docker Deployment

Production-ready containers:

FROM python:3.11-slim
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
EXPOSE 8000
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]

# Build and run
docker build -t my-ai-api .
docker run -p 8000:8000 my-ai-api

# Or use docker-compose
docker-compose up -d

📚 Examples

Check out our comprehensive examples (all updated for v1.0.2):

📖 Documentation

🤝 Contributing

We welcome contributions! Please see our Contributing Guide for details.

# Development setup
git clone https://github.com/nzrapi/nzrapi.git
cd nzrapi
pip install -e ".[dev]"

# Run tests
pytest

# Run linting
black .
isort .
flake8

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

πŸ™ Acknowledgments

  • Built on the excellent Starlette foundations
  • Designed for seamless n8n integration
  • Community-driven development

🔗 Links


🎉 What's New in v1.0.2

Zero-Config Framework - 85% Faster Setup!

  • ✨ Automatic Middleware - CORS, error handling, and logging enabled by default
  • ✨ Optional Request Parameter - Functions that don't use the request don't need to declare it
  • ✨ Auto Response Wrapping - Dict/list returns automatically wrapped in JSONResponse
  • ✨ Simplified Security - create_password_hash() / check_password_hash() (no manual salt handling!)
  • ✨ Smart Database - Optional database_url (no overhead when unused)
  • ✨ Minimal Template - New single-file template for rapid prototyping
  • ✨ Hello World Example - 10-line working API in the examples directory

See CHANGELOG.md for full release notes and MIGRATION_GUIDE.md for upgrade instructions.


Built with ❤️ for the AI community

nzrApi Framework - Making AI APIs Simple and Powerful
