Business Process Monitoring & Management Platform
A REST API solution for real-time business process visualization, monitoring, and management.
The Process Dashboard API provides a platform for:
- Integration - RESTful design for seamless third-party integration
- Real-time Process Monitoring - Track business process execution across multiple systems
- Retry Management - Automated failure recovery with configurable retry strategies
- Security - Role-based API authentication with audit trails
- Data Retention - Configurable retention policies with GDPR-compliant data neutralization
- Features - Advanced filtering, search, and dashboard capabilities
```
┌─────────────────┐      ┌──────────────────┐      ┌─────────────────┐
│   Client Apps   │─────>│   FastAPI Core   │─────>│   SQL Server    │
└─────────────────┘      └──────────────────┘      └─────────────────┘
                                  │
                                  ▼
                         ┌──────────────────┐
                         │  Docker Runtime  │
                         └──────────────────┘
```
| Component | Technology | Purpose |
|---|---|---|
| API Framework | FastAPI 0.104+ | High-performance async web framework |
| Data Layer | SQLModel + SQLAlchemy | Type-safe ORM with automatic validation |
| Database | Microsoft SQL Server | Enterprise-grade persistence layer |
| Authentication | Custom JWT + API Keys | Multi-tier security with role-based access |
| Deployment | Docker + Compose | Containerized deployment and scaling |
| Package Management | UV | Fast Python dependency resolution |
```mermaid
graph TD
    A[Client Request] --> B{API Key Valid?}
    B -->|No| C[401 Unauthorized]
    B -->|Yes| D{Role Check}
    D -->|Admin| E[Full Access]
    D -->|User| F[Data Access Only]
    E --> G[Resource Access]
    F --> G
    G --> H[Audit Log]
```
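The decision flow above can be sketched as a small pure function. This is illustrative only (the names `authorize` and `AuthResult` are not from the codebase); the real middleware presumably also handles audit logging and key expiry.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AuthResult:
    status_code: int        # HTTP status the API would return
    scope: Optional[str]    # granted scope, or None when rejected

def authorize(key_valid: bool, role: Optional[str]) -> AuthResult:
    """Mirror the flow: invalid key -> 401; admin -> full access;
    user -> data access only. Rejecting unknown roles with 403 is an
    assumption, not documented behavior."""
    if not key_valid:
        return AuthResult(401, None)            # 401 Unauthorized
    if role == "admin":
        return AuthResult(200, "full_access")   # Full Access
    if role == "user":
        return AuthResult(200, "data_access")   # Data Access Only
    return AuthResult(403, None)                # insufficient permissions
```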
- Docker Desktop 4.0+
- Git
- Network access to SQL Server instance
1. **Clone and Configure**

   ```bash
   git clone https://github.com/AAK-MBU/Process_Dashboard_API.git
   cd Process_Dashboard_API

   # Configure environment
   cp .env.example .env
   # Edit .env with your database connection details
   ```

2. **Deploy with Docker**

   ```bash
   docker-compose up -d
   ```

3. **Run Database Migrations** (if upgrading from an older version)

   ```bash
   # Connect to your SQL Server and run migration scripts
   sqlcmd -S your-server -d process_monitor -i scripts/migrations/001_add_soft_delete_and_retention.sql

   # Or use Azure Data Studio / SQL Server Management Studio
   # See scripts/migrations/README.md for details
   ```

4. **Initialize Security**

   ```bash
   # Create admin API key
   docker exec process-visualization-api uv run python scripts/add_api_key_roles.py
   # Note the returned admin key for initial setup
   ```

5. **Verify Installation**

   ```bash
   curl -H "Authorization: Bearer YOUR_ADMIN_KEY" \
     http://localhost:8000/api/v1/auth/verify
   ```
```bash
# .env file example
DATABASE_URL=mssql+pyodbc://username:password@server/database?driver=ODBC+Driver+18+for+SQL+Server
SECRET_KEY=your-super-secret-key-here
API_HOST=0.0.0.0
API_PORT=8000
DEBUG=false
```
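The parsing rules implied by these settings (string, int, and boolean values with the defaults shown) can be sketched without any dependency. The function name `load_settings` is hypothetical; the project itself more likely uses a Pydantic settings class.

```python
import os
from typing import Optional

def load_settings(env: Optional[dict] = None) -> dict:
    """Read the settings shown above from the environment.
    DATABASE_URL and SECRET_KEY are required; the rest fall back to the
    defaults used in the example .env file."""
    env = dict(os.environ) if env is None else env
    return {
        "database_url": env["DATABASE_URL"],          # required
        "secret_key": env["SECRET_KEY"],              # required
        "api_host": env.get("API_HOST", "0.0.0.0"),
        "api_port": int(env.get("API_PORT", "8000")),
        "debug": env.get("DEBUG", "false").lower() == "true",
    }
```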
The API implements a two-tier role-based access control system:
| Role | Capabilities | Use Cases |
|---|---|---|
| **Administrator** | Full API access, API key management, user administration, system configuration | System administrators, DevOps teams |
| **User** | Process data access, read/write operations, dashboard access, monitoring capabilities | Applications, end users, integrations |
```http
POST /api/v1/api-keys/
Authorization: Bearer {ADMIN_KEY}
Content-Type: application/json

{
  "name": "Production Application Key",
  "description": "API key for production dashboard application",
  "role": "user",
  "expires_at": "2025-12-31T23:59:59Z"
}
```

Response:

```json
{
  "id": 5,
  "name": "Production Application Key",
  "role": "user",
  "key_prefix": "pd_xxxxxxxx",
  "key": "pd_xxxxxxxxxxxxxxxxxxxxxxxx",
  "created_at": "2025-10-03T10:30:00Z",
  "expires_at": "2025-12-31T23:59:59Z"
}
```
Security Notice: API keys are only displayed once during creation. Store them securely.
```http
GET /api/v1/api-keys/
Authorization: Bearer {ADMIN_KEY}
```

```http
DELETE /api/v1/api-keys/{key_id}
Authorization: Bearer {ADMIN_KEY}
```

```http
GET /api/v1/auth/verify
Authorization: Bearer {API_KEY}
```

Response:

```json
{
  "status": "valid",
  "message": "API key is valid",
  "key_info": {
    "id": 5,
    "name": "Production Application Key",
    "role": "user",
    "usage_count": 127,
    "last_used_at": "2025-10-03T10:29:45Z",
    "is_active": true
  }
}
```

```http
GET /api/v1/auth/me
Authorization: Bearer {API_KEY}
```
The API implements comprehensive data retention and privacy management capabilities to support GDPR compliance and data lifecycle management.
- Soft Delete - Recoverable deletion with retention periods
- Configurable Retention - Set custom retention periods per process (e.g., 6, 12, 48 months)
- Automatic Neutralization - Scheduled removal of personally identifiable information (PII)
- Audit Trail - Complete tracking of all deletion and neutralization operations
```http
PUT /api/v1/processes/{process_id}/retention
Authorization: Bearer {ADMIN_KEY}
Content-Type: application/json

{
  "retention_months": 12
}
```
Common Retention Periods:
- `6` - 6 months (short-term operational data)
- `12` - 1 year (standard business processes)
- `48` - 4 years (regulatory compliance requirements)
- `null` - No automatic retention (manual cleanup only)
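The retention lifecycle described below computes `scheduled_deletion_at` as `created_at + retention_months`. Calendar-month arithmetic needs day clamping (Jan 31 + 1 month has no Feb 31); a stdlib sketch, with hypothetical function names:

```python
import calendar
from datetime import datetime
from typing import Optional

def add_months(ts: datetime, months: int) -> datetime:
    """Shift a timestamp by whole calendar months, clamping the day of
    month (e.g. Jan 31 + 1 month -> Feb 28/29)."""
    month_index = ts.month - 1 + months
    year = ts.year + month_index // 12
    month = month_index % 12 + 1
    day = min(ts.day, calendar.monthrange(year, month)[1])
    return ts.replace(year=year, month=month, day=day)

def scheduled_deletion_at(created_at: datetime,
                          retention_months: Optional[int]) -> Optional[datetime]:
    """None retention means no automatic cleanup, per the list above."""
    if retention_months is None:
        return None
    return add_months(created_at, retention_months)
```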
```http
DELETE /api/v1/processes/{process_id}
Authorization: Bearer {ADMIN_KEY}
```

Behavior:

- Process marked as deleted (not permanently removed)
- All associated runs and steps marked as deleted
- Data remains recoverable until the retention period expires

```http
POST /api/v1/processes/{process_id}/restore
Authorization: Bearer {ADMIN_KEY}
```

```http
DELETE /api/v1/runs/{run_id}
Authorization: Bearer {ADMIN_KEY}
```

```http
POST /api/v1/runs/{run_id}/restore
Authorization: Bearer {ADMIN_KEY}
```

```http
POST /api/v1/runs/{run_id}/neutralize
Authorization: Bearer {ADMIN_KEY}
```
Neutralization Process:
- `entity_id` - Replaced with `"NEUTRALIZED_{run_id}"`
- `entity_name` - Set to `null`
- `meta` - Sensitive fields removed from the JSON metadata
- `is_neutralized` - Flag set to `true` (irreversible)
Example Before:

```json
{
  "entity_id": "12345678-1234",
  "entity_name": "John Doe",
  "meta": {
    "email": "john.doe@example.com",
    "phone": "+45 12 34 56 78",
    "address": "Main Street 123",
    "department": "Sales",
    "region": "EMEA"
  }
}
```

Example After:

```json
{
  "entity_id": "NEUTRALIZED_42",
  "entity_name": null,
  "meta": {
    "department": "Sales",
    "region": "EMEA"
  },
  "is_neutralized": true
}
```
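The neutralization rules can be sketched as a pure transformation. The set of sensitive meta keys here is an assumption for illustration; the real service presumably maintains its own list.

```python
# Assumed PII keys -- the actual service's list may differ.
SENSITIVE_META_FIELDS = {"email", "phone", "address", "name", "cpr"}

def neutralize_run(run: dict, run_id: int) -> dict:
    """Apply the documented rules: replace entity_id, null entity_name,
    drop sensitive meta keys, and set the irreversible flag."""
    meta = {k: v for k, v in (run.get("meta") or {}).items()
            if k not in SENSITIVE_META_FIELDS}
    return {
        **run,
        "entity_id": f"NEUTRALIZED_{run_id}",
        "entity_name": None,
        "meta": meta,
        "is_neutralized": True,
    }
```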
```http
GET /api/v1/admin/cleanup/stats?limit=10
Authorization: Bearer {ADMIN_KEY}
```

Response:

```json
{
  "total_due_for_neutralization": 150,
  "sample_run_ids": [42, 43, 44, 45, 46, 47, 48, 49, 50, 51]
}
```

```http
POST /api/v1/admin/cleanup/neutralize?limit=100&dry_run=false
Authorization: Bearer {ADMIN_KEY}
```

Parameters:

- `limit` - Maximum number of runs to process (default: 100, max: 1000)
- `dry_run` - Preview only, without making changes (default: true)

Response:

```json
{
  "neutralized_count": 87,
  "failed_count": 0,
  "neutralized_run_ids": [42, 43, 44, ...],
  "failed_run_ids": []
}
```
```
Process Run Created
        │
        ▼
[scheduled_deletion_at = created_at + retention_months]
        │
        ▼
Time Passes
        │
        ▼
[scheduled_deletion_at reached]
        │
        ▼
Admin Triggers Cleanup
        │
        ▼
[PII Neutralized - Irreversible]
        │
        ▼
[Run retained for statistical/audit purposes]
```
```http
GET /api/v1/processes/
Authorization: Bearer {API_KEY}

# Query Parameters
?limit=50&offset=0
```

```http
POST /api/v1/processes/
Authorization: Bearer {API_KEY}
Content-Type: application/json

{
  "name": "Customer Onboarding",
  "meta": {
    "description": "Automated customer onboarding workflow",
    "department": "Customer Success",
    "owner": "system.automation",
    "version": "2.1.0"
  }
}
```
```http
GET /api/v1/processes/{process_id}
Authorization: Bearer {API_KEY}
```

```http
POST /api/v1/runs/
Authorization: Bearer {API_KEY}
Content-Type: application/json

{
  "process_id": 1,
  "entity_id": "CUST-20251003-001",
  "entity_name": "Acme Corporation",
  "meta": {
    "initiated_by": "api.integration",
    "priority": "high",
    "environment": "production",
    "correlation_id": "req-20251003-12345"
  }
}
```
```http
GET /api/v1/runs/
Authorization: Bearer {API_KEY}

# Advanced Filtering
?entity_name=Acme
&status=completed
&created_after=2025-10-01T00:00:00Z
&created_before=2025-10-31T23:59:59Z
&meta_filter=priority:high,environment:production
&limit=100
&offset=0
```

```http
GET /api/v1/runs/{run_id}
Authorization: Bearer {API_KEY}
```
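The `meta_filter` parameter uses comma-separated `key:value` pairs. A sketch of building and parsing that format on the client side (these helpers are illustrative, not the server's actual parser):

```python
def parse_meta_filter(raw: str) -> dict:
    """'priority:high,environment:production' -> {'priority': 'high', ...}
    Values are split on the first colon only, so values may contain colons."""
    pairs = (item.split(":", 1) for item in raw.split(",") if item)
    return {key: value for key, value in pairs}

def build_meta_filter(filters: dict) -> str:
    """Inverse of parse_meta_filter, for composing query strings."""
    return ",".join(f"{k}:{v}" for k, v in filters.items())
```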
```http
GET /api/v1/steps/process/{process_id}
Authorization: Bearer {API_KEY}
```

```http
GET /api/v1/steps/process/{process_id}/rerunnable
Authorization: Bearer {API_KEY}
```

```http
PATCH /api/v1/step-runs/{step_run_id}
Authorization: Bearer {API_KEY}
Content-Type: application/json

{
  "status": "success",
  "started_at": "2025-10-03T10:15:00Z",
  "finished_at": "2025-10-03T10:17:30Z",
  "meta": {
    "processing_time_ms": 150000,
    "records_processed": 1247,
    "error_details": null,
    "output_location": "s3://bucket/outputs/run-123.json"
  }
}
```

```http
POST /api/v1/step-runs/{step_run_id}/rerun
Authorization: Bearer {API_KEY}
Content-Type: application/json

{
  "meta": {
    "rerun_reason": "Temporary network issue resolved",
    "initiated_by": "admin.user",
    "rerun_strategy": "full_retry"
  }
}
```

Response:

```json
{
  "id": 156,
  "original_step_run_id": 123,
  "status": "pending",
  "created_at": "2025-10-03T10:30:00Z",
  "message": "Step rerun initiated successfully"
}
```
Process steps can be configured for intelligent rerun capabilities:
```json
{
  "is_rerunnable": true,
  "rerun_config": {
    "subprocess_id": "data_extraction_v2",
    "api_endpoint": "https://api.external-system.com/process",
    "timeout_seconds": 300,
    "max_retries": 3,
    "retry_delays": [60, 120, 300],
    "failure_conditions": ["network_error", "timeout", "rate_limit"]
  }
}
```
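The `max_retries` / `retry_delays` fields above suggest a backoff loop like the following. This is a sketch, not the service's implementation; the operation and the `sleep` function are injected so the logic stays testable.

```python
from typing import Callable, List

def run_with_retries(operation: Callable[[], str],
                     retry_delays: List[int],
                     max_retries: int,
                     sleep: Callable[[float], None]) -> str:
    """Call `operation`; on failure, wait the next delay from
    retry_delays and try again, up to max_retries retries.
    Re-raises the last error when retries are exhausted."""
    attempt = 0
    while True:
        try:
            return operation()
        except Exception:
            if attempt >= max_retries:
                raise
            # Follow the configured schedule; reuse the last delay if
            # there are more retries than delays.
            delay = retry_delays[min(attempt, len(retry_delays) - 1)]
            sleep(delay)
            attempt += 1
```

With `retry_delays=[60, 120, 300]` and `max_retries=3`, a step that fails twice and then succeeds waits 60 s, then 120 s, before completing.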
```mermaid
sequenceDiagram
    participant U as User
    participant API as Process API
    participant DB as Database
    participant EXT as External System
    U->>API: POST /step-runs/{id}/rerun
    API->>DB: Validate step is rerunnable
    API->>DB: Create new StepRun record
    API->>EXT: Execute configured subprocess
    EXT-->>API: Return results
    API->>DB: Update step status
    API->>U: Return rerun status
```
```http
GET /api/v1/dashboard/process/{process_id}
Authorization: Bearer {API_KEY}
```

Response:

```json
{
  "process": {
    "id": 1,
    "name": "Customer Onboarding",
    "total_steps": 8
  },
  "runs": [
    {
      "id": 123,
      "entity_name": "Acme Corp",
      "status": "completed",
      "success_rate": 87.5
    }
  ],
  "statistics": {
    "total_runs": 1247,
    "completed_runs": 1156,
    "failed_runs": 45,
    "running_runs": 46,
    "average_completion_time": "00:14:32"
  }
}
```
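The `statistics` block is a straightforward aggregation over run records. A sketch of the counting logic, using the field names from the response above (`compute_statistics` is a hypothetical name; `average_completion_time` is omitted since it needs timestamps):

```python
from typing import List

def compute_statistics(runs: List[dict]) -> dict:
    """Aggregate run statuses into the dashboard's statistics shape."""
    completed = sum(1 for r in runs if r["status"] == "completed")
    failed = sum(1 for r in runs if r["status"] == "failed")
    running = sum(1 for r in runs if r["status"] == "running")
    return {
        "total_runs": len(runs),
        "completed_runs": completed,
        "failed_runs": failed,
        "running_runs": running,
    }
```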
Create a `.env` file in your project root:
```bash
# Database Configuration
DATABASE_URL=mssql+pyodbc://username:password@server/database?driver=ODBC+Driver+18+for+SQL+Server
DB_ECHO=false

# Security Configuration
SECRET_KEY=your-super-secret-key-here-change-in-production
ALGORITHM=HS256

# Application Configuration
API_HOST=0.0.0.0
API_PORT=8000
DEBUG=false
ENVIRONMENT=production
API_V1_PREFIX=/api/v1

# CORS Configuration
CORS_ORIGINS=["http://localhost:3000","https://dashboard.company.com"]
```
For Docker deployments, use environment variables:
```yaml
# docker-compose.yml
version: '3.8'
services:
  api:
    build: .
    environment:
      - DATABASE_URL=mssql+pyodbc://sa:YourPassword@db/ProcessDashboard?driver=ODBC+Driver+18+for+SQL+Server
      - SECRET_KEY=production-secret-key-change-me
      - API_HOST=0.0.0.0
      - API_PORT=8000
      - DEBUG=false
    ports:
      - "8000:8000"
    depends_on:
      - db
    restart: unless-stopped
  db:
    image: mcr.microsoft.com/mssql/server:2022-latest
    environment:
      - SA_PASSWORD=YourPassword123!
      - ACCEPT_EULA=Y
    ports:
      - "1433:1433"
```
```bash
# Clone repository
git clone https://github.com/AAK-MBU/Process_Dashboard_API.git
cd Process_Dashboard_API

# Set up virtual environment
python -m venv .venv
source .venv/bin/activate  # Windows: .venv\Scripts\activate

# Install dependencies with uv
uv sync

# Set up environment
cp .env.example .env
# Edit .env with your configuration

# Initialize database
python scripts/migrate_db.py

# Start development server
uvicorn app.main:app --reload --host 0.0.0.0 --port 8000
```
- Docker & Docker Compose
- SQL Server (2019+ recommended)
- SSL Certificate (for HTTPS)
- Load Balancer (for high availability)
```sql
-- Create database
CREATE DATABASE ProcessDashboard;

-- Create application user
CREATE LOGIN process_api WITH PASSWORD = 'SecurePassword123!';

USE ProcessDashboard;
CREATE USER process_api FOR LOGIN process_api;
EXEC sp_addrolemember 'db_datareader', 'process_api';
EXEC sp_addrolemember 'db_datawriter', 'process_api';
EXEC sp_addrolemember 'db_ddladmin', 'process_api';
```
```bash
# Create first admin API key
docker exec -it process-dashboard-api python -c "
from app.database import get_session
from app.models import ApiKey

session = next(get_session())
admin_key = ApiKey.create_key(
    name='Initial Admin Key',
    description='Bootstrap admin key for initial setup',
    role='admin'
)
session.add(admin_key)
session.commit()
print(f'Admin Key: {admin_key.key}')
"
```
```http
GET /health
```

Response:

```json
{
  "status": "healthy",
  "timestamp": "2025-10-03T10:30:00Z",
  "version": "1.0.0",
  "database": "connected",
  "uptime": "7d 12h 30m"
}
```
```http
GET /
```

Response:

```json
{
  "name": "Process Dashboard API",
  "version": "1.0.0",
  "description": "Enterprise process monitoring and management API",
  "environment": "production",
  "features": [
    "api_key_authentication",
    "role_based_access",
    "step_rerun_system",
    "metadata_filtering"
  ]
}
```
```http
GET /api/v1/auth/usage-stats
Authorization: Bearer {API_KEY}
```

Response:

```json
{
  "key_id": 5,
  "usage_count": 1247,
  "last_used_at": "2025-10-03T10:29:45Z",
  "daily_usage": 156,
  "weekly_usage": 892,
  "top_endpoints": [
    {"endpoint": "/api/v1/runs/", "count": 45},
    {"endpoint": "/api/v1/processes/", "count": 12}
  ]
}
```
```bash
# Test database connectivity
docker exec -it process-dashboard-api python -c "
from app.database import engine
try:
    engine.connect()
    print('Database connection successful')
except Exception as e:
    print(f'Database connection failed: {e}')
"
```
Solutions:
- Verify connection string format
- Check SQL Server authentication mode
- Ensure ODBC driver 18+ is installed
- Verify network connectivity and firewall rules
A rejected key returns:

```json
{
  "detail": "Invalid or expired API key"
}
```
Debugging Steps:
- Verify the key format starts with `pd_`
- Check the key expiration date with `/api/v1/auth/verify`
- Verify the key is active in the database
- Check rate limiting status
```bash
# Manual migration execution
docker exec -it process-dashboard-api python scripts/migrate_db.py

# Add role support to an existing installation
docker exec -it process-dashboard-api python scripts/add_api_key_roles.py
```
Common Fixes:
- Ensure database user has DDL permissions
- Check for table name conflicts
- Verify SQL Server compatibility mode
```json
{
  "detail": "Descriptive error message",
  "status_code": 400,
  "error_type": "validation_error",
  "timestamp": "2025-10-03T10:30:00Z"
}
```
- 200 - Success
- 201 - Created
- 400 - Bad Request / Validation Error
- 401 - Invalid or Missing API Key
- 403 - Insufficient Permissions (Wrong Role)
- 404 - Resource Not Found
- 422 - Unprocessable Entity
- 429 - Rate Limited
- 500 - Internal Server Error
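A client can map the status codes above onto typed exceptions using the error shape shown earlier. A sketch (the exception names are my own, not part of the API):

```python
class ApiError(Exception):
    """Base error carrying the API's status code and detail message."""
    def __init__(self, status_code: int, detail: str):
        super().__init__(f"{status_code}: {detail}")
        self.status_code = status_code
        self.detail = detail

class AuthError(ApiError): ...        # 401 / 403
class NotFoundError(ApiError): ...    # 404

def raise_for_response(status_code: int, body: dict) -> None:
    """Raise a typed error for non-2xx responses; 2xx passes through."""
    if 200 <= status_code < 300:
        return
    detail = body.get("detail", "unknown error")
    if status_code in (401, 403):
        raise AuthError(status_code, detail)
    if status_code == 404:
        raise NotFoundError(status_code, detail)
    raise ApiError(status_code, detail)
```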
- **Storage Security**
  - Never log full API keys
  - Use secure key management systems
  - Implement key escrow for critical operations
- **Access Control**
  - Follow the principle of least privilege
  - Separate admin and user keys by function
  - Regular access reviews and audits
- **HTTPS Only**
  - Force SSL/TLS encryption
  - Use HSTS headers
  - Implement certificate pinning
When the server is running, access:
- Swagger UI: http://localhost:8000/docs
- ReDoc: http://localhost:8000/redoc
- OpenAPI Schema: http://localhost:8000/openapi.json
```http
# Filter by custom metadata fields
GET /api/v1/runs/?meta_filter=department:Sales,priority:high,region:EMEA
Authorization: Bearer {API_KEY}

# Filter by time periods
GET /api/v1/runs/?created_after=2025-10-01T00:00:00Z&created_before=2025-10-31T23:59:59Z
Authorization: Bearer {API_KEY}

# Get available filterable fields for a process
GET /api/v1/processes/{process_id}/searchable-fields
Authorization: Bearer {API_KEY}
```
The project uses semantic versioning (MAJOR.MINOR.PATCH) with centralized version management.
Version Location: `pyproject.toml` (single source of truth)
Use the Python script to update version across all project files:
```bash
# Bump patch version (1.0.0 -> 1.0.1)
python scripts/update_version.py patch

# Bump minor version (1.0.0 -> 1.1.0)
python scripts/update_version.py minor

# Bump major version (1.0.0 -> 2.0.0)
python scripts/update_version.py major

# Set specific version
python scripts/update_version.py 2.1.0
```
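The bump semantics behind these commands follow standard semver rules (lower components reset on a higher bump). A sketch of that logic, not the script's actual code:

```python
def bump_version(current: str, part: str) -> str:
    """Bump a MAJOR.MINOR.PATCH string; `part` is 'major', 'minor', or
    'patch'. Lower components reset to zero, per semantic versioning."""
    major, minor, patch = (int(x) for x in current.split("."))
    if part == "major":
        return f"{major + 1}.0.0"
    if part == "minor":
        return f"{major}.{minor + 1}.0"
    if part == "patch":
        return f"{major}.{minor}.{patch + 1}"
    raise ValueError(f"unknown part: {part!r}")
```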
Updated Files:

- `pyproject.toml` - Project metadata
- `.env.example` - Environment template
- `Dockerfile` - Container configuration
- `documentation/api_reference.md` - API documentation

Complete Guide: See `scripts/VERSION_MANAGEMENT.md` for detailed documentation.
- API Reference: Interactive Docs
- Architecture Guide: System Design
- Visualization Examples: Dashboard Examples
- Data Retention Guide: Step Run Functionality
- Version Management: Version Management Guide
- Database Migrations: Migration Scripts
- FastAPI Documentation: https://fastapi.tiangolo.com/
- SQLModel Guide: https://sqlmodel.tiangolo.com/
- Docker Best Practices: https://docs.docker.com/develop/best-practices/
This project is licensed under the MIT License - see the LICENSE file for details.