🧭 Epic
Title: Portable Configuration Export & Import
Goal: Allow operators to dump Gateways (Registry), Virtual Servers and Prompts to a single file and replay that file through the public Admin API on another MCP Gateway instance.
Why now: Simplifies backup/restore, blue-green upgrades and multi-cluster promotion of vetted configs.
🧭 Type of Feature
- New functionality
🙋‍♂️ User Story 1
As a: Platform engineer
I want: a CLI sub-command `mcpgateway export --out config-YYYYMMDD.json`
So that: I can capture the current Gateways, Servers and Prompts (optionally filtered by labels) in one shot.
✅ Acceptance Criteria
Scenario: Default export
Given a running MCP Gateway with 3 gateways, 5 virtual servers and 12 prompts
When I run `mcpgateway export --out file.json`
Then the file must contain "version":"2025-03-26"
And three objects under $.gateways
And five objects under $.servers
And twelve objects under $.prompts
And every object MUST include its `auth_type` and encrypted `auth_value` if present
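For reference, a throwaway script that checks an export file against this scenario. It is illustrative only; the counts come from the Given-clause above and the field names follow the annotated example at the end of this issue:

```python
import json

# Illustrative check of an export against the acceptance criteria above.
# The counts (3/5/12) match this scenario's Given-clause; adjust for your data.
with open("file.json") as fh:
    doc = json.load(fh)

assert doc["version"] == "2025-03-26"
assert len(doc["gateways"]) == 3
assert len(doc["servers"]) == 5
assert len(doc["prompts"]) == 12

# Any object that carries credentials must expose auth_type plus the
# encrypted auth_value, never plaintext secrets.
for obj in doc["gateways"] + doc["servers"]:
    if "auth_type" in obj:
        assert "auth_value" in obj

print("export looks sane")
```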
🙋‍♂️ User Story 2
As a: SRE restoring into a fresh cluster
I want: `mcpgateway import file.json`
So that: the target Gateway recreates every resource idempotently (update-or-create).
✅ Acceptance Criteria
Scenario: Idempotent import
Given an empty target Gateway
When I run `mcpgateway import file.json`
Then the API should POST /gateways, /servers, /prompts in that order
And if a name already exists it should PATCH instead of POST
And the command must finish with exit code 0
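A sketch of the update-or-create loop this implies, using `requests`. The base URL is a placeholder, and the assumption that a duplicate name surfaces as HTTP 409 (and that resources are addressable by name) is mine, not confirmed by the Admin API:

```python
import json

import requests

BASE = "http://localhost:4444"              # placeholder Admin API base URL
ORDER = ["gateways", "servers", "prompts"]  # POST order from the scenario

def upsert(kind: str, obj: dict) -> None:
    """POST the object; fall back to PATCH if the name already exists."""
    resp = requests.post(f"{BASE}/{kind}", json=obj)
    if resp.status_code == 409:  # assumed: duplicate name signalled as 409
        # assumed: resources addressable by name for PATCH
        requests.patch(f"{BASE}/{kind}/{obj['name']}", json=obj).raise_for_status()
    else:
        resp.raise_for_status()

with open("file.json") as fh:
    doc = json.load(fh)

for kind in ORDER:
    for obj in doc.get(kind, []):
        upsert(kind, obj)
```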
🙋‍♂️ User Story 3
As a: Security officer
I want: encrypted credentials to stay opaque in the export
So that: we never leak plaintext secrets in source control.
✅ Acceptance Criteria
Scenario: Encryption boundary
Given a prompt that calls a tool with Basic auth
When it is exported
Then the resulting JSON must show "auth_value":"<base64-AES…>"
And NOT show "username" nor "password"
And import must succeed as long as AUTH_ENCRYPTION_SECRET is identical
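A blunt spot-check for this boundary, scanning the raw export for forbidden plaintext keys; illustrative only:

```python
import json

with open("file.json") as fh:
    raw = fh.read()

# The export must stay opaque: no plaintext credential fields anywhere.
for forbidden in ('"username"', '"password"'):
    assert forbidden not in raw, f"plaintext secret leaked: {forbidden}"

# auth_value entries should look like opaque base64 blobs, not structured JSON.
doc = json.loads(raw)
for obj in doc["gateways"] + doc["servers"]:
    if "auth_value" in obj:
        assert not obj["auth_value"].startswith("{")
```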
📦 Export File Schema (via jq or JSON Schema)
Every block uses the same field names returned by GET /gateways, GET /servers and GET /prompts, so import can stream the objects right back with POST/PATCH requests; an annotated example export appears at the end of this issue.
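A minimal skeleton schema, written as a Python dict and checked with the `jsonschema` package. Only the top-level shape is pinned down here, since the per-object fields simply mirror the Admin API GET responses; the file name is a placeholder:

```python
import json

from jsonschema import validate  # pip install jsonschema

# Skeleton schema for the export file; per-object fields are left open
# because they mirror the Admin API GET responses verbatim.
EXPORT_SCHEMA = {
    "type": "object",
    "required": ["version", "exported_at", "gateways", "servers", "prompts"],
    "properties": {
        "version": {"const": "2025-03-26"},
        "exported_at": {"type": "string", "format": "date-time"},
        "gateways": {"type": "array", "items": {"type": "object"}},
        "servers": {"type": "array", "items": {"type": "object"}},
        "prompts": {"type": "array", "items": {"type": "object"}},
    },
}

with open("file.json") as fh:
    validate(instance=json.load(fh), schema=EXPORT_SCHEMA)
```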
🔐 Encryption Notes
- `auth_value` is produced by `encode_auth()` (AES-256-GCM, key derived from `AUTH_ENCRYPTION_SECRET`) and parsed by `decode_auth()` during import (see the sketch after this list).
- Keep the passphrase identical across environments; otherwise re-encrypt before import.
- Non-secret fields remain plaintext for diff-friendliness.
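For intuition, a minimal sketch of what the `encode_auth()` / `decode_auth()` pair could look like with the `cryptography` package. The SHA-256 key derivation and the nonce-prefix layout here are assumptions, not the gateway's actual implementation:

```python
import base64
import hashlib
import json
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def _key(secret: str) -> bytes:
    # Assumption: derive a 256-bit key from AUTH_ENCRYPTION_SECRET via SHA-256.
    return hashlib.sha256(secret.encode()).digest()

def encode_auth(creds: dict, secret: str) -> str:
    """Encrypt credentials into the opaque base64 string stored in auth_value."""
    nonce = os.urandom(12)  # 96-bit nonce, as AES-GCM expects
    ct = AESGCM(_key(secret)).encrypt(nonce, json.dumps(creds).encode(), None)
    return base64.b64encode(nonce + ct).decode()

def decode_auth(token: str, secret: str) -> dict:
    """Reverse of encode_auth; fails unless the identical secret is supplied."""
    raw = base64.b64decode(token)
    nonce, ct = raw[:12], raw[12:]
    return json.loads(AESGCM(_key(secret)).decrypt(nonce, ct, None))
```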
📐 Design Sketch
```mermaid
flowchart TD
    CLI["mcpgateway export"]
    APIAdmin[/Admin API/]
    DB[(SQLite/Postgres)]
    CLI -- SQLAlchemy read --> DB
    CLI -- JSON file --> FS[Filesystem]
    FS -- file --> CLI2["mcpgateway import"]
    CLI2 -- REST --> APIAdmin
    APIAdmin -- ORM write --> DB
```
| Component | Change | Detail |
|---|---|---|
| mcpgateway.cli.export.py | NEW | Assemble JSON, respect --filter label=prod, pretty-print |
| mcpgateway.cli.import.py | NEW | Read file, validate schema, upsert via internal Admin service |
| Admin API | None | Re-use existing /gateways, /servers, /prompts endpoints |
| Docs | Add examples | Include sample export, encryption caveats |
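A rough cut of the export assembly for `mcpgateway.cli.export`. For brevity this reads through the existing GET endpoints rather than the SQLAlchemy path shown in the sketch; the base URL and the label-filter behaviour are placeholders:

```python
import datetime
import json

import requests

BASE = "http://localhost:4444"  # placeholder Admin API base URL

def export_config(out_path: str, label: str | None = None) -> None:
    doc = {
        "version": "2025-03-26",
        "exported_at": datetime.datetime.now(datetime.timezone.utc)
                       .isoformat(timespec="seconds").replace("+00:00", "Z"),
    }
    for kind in ("gateways", "servers", "prompts"):
        items = requests.get(f"{BASE}/{kind}").json()
        if label:  # hypothetical --filter label=... behaviour
            items = [i for i in items if label in i.get("labels", [])]
        doc[kind] = items
    with open(out_path, "w") as fh:
        json.dump(doc, fh, indent=2)  # pretty-print for diff-friendliness

export_config("mcp-config.json")
```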
🔄 Alternatives Considered
| Alternative | Pros | Cons | 
|---|---|---|
| DB dump via pg_dump | Fast, proven | Couples to SQL dialect, exposes secrets in plaintext | 
| Helm chart values | GitOps-friendly | Loses runtime edits, can’t carry encrypted auth | 
| Separate files per resource type | Fine-grained git history | Harder to ensure referential integrity | 
📓 Additional Context
- Encryption key configurable (`AUTH_ENCRYPTION_SECRET`) in `.env` or Helm values.
- The same JSON can be pushed to multiple environments for canary testing.
- Import respects naming uniqueness to avoid collisions.
Usage quick-start

```bash
# backup
mcpgateway export --out mcp-config-$(date +%F).json

# restore into a new cluster
export AUTH_ENCRYPTION_SECRET=$(cat /run/secrets/mcp_secret)
mcpgateway import mcp-config-2025-06-26.json
```

Fill in the placeholders, commit the file to your infra-as-code repo, and you have a portable, encrypted snapshot ready for re-use.
{ "version": "2025-03-26", "exported_at": "2025-06-26T12:34:56Z", "gateways": [ { "name": "hq-east", "url": "https://east.example.net", "description": "HQ primary", "auth_type": "basic", "auth_value": "<encrypted>", // AES-GCM + base64 "capabilities": { ... }, "is_active": true } // … ], "servers": [ { "name": "openai-proxy", "url": "https://api.openai.com/v1/chat/completions", "description": "Wrapped OpenAI endpoint", "headers": { "OpenAI-Org": "my-org" }, "auth_type": "bearer", "auth_value": "<encrypted>", "health_url": "https://api.openai.com/v1/models" } // … ], "prompts": [ { "name": "summarise_doc", "template": "Summarise the following text:\n{{ text }}", "input_schema": { "type": "object", "properties": { "text": { "type": "string" } } }, "description": "Generic text-summariser" } // … ] }