This repository contains MCP (Model Context Protocol) servers for integrating with OpenAI's o1 model and the Flux image generation model.
The o1 server enables interaction with OpenAI's o1-preview model over MCP.
{
  "mcpServers": {
    "openai": {
      "command": "openai-server",
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key"
      }
    }
  }
}
Key features:
- Direct access to o1-preview model
- Streaming support
- Temperature and top_p parameter control
- System message configuration
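A minimal sketch of driving this server from the MCP Python SDK is shown below. The SDK calls (`StdioServerParameters`, `stdio_client`, `ClientSession`) are real, but the tool name `o1_chat` and its argument names are assumptions for illustration only; list the server's tools to see its actual schema.

```python
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch the o1 server as a subprocess, passing the API key via its environment.
    params = StdioServerParameters(
        command="openai-server",
        env={"OPENAI_API_KEY": os.environ["OPENAI_API_KEY"]},
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools the server actually exposes.
            tools = await session.list_tools()
            print([t.name for t in tools.tools])

            # Hypothetical tool name and arguments -- adjust to the server's schema.
            result = await session.call_tool(
                "o1_chat",
                {
                    "system": "You are a concise assistant.",
                    "prompt": "Summarize the MCP handshake in one sentence.",
                    "temperature": 0.2,
                    "top_p": 0.9,
                },
            )
            print(result)

asyncio.run(main())
```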
The Flux server exposes Flux image generation (accessed via Replicate) through MCP.
{
  "mcpServers": {
    "flux": {
      "command": "flux-server",
      "env": {
        "REPLICATE_API_TOKEN": "your-replicate-token"
      }
    }
  }
}
Key features:
- State-of-the-art (SOTA) image generation with the Flux model
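Calling the Flux server follows the same client pattern as the o1 example above. The sketch below assumes an already-initialized `ClientSession` connected to `flux-server`; the tool name `generate_image` and its parameters are assumptions, not the server's documented schema.

```python
# Assumes `session` is an initialized mcp.ClientSession connected to "flux-server"
# (see the o1 example above). Tool name and arguments are hypothetical.
result = await session.call_tool(
    "generate_image",
    {
        "prompt": "A lighthouse at dusk, photorealistic",
        "aspect_ratio": "16:9",
    },
)

# The result carries content blocks (text and/or image data) returned by the tool.
for block in result.content:
    print(block)
```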
- Clone or fork the repository:
  git clone https://github.com/AllAboutAI-YT/mcp-servers.git
- Set the required environment variables in your .env file:
  OPENAI_API_KEY=your_openai_api_key_here
  REPLICATE_API_TOKEN=your_replicate_token_here
- Start the servers using the configurations above.
- Store API keys securely
- Use environment variables for sensitive data
- Follow security best practices in SECURITY.md
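As one way to follow these practices, the sketch below loads keys from a local .env file into the process environment using python-dotenv; the library choice is an assumption, and any equivalent loader works.

```python
import os

from dotenv import load_dotenv  # pip install python-dotenv

# Read key-value pairs from a local .env file into the process environment,
# so secrets never need to be hard-coded or committed to version control.
load_dotenv()

openai_key = os.environ.get("OPENAI_API_KEY")
replicate_token = os.environ.get("REPLICATE_API_TOKEN")

if not openai_key or not replicate_token:
    raise RuntimeError("Missing OPENAI_API_KEY or REPLICATE_API_TOKEN in the environment")
```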
MIT License - See LICENSE file for details.