server/
├── src/
│ ├── mcp/ # MCP server implementation
│ │ ├── mcpServer.ts # Core MCP server with tool definitions
│ │ └── mcpBedrockBridge.ts # Bridge between MCP and Bedrock
│ └── services/
│ └── mcpBedrockService.ts # Bedrock integration service
frontend/
└── src/
└── components/ # React components for the chat UI
- Node.js (v18+) and npm installed
- AWS account with Bedrock access
- AWS CLI configured with appropriate credentials
- Clone the repository:
git clone https://github.com/yourusername/ai-chat-demo.git
cd ai-chat-demo
- Create a `.env` file in the project root (see the config-loading sketch after these setup steps):
# AWS Bedrock configuration
AWS_REGION=eu-central-1
BEDROCK_MODEL_ID=anthropic.claude-3-sonnet-20240229-v1:0
# Server settings
PORT=3001
- Install all dependencies:
npm run install:all
- Start the application (both backend and frontend):
npm run dev
- Open your browser and visit http://localhost:5173
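The server is expected to read the `.env` values above at startup. Below is a minimal sketch of how that could look, assuming `dotenv` is used; the file name, export, and defaults are illustrative rather than the repository's actual config code:

```typescript
// config.ts – hypothetical helper; names and defaults are assumptions, not this repo's code.
import "dotenv/config"; // populate process.env from the .env file

export const config = {
  awsRegion: process.env.AWS_REGION ?? "eu-central-1",
  bedrockModelId:
    process.env.BEDROCK_MODEL_ID ?? "anthropic.claude-3-sonnet-20240229-v1:0",
  port: Number(process.env.PORT ?? 3001),
};
```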
To debug the MCP server directly using the inspector:
cd server
npx @modelcontextprotocol/inspector node dist/mcp/mcpServerRunner.js
This will open a web interface where you can:
- View available tools
- Execute tool calls
- Inspect the JSON-RPC messages
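Under the hood these interactions are plain MCP JSON-RPC messages. The exchange below is illustrative only; the `get_weather` tool and its arguments are made-up examples, not necessarily tools that this demo registers:

```typescript
// Illustrative MCP JSON-RPC traffic as seen in the inspector (hypothetical tool).

// 1. The client asks the server which tools it exposes.
const listToolsRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/list",
};

// 2. The client invokes one of the advertised tools with JSON arguments.
const callToolRequest = {
  jsonrpc: "2.0",
  id: 2,
  method: "tools/call",
  params: { name: "get_weather", arguments: { city: "Berlin" } },
};

// 3. The server replies with the tool result as MCP content blocks.
const callToolResponse = {
  jsonrpc: "2.0",
  id: 2,
  result: { content: [{ type: "text", text: "Sunny, 24°C" }] },
};
```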
This repository contains a demo application that shows how to build an AI agent using Model Context Protocol (MCP) with AWS Bedrock. It demonstrates how to integrate powerful language models with external tools through a standardized protocol.
This application illustrates several AI interaction patterns:
- Normal AI Chat: Simple conversation with a language model without external capabilities.
- AI Workflows: Predefined sequences of AI and tool operations with human design.
- AI Agents: Dynamic systems that decide when and how to use tools based on user requests.

The code highlights the key differences in architecture, capabilities, and use cases for each approach.
The project explores the evolution of function calling in language models:
- Traditional function calling ties tool definitions and invocation handling to each model provider's API, so tools must be re-implemented per model.
- MCP (Model Context Protocol) offers several advantages:
  - Provides a standardized interface for tools across different models.
  - Decouples tool implementation from model providers, enhancing flexibility.
  - Improves security by standardizing tool interactions.
  - Simplifies the development and maintenance of AI agents.

This demo showcases how MCP enhances the agent development experience.
The application includes a standalone MCP server that implements the tools available to the AI agent. Key aspects include:
- Using the MCP Inspector to visualize and debug the server.
- Defining tool schemas (using JSON Schema) and their corresponding implementations.
- Demonstrating the JSON-RPC protocol for tool discovery and execution.
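As a rough sketch of what such a tool definition can look like with the official TypeScript SDK (`@modelcontextprotocol/sdk`): the `get_weather` tool here is hypothetical, and the repo's `mcpServer.ts` may declare its schemas as raw JSON Schema rather than zod (the SDK converts the zod shape into the JSON Schema advertised to clients):

```typescript
// mcpServerSketch.ts – minimal stdio MCP server with one example tool (illustrative only).
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "ai-chat-demo-tools", version: "1.0.0" });

// Tool schema (exposed to clients as JSON Schema) and implementation live side by side.
server.tool(
  "get_weather",
  { city: z.string().describe("City to look up") },
  async ({ city }) => ({
    content: [{ type: "text", text: `Pretend forecast for ${city}: sunny, 24°C` }],
  })
);

// Serve over stdio so the MCP Inspector (or the Bedrock bridge) can connect.
const transport = new StdioServerTransport();
await server.connect(transport);
```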
A crucial part of this application is the bridge between MCP and AWS Bedrock:
- Why translation is necessary between MCP's tool-call format and Bedrock's native tool-use format.
- How the bridge component is architected.
- How data flows through the system (sketched below):
  - The user request is sent to Bedrock.
  - Bedrock's tool-use request is translated into an MCP tool call.
  - The MCP tool executes, and its result is formatted back for Bedrock.
- How multi-turn conversations involving tool use are handled.
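A condensed sketch of that loop, assuming the AWS SDK's Converse API (`@aws-sdk/client-bedrock-runtime`) and an already-connected MCP client; the function name, parameters, and the error handling omitted here are assumptions, not the repo's actual `mcpBedrockBridge.ts`:

```typescript
import {
  BedrockRuntimeClient,
  ConverseCommand,
  type Message,
  type Tool,
} from "@aws-sdk/client-bedrock-runtime";
import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

const bedrock = new BedrockRuntimeClient({ region: process.env.AWS_REGION });

// Hypothetical bridge loop: keeps calling Bedrock until it answers without a tool request.
export async function chatWithTools(
  mcpClient: Client,   // MCP client already connected to the tool server
  messages: Message[], // conversation so far, ending with the latest user turn
  tools: Tool[]        // MCP tool schemas already translated to Bedrock toolSpec format
): Promise<Message[]> {
  while (true) {
    // 1. Send the conversation plus the tool config to Bedrock.
    const response = await bedrock.send(
      new ConverseCommand({
        modelId: process.env.BEDROCK_MODEL_ID!,
        messages,
        toolConfig: { tools },
      })
    );

    const assistantMessage = response.output!.message!;
    messages.push(assistantMessage);

    // Done when the model answers without requesting a tool.
    if (response.stopReason !== "tool_use") return messages;

    // 2. Translate each Bedrock toolUse block into an MCP tools/call.
    const toolResults = [];
    for (const block of assistantMessage.content ?? []) {
      if (!block.toolUse) continue;
      const result = await mcpClient.callTool({
        name: block.toolUse.name!,
        arguments: block.toolUse.input as Record<string, unknown>,
      });

      // 3. Format the MCP result back into Bedrock's toolResult shape.
      toolResults.push({
        toolResult: {
          toolUseId: block.toolUse.toolUseId!,
          content: [{ json: result }],
        },
      });
    }

    // 4. Feed the tool results back as a user turn and let the model continue.
    messages.push({ role: "user", content: toolResults });
  }
}
```

The other half of the bridge, not shown here, maps the schemas returned by the MCP server's tools/list into the Bedrock toolSpec entries passed in as `tools`.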
The application provides a complete end-to-end demonstration of an AI agent built with MCP and Bedrock, along with key takeaways and best practices for building such systems.