pip install -r requirements.txt
- Create a `Recipes` directory in the `learn-mcp` directory.
- From `learn-mcp`, run `python servers/recipes/server.py` (a hedged sketch of such a server is shown after this list).
- Export your OpenAI API key by running the following: `export OPENAI_API_KEY="..."`
- Run the command `llama stack run run.yml`.
- Get the `host` and `port` from the llama-stack server.
- Run the following command: `python chat.py <host> <port>` (a sketch of what this client step might look like also follows the list).
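
The repo's `servers/recipes/server.py` is not reproduced here, so the following is only a rough orientation: a minimal MCP server built with the `FastMCP` helper from the official `mcp` Python SDK, exposing one hypothetical `get_recipe` tool. The actual server in the repo may be structured differently.

```python
# Minimal sketch of an MCP recipes server, assuming the FastMCP helper from
# the official `mcp` Python SDK; the real servers/recipes/server.py may differ.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("recipes")


# Hypothetical tool for illustration only; not taken from the repo.
@mcp.tool()
def get_recipe(name: str) -> str:
    """Return a recipe by name."""
    recipes = {"pancakes": "Mix flour, eggs, and milk; fry in a hot pan."}
    return recipes.get(name, f"No recipe found for {name!r}.")


if __name__ == "__main__":
    # stdio is the SDK's default transport; the repo's server may use SSE/HTTP instead.
    mcp.run()
```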
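
`chat.py` already ships with the repo, so this is only a guess at its shape: it presumably takes the host and port as command-line arguments and connects a `LlamaStackClient` (from the `llama-stack-client` package) to the running stack. The model id and the `inference.chat_completion` call below are illustrative assumptions, not the repo's actual code.

```python
# Hedged sketch of how chat.py might use the <host> <port> arguments;
# the real script in the repo may differ.
import sys

from llama_stack_client import LlamaStackClient


def main() -> None:
    host, port = sys.argv[1], sys.argv[2]
    client = LlamaStackClient(base_url=f"http://{host}:{port}")

    # List the models the stack actually serves before picking one.
    for model in client.models.list():
        print("available model:", model.identifier)

    response = client.inference.chat_completion(
        model_id="meta-llama/Llama-3.2-3B-Instruct",  # placeholder, not from the repo
        messages=[{"role": "user", "content": "Suggest a quick dinner recipe."}],
    )
    print(response.completion_message.content)


if __name__ == "__main__":
    main()
```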