This is flaskchat, which uses multiple LLMs to write stories, hold philosophical discussions, or do anything else you set the prompts for.
I use it to experiment with multi-model strategies: chain of thought, tree of thought, and other alternate reasoning approaches.
Two parts: flaskapi_gpt.py and main.py (or the variants that use different backend storage).
Uses openai and cohere, which offer good models at reasonable API prices.
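The multi-model conversation idea can be sketched as a loop that alternates between backends, feeding each model the previous reply. The helper and the stub backends below are illustrative, not the repo's actual code; the real app wires in the openai and cohere clients where the lambdas are:

```python
# Sketch of an alternating multi-model loop (hypothetical helper names;
# the real app calls the openai and cohere APIs instead of these stubs).

def converse(backends, opening_prompt, turns=4):
    """Alternate between model backends, feeding each one the
    previous reply, and return the full transcript."""
    transcript = [opening_prompt]
    for i in range(turns):
        backend = backends[i % len(backends)]  # round-robin between models
        reply = backend(transcript[-1])
        transcript.append(reply)
    return transcript

# Stand-ins for real model calls:
gpt = lambda prompt: f"gpt: riffing on '{prompt}'"
command = lambda prompt: f"cohere: riffing on '{prompt}'"

story = converse([gpt, command], "Once upon a time...", turns=4)
```

Swapping the prompt-passing rule (e.g. passing the whole transcript instead of just the last reply) is where the chain-of-thought and tree-of-thought variations come in.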
python main.py # for list storage
python main-mongoback.py # for mongodb or ferretdb backend
python main-redisback.py # for redis backend
In a separate window:
python flaskapi_gpt.py # for main application
- python >= 3.9
- cohere and openai API keys
- cohere (Python client)
- openai (Python client)
- redis server, if using the redis backend
- mongodb server, if using the mongodb backend
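Before starting the app, it helps to confirm the API keys are set. This snippet assumes the conventional environment variable names (`OPENAI_API_KEY`, `COHERE_API_KEY`); adjust if the code reads keys from somewhere else:

```python
# Sanity-check that the expected API keys are set before launching.
# The env var names are the conventional ones, an assumption here.
import os

def missing_keys(required=("OPENAI_API_KEY", "COHERE_API_KEY")):
    """Return the names of any required env vars that are unset."""
    return [name for name in required if not os.environ.get(name)]

if __name__ == "__main__":
    missing = missing_keys()
    if missing:
        raise SystemExit(f"Set these env vars first: {', '.join(missing)}")
```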
- Clone the repository.
git clone https://github.com/jtatman/flaskchat.git
- Create a virtual environment
Use micromamba, why not?
"${SHELL}" <(curl -L micro.mamba.pm/install.sh)
micromamba create -n flaskchat python=3.9
or venv
python -m venv .venv
source .venv/bin/activate
- Install requirements
pip install -r requirements.txt
Assumes mongo and redis servers run at localhost or in docker. If not, adjust the connection settings in the respective files.
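One way to avoid editing the files by hand is to read the hosts from the environment and fall back to localhost. The env var names and ports below are illustrative assumptions, not what the repo actually reads:

```python
# Sketch of making the backend hosts configurable instead of hard-coded.
# MONGO_HOST / REDIS_HOST are hypothetical env var names; the default
# ports are the standard mongo (27017) and redis (6379) ones.
import os

def backend_urls():
    """Build mongo and redis connection URLs from the environment,
    defaulting to localhost as the app assumes."""
    mongo_host = os.environ.get("MONGO_HOST", "localhost")
    redis_host = os.environ.get("REDIS_HOST", "localhost")
    return {
        "mongo": f"mongodb://{mongo_host}:27017",
        "redis": f"redis://{redis_host}:6379/0",
    }
```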
Docker or podman
- To run mongo version:
docker compose -f docker-compose-mongo.yml build # build
docker compose -f docker-compose-mongo.yml up # bring up in screen or a spare terminal
docker compose -f docker-compose-mongo.yml up -d # bring up in daemon mode
- To run redis version:
docker compose -f docker-compose-redis.yml build # build
docker compose -f docker-compose-redis.yml up # bring up in screen or a spare terminal
docker compose -f docker-compose-redis.yml up -d # bring up in daemon mode
- Podman:
pip3 install podman-compose
podman-compose -f docker-compose-mongo.yml up -d
# or
podman-compose -f docker-compose-redis.yml up -d
- Kubernetes:
Use Kompose to convert the compose yamls and proceed as needed. Use k0s or k3s for prototyping. Export pods wherever. Have fun.
MIT