Merge pull request #49 from semanser/multi-llm-support
Add multi LLMs support
semanser authored Apr 4, 2024
2 parents 47d0d31 + 63b7817 commit 520eb91
Showing 47 changed files with 2,321 additions and 873 deletions.
38 changes: 14 additions & 24 deletions README.md
@@ -14,47 +14,37 @@
- 🤳 Self-hosted
- 💅 Modern UI

-# Usage
+# Getting started
The simplest way to run Codel is to use a pre-built Docker image. You can find the latest image on the [Github Container Registry](https://github.com/semanser/codel/pkgs/container/codel).


> [!IMPORTANT]
-> Don't forget to set the required environment variables.
+> You need to use a corresponding environment variable in order to use any of the supported language models.
You can run the Docker image with the following command. Remove or change the environment variables according to your needs.
```bash
-docker run -d \
--e OPEN_AI_KEY=<your_open_ai_key> \
+docker run \
+-e OPEN_AI_KEY=your_open_ai_key \
+-e OPEN_AI_MODEL=gpt-4-0125-preview \
+-e OLLAMA_MODEL=llama2 \
-p 3000:8080 \
-v /var/run/docker.sock:/var/run/docker.sock \
ghcr.io/semanser/codel:latest
```

-Alternatively, you can create a .env file and run the Docker image with the following command:
-```bash
-docker run -d \
---env-file .env \
--p 3000:8080 \
--v /var/run/docker.sock:/var/run/docker.sock \
-ghcr.io/semanser/codel:latest
-```
+Alternatively, you can create a `.env` file and run the Docker image with the `--env-file` flag. More information can be found [here](https://docs.docker.com/reference/cli/docker/container/run/#env)
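
For example, a minimal sketch of the `--env-file` approach (placeholder values shown; set only the variables you actually need):

```bash
# Create a .env file with placeholder values (use your own key and models),
# then pass it to the container with --env-file.
cat > .env <<'EOF'
OPEN_AI_KEY=your_open_ai_key
OPEN_AI_MODEL=gpt-4-0125-preview
OLLAMA_MODEL=llama2
EOF

docker run \
  --env-file .env \
  -p 3000:8080 \
  -v /var/run/docker.sock:/var/run/docker.sock \
  ghcr.io/semanser/codel:latest
```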

Now you can visit [localhost:3000](localhost:3000) in your browser and start using Codel.

<details>
-<summary>Required environment variables</summary>
-
-- `OPEN_AI_KEY` - OpenAI API key
-</details>
-
-<details>
-<summary>Optional environment variables</summary>
-
-- `OPEN_AI_MODEL` - OpenAI model (default: gpt-4-0125-preview). The list of supported OpenAI models can be found [here](https://pkg.go.dev/github.com/sashabaranov/go-openai#pkg-constants).
-- `DATABASE_URL` - PostgreSQL database URL (eg. `postgres://user:password@localhost:5432/database`)
-- `DOCKER_HOST` - Docker SDK API (eg. `DOCKER_HOST=unix:///Users/<my-user>/Library/Containers/com.docker.docker/Data/docker.raw.sock`) [more info](https://stackoverflow.com/a/62757128/5922857)
-- `PORT` - Port to run the server in the Docker container (default: 8080)
+<summary>Supported environment variables</summary>
+
+* `OPEN_AI_KEY` - OpenAI API key. You can get the key [here](https://platform.openai.com/account/api-keys).
+* `OPEN_AI_MODEL` - OpenAI model (default: gpt-4-0125-preview). The list of supported OpenAI models can be found [here](https://pkg.go.dev/github.com/sashabaranov/go-openai#pkg-constants).
+* `OPEN_AI_SERVER_URL` - OpenAI server URL (default: https://api.openai.com/v1). Change this URL if you are using an OpenAI compatible server.
+* `OLLAMA_MODEL` - locally hosted Ollama model (default: https://ollama.com/model). The list of supported Ollama models can be found [here](https://ollama.com/models).
+See backend [.env.example](./backend/.env.example) for more details.

</details>
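
For instance, the `OPEN_AI_SERVER_URL` variable listed above can point Codel at an OpenAI-compatible server. A sketch of such a run; the endpoint URL is a placeholder for whatever compatible server you operate:

```bash
# Same run command as above, but targeting an OpenAI-compatible endpoint.
# Replace the placeholder URL with the address of your own server.
docker run \
  -e OPEN_AI_KEY=your_api_key \
  -e OPEN_AI_MODEL=gpt-4-0125-preview \
  -e OPEN_AI_SERVER_URL=http://host.docker.internal:8000/v1 \
  -p 3000:8080 \
  -v /var/run/docker.sock:/var/run/docker.sock \
  ghcr.io/semanser/codel:latest
```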

# Development
16 changes: 13 additions & 3 deletions backend/.env.example
@@ -1,6 +1,16 @@
-OPEN_AI_KEY=
-OPEN_AI_SERVER_URL=
+# General
DATABASE_URL=
DOCKER_HOST=
+PORT=
+
+# OpenAI
+OPEN_AI_KEY=
+OPEN_AI_SERVER_URL=
+OPEN_AI_MODEL=
-PORT=
+
+# Ollama
+OLLAMA_MODEL=
+
+# Goose
+GOOSE_DRIVER=
+GOOSE_DBSTRING=
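
For reference, a filled-in `.env` based on the new example might look like the sketch below; all values are illustrative (the Postgres URL mirrors the example formerly shown in the README, and the Goose entries assume migrations run against the same database):

```bash
# Illustrative values only; adjust for your environment.
# General
DATABASE_URL=postgres://user:password@localhost:5432/database
DOCKER_HOST=unix:///var/run/docker.sock
PORT=8080

# OpenAI
OPEN_AI_KEY=your_open_ai_key
OPEN_AI_SERVER_URL=https://api.openai.com/v1
OPEN_AI_MODEL=gpt-4-0125-preview

# Ollama
OLLAMA_MODEL=llama2

# Goose (assumed: same Postgres database as above)
GOOSE_DRIVER=postgres
GOOSE_DBSTRING=postgres://user:password@localhost:5432/database
```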
