Add ENV variable validation before running the Discord bot (#42)
* feat: added env validation with pydantic

* fix: add updated poetry config files

* fix: fixed ruff format errors

* fix: ruff formatted

* fix: remove unused vars, ignore mypy error

* chore: improve message when failed

* improve: remove Field default, add reason for mypy ignore errors

* chore: ruff formatting

* fix: resolve mypy error cleaner, add a mypy plugin

* chore: clean .env.example

* fix: correct .env.example

* refactor: move env vars validation to a separate file

* chore: ruff formatting

* fix: litellm config to direct phi to tinydolphin in testing

* refactor: only get Settings class

* fix (work in devcontainer): change path, shorten import

* chore: use Settings() even cleaner

* docs: add explanation

* test: model name is still phi

---------

Co-authored-by: Nhat Quang Nguyen <nngu0112@student.monash.edu>
Co-authored-by: Rodo <rodonguyendd@gmail.com>
3 people authored Mar 10, 2024
1 parent d53db42 commit 1a5f928
Showing 8 changed files with 39 additions and 12 deletions.
1 change: 0 additions & 1 deletion .env.example
@@ -1,6 +1,5 @@
# LiteLLM/OpenAI
AI_SERVER_URL="http://localhost:8000"
OPENAI_API_KEY="ee-example8888888888888888" # needed for litellm to talk to OpenAI

# Discord Bot
DISCORD_BOT_TOKEN="EXAMPLETokenTTTTT.Grz0fK.IUi1IONPcLHZsnyjROTt8lR2fuYCeK4grzRoSQ"
21 changes: 20 additions & 1 deletion poetry.lock

Some generated files are not rendered by default.

3 changes: 3 additions & 0 deletions pyproject.toml
@@ -12,6 +12,8 @@ litellm = {extras = ["proxy"], version = "^1.23.7"}
openai = "^1.11.1"
python-dotenv = "^1.0.1"
discord-py-interactions = "5.11.0"
pydantic-settings = "^2.2.1"
pydantic = "^2.6.3"

[tool.poetry.group.dev.dependencies]
pre-commit = "^3.6.1"
@@ -79,6 +81,7 @@ skip-magic-trailing-comma = false
line-ending = "lf"

[tool.mypy]
plugins = [ "pydantic.mypy" ]
strict = true
check_untyped_defs = true
disallow_untyped_defs = true
10 changes: 4 additions & 6 deletions src/discord_bot/bot.py
@@ -1,13 +1,11 @@
import os
import interactions
import dotenv
from discord_bot.llm import answer_question
from llm import answer_question
from settings import Settings

dotenv.load_dotenv()

MODEL_CHOICES = ["gpt-3.5-turbo", "gpt-4", "phi"]
DISCORD_BOT_TOKEN = os.getenv("DISCORD_BOT_TOKEN")
AI_SERVER_URL = os.getenv("AI_SERVER_URL") or "http://localhost:8000"
DISCORD_BOT_TOKEN = Settings().DISCORD_BOT_TOKEN
AI_SERVER_URL = Settings().AI_SERVER_URL

bot = interactions.Client(intents=interactions.Intents.DEFAULT)

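For context, a minimal sketch (not part of this commit; it assumes the module layout shown above and folds the two Settings() calls into a single instance) of what the top of bot.py does after this change: the environment is validated once at import time instead of being read piecemeal with os.getenv.

import interactions
from llm import answer_question
from settings import Settings

settings = Settings()  # validates DISCORD_BOT_TOKEN / AI_SERVER_URL at import time

MODEL_CHOICES = ["gpt-3.5-turbo", "gpt-4", "phi"]
DISCORD_BOT_TOKEN = settings.DISCORD_BOT_TOKEN
AI_SERVER_URL = settings.AI_SERVER_URL

bot = interactions.Client(intents=interactions.Intents.DEFAULT)
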
4 changes: 2 additions & 2 deletions src/discord_bot/llm.py
@@ -14,8 +14,8 @@ async def answer_question(model: str, question: str, server_url: str) -> str:
model=model,
messages=[{"role": "user", "content": question}],
)
out = response.choices[0].message.content or "No response from the model"

out = response.choices[0].message.content or "No response from the model. Please try again"
return out

except Exception as e:
return f"Error: {e}"
8 changes: 8 additions & 0 deletions src/discord_bot/settings.py
@@ -0,0 +1,8 @@
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
model_config = SettingsConfigDict(env_file=".env", env_file_encoding="utf-8")

DISCORD_BOT_TOKEN: str
AI_SERVER_URL: str = "http://localhost:8000"
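A minimal usage sketch (not part of the commit) of the behaviour this class provides: constructing Settings() fails fast with a pydantic ValidationError when DISCORD_BOT_TOKEN is missing from both the environment and .env, while AI_SERVER_URL falls back to its default.

from pydantic import ValidationError

from settings import Settings

try:
    settings = Settings()  # reads environment variables and the .env file
except ValidationError as exc:
    # Surfaces the missing/invalid variables before the bot is started
    raise SystemExit(f"Invalid environment configuration:\n{exc}")

print(settings.AI_SERVER_URL)  # "http://localhost:8000" unless overridden
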
2 changes: 1 addition & 1 deletion src/litellm/proxy_config_for_testing.yaml
@@ -1,5 +1,5 @@
model_list:
- model_name: tinydolphin
- model_name: phi # Route requests intended for the `phi` model to `tinydolphin` instead since only `phi` option (not `tinydolphin`) is available in `src/discord_bot/bot.py`
litellm_params:
model: ollama/tinydolphin

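To make the routing concrete, a hedged example (assuming a LiteLLM proxy running locally on port 8000 with this testing config; the client call mirrors the OpenAI-compatible style used in llm.py, and the API key is only a placeholder): callers keep requesting the phi model name and the proxy serves the request with ollama/tinydolphin.

from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000", api_key="placeholder-for-local-proxy")

response = client.chat.completions.create(
    model="phi",  # mapped to ollama/tinydolphin by proxy_config_for_testing.yaml
    messages=[{"role": "user", "content": "Respond shortly: hello!"}],
)
print(response.choices[0].message.content)
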
2 changes: 1 addition & 1 deletion tests/test_discord_bot_llm.py
@@ -11,7 +11,7 @@

@pytest.mark.asyncio
async def test_answer_question__LLM_should_response() -> None:
model = "tinydolphin"
model = "phi"
prompt = "Respond shortly: hello!"

response = await answer_question(model, prompt, AI_SERVER_URL)
