This project is a starter Chainlit application that demonstrates a simple integration with OpenAI's API. It showcases the following key features:
- OpenAI Integration: The app is connected to OpenAI's API, allowing it to leverage state-of-the-art language models for generating responses.
- Streaming Responses: Instead of waiting for the entire response to be generated, the app streams the AI's response in real time, providing a more interactive and engaging user experience.
- Chat History: The application maintains a conversation history, enabling context-aware responses and more coherent, meaningful interactions.
- Environment Variable Management: Sensitive information such as API keys is managed securely using environment variables.
- LangSmith Integration: The app includes LangSmith for tracing and monitoring AI interactions, which is useful for debugging and optimizing your AI application.
As a convenience, when a new chat session starts, a system prompt is added as the first message in the chat history.
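The snippet below is a minimal sketch of this pattern, not the exact contents of app.py: the model name, system prompt, and session key are illustrative, and LangSmith tracing is omitted for brevity.

```python
import os

import chainlit as cl
from openai import AsyncOpenAI

client = AsyncOpenAI(api_key=os.getenv("OPENAI_API_KEY"))


@cl.on_chat_start
async def on_chat_start():
    # Seed the chat history with a system prompt for the new session.
    cl.user_session.set(
        "message_history",
        [{"role": "system", "content": "You are a helpful assistant."}],
    )


@cl.on_message
async def on_message(message: cl.Message):
    history = cl.user_session.get("message_history")
    history.append({"role": "user", "content": message.content})

    reply = cl.Message(content="")
    await reply.send()

    # Stream tokens to the UI as they arrive instead of waiting for the full response.
    stream = await client.chat.completions.create(
        model="gpt-4o-mini", messages=history, stream=True
    )
    async for chunk in stream:
        if token := chunk.choices[0].delta.content or "":
            await reply.stream_token(token)

    history.append({"role": "assistant", "content": reply.content})
    await reply.update()
```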
First, create a virtual environment to isolate the project dependencies:

```bash
python -m venv .venv
```

Then activate it:

- On Windows:

  ```bash
  .venv\Scripts\activate
  ```

- On macOS and Linux:

  ```bash
  source .venv/bin/activate
  ```
Install the project dependencies from the `requirements.txt` file:

```bash
pip install -r requirements.txt
```
- Copy the `.env.sample` file to a new file named `.env`
- Fill in the `.env` file with your API keys
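If you want to confirm the keys are picked up before launching the app, a quick check like the one below works; the variable name is an assumption, so match it to whatever `.env.sample` actually lists.

```python
import os

from dotenv import load_dotenv  # provided by the python-dotenv package

# Load key=value pairs from .env into the process environment.
load_dotenv()

# OPENAI_API_KEY is assumed here; check .env.sample for the exact names.
for key in ("OPENAI_API_KEY",):
    print(f"{key} is set:", bool(os.getenv(key)))
```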
To run the app, use the following command:

```bash
chainlit run app.py -w
```

The `-w` flag enables watch mode, so the app automatically reloads when you save changes.
If you need to update the project dependencies, follow these steps:
- Update the `requirements.in` file with the new package or version (see the example after this list).
- Install `pip-tools` if you haven't already:

  ```bash
  pip install pip-tools
  ```

- Compile the new `requirements.txt` file:

  ```bash
  pip-compile requirements.in
  ```

- Install the updated dependencies:

  ```bash
  pip install -r requirements.txt
  ```
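For reference, a `requirements.in` for a project like this one might look roughly like the following; the exact package list and any version constraints are assumptions, so keep whatever the repository's file already contains.

```
# requirements.in -- top-level dependencies only; pip-compile pins the full tree
chainlit
openai
```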
This process ensures that all dependencies are properly resolved and pinned to specific versions for reproducibility.