This is a Python implementation of aquif-2, a chatbot model based on Qwen 2.5 that runs locally.
To use the model, you need Ollama installed and running on your machine on port 12345. You can find the installer here.
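Before running the script, it can help to confirm that the Ollama server is actually reachable on that port. The snippet below is a minimal sketch that queries Ollama's standard /api/tags endpoint; port 12345 matches the setup described above, so adjust it if your Ollama instance listens elsewhere.

```python
import json
import urllib.request

# Minimal sketch: check that the local Ollama server responds on port 12345
# and list the models it has available (Ollama's /api/tags endpoint).
OLLAMA_URL = "http://localhost:12345/api/tags"

try:
    with urllib.request.urlopen(OLLAMA_URL, timeout=5) as resp:
        data = json.load(resp)
    models = [m["name"] for m in data.get("models", [])]
    print("Ollama is running. Available models:", models)
except OSError as exc:
    print("Could not reach Ollama on port 12345:", exc)
```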
Once you have Ollama running, you can run the model by executing the following command:
py main.py
This will make aquif-2 answer the question stored in the prompt variable declared on line 57 of main.py.
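For reference, the core request looks roughly like the sketch below, which sends a prompt to the local Ollama server through its /api/generate endpoint. The model name "aquif-2", the port 12345, and the placeholder prompt are assumptions here; check main.py for the exact values the project uses.

```python
import json
import urllib.request

# Minimal sketch of sending a prompt to the local Ollama server.
# The model name ("aquif-2") and port (12345) are assumptions; main.py
# defines the actual values, including the prompt variable on line 57.
prompt = "What is the capital of France?"  # placeholder prompt

payload = json.dumps({
    "model": "aquif-2",
    "prompt": prompt,
    "stream": False,  # return the full answer in a single JSON response
}).encode("utf-8")

request = urllib.request.Request(
    "http://localhost:12345/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as resp:
    answer = json.load(resp)["response"]

print(answer)
```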
Contributions are welcome! If you find any bugs or have suggestions for improvements, please open an issue or submit a pull request on the GitHub repository.
This project is licensed under the CC BY-NC 4.0 license. See the linked license text or the LICENSE file for the full terms.