Releases: acon96/home-llm
v0.3.7
v0.3.6
Small llama.cpp backend fixes
v0.3.5
Fix for llama.cpp backend installation, fix for Home LLM v1-3 API parameters, add Polish ICL examples
v0.3.4
Significantly improved language support including a full Polish translation, update bundled llama-cpp-python to support new models, various bug fixes
v0.3.3
Improvements to the Generic OpenAI backend, improved area handling, fix for an issue when using RGB colors, remove the EOS token from responses, replace the requests dependency with the aiohttp library included with Home Assistant
v0.3.2
Fix for exposed script entities causing errors, fix for the missing GBNF error, trim whitespace from model output
v0.3.1
Adds basic area support in prompting, fixes broken requirements, fixes an issue with formatted tools, fixes the custom API not registering properly on startup
v0.3
NOTE: This is a breaking change and will require you to re-configure any models you have set up.
Adds support for the Home Assistant LLM APIs, improved model prompting and tool formatting options, and automatic detection of GGUF quantization levels on HuggingFace
v0.2.17
Disable native llama.cpp wheel optimizations, add Command R prompt format
v0.2.16
Fix for missing huggingface_hub package preventing startup