Releases: acon96/home-llm

v0.3.7

15 Dec 19:43

Update the llama.cpp version to support newer models, update the minimum Home Assistant version to 2024.12.3, add German in-context learning (ICL) examples, fix multi-turn use, and fix an issue with webcolors.

v0.3.6

21 Aug 23:08
f037241

Small llama.cpp backend fixes

v0.3.5

20 Aug 23:23

Fix the llama.cpp backend installation, fix the Home LLM v1-3 API parameters, and add Polish ICL examples.

v0.3.4

12 Aug 02:00
f6cb969

Significantly improve language support, including a full Polish translation, update the bundled llama-cpp-python to support new models, and fix various bugs.

v0.3.3

15 Jun 22:41
75813de

Improve the Generic OpenAI backend and area handling, fix an issue when using RGB colors, remove the EOS token from responses, and replace the requests dependency with the aiohttp library included with Home Assistant.

v0.3.2

08 Jun 20:58
9f7aa19

Fix errors caused by exposed script entities, fix a missing-GBNF error, and trim whitespace from model output.

v0.3.1

08 Jun 17:24
f407e53

Add basic area support in prompting, fix broken requirements, fix an issue with formatted tools, and fix the custom API not registering properly on startup.

v0.3

07 Jun 04:22
71b7207

NOTE: This is a breaking change and will require you to re-configure any models you have set up.

Adds support for the Home Assistant LLM APIs, improves model prompting and tool-formatting options, and adds automatic detection of GGUF quantization levels on HuggingFace.

v0.2.17

09 May 01:21
d64f3a2

Disable native llama.cpp wheel optimizations and add the Command R prompt format.

v0.2.16

04 May 17:55
0cb361c

Fix for a missing huggingface_hub package that prevented startup.