Issues: acon96/home-llm
- #238 Home-LLM takes over other LLM options [bug] (opened Dec 17, 2024 by starman2k01)
- #232 Install contradictory: choose Assist or Home-LLM (v1-v3) [enhancement] (opened Dec 6, 2024 by starman2k01)
- #230 Replace multi-turn tool use with automatic detection [enhancement] (opened Nov 18, 2024 by Nixellion)
- #226 Error when trying to perform multiple actions [bug] (opened Oct 23, 2024 by nspitko)
- #224 Is it possible to stream LLM responses without waiting for a full response? [enhancement] (opened Oct 14, 2024 by witold-gren)
- #220 Problems with API hostname configuration leading to connection errors in OpenRouter.ai [bug] (opened Sep 29, 2024 by lukcz)
- #215 Local LLM doesn't create Conversation Agent [bug] (opened Sep 5, 2024 by MaybeFunfact)
- #212 Music Assistant support? Seems so close to working! [enhancement] (opened Aug 22, 2024 by Someguitarist)
- #211 Ollama: Unexpected error during intent recognition [bug] (opened Aug 22, 2024 by Teagan42)
- #203 How to properly run model training on one RTX 4090 graphics card? [bug] (opened Aug 13, 2024 by witold-gren)
- #202 Llama.cpp whl is not downloaded: "Pip returned an error while installing the wheel!" [bug] (opened Aug 13, 2024 by BDvirus)
- #201 Support "percentage" attribute (e.g. set_percentage for fan speed) in Home-LLM API [enhancement] (opened Aug 12, 2024 by tarocco)
- #188 Problem talking to the backend [bug] (opened Jul 23, 2024 by andreas-bulling)
- #186 Unexpected error during intent recognition [using LocalAI] [bug] (opened Jul 19, 2024 by maxi1134)
- #185 Refresh aliases without needing to restart Home Assistant (or reload LLM services) [enhancement] (opened Jul 13, 2024 by tarocco)
- #184 Support for lower-parameter models [enhancement] (opened Jul 13, 2024 by panikinator)
- #176 Failed to properly initialize llama-cpp-python (exit code 1) [bug] (opened Jun 20, 2024 by 912-Cireap-Bogdan)
- #172 Cannot download develop branch of the HACS integration [bug] (opened Jun 15, 2024 by Gaffers2277)
- #171 Unexpected error during intent recognition (not enough values to unpack) [bug] (opened Jun 14, 2024 by caphector)
- #146 Idea: integrate command confirmation (yes/no) to avoid errors from hallucinations [enhancement] (opened May 14, 2024 by codernerds)
- #140 v0.2.16: integration failed to configure [bug] (opened May 4, 2024 by pbn42)