Ubuntu 22.04. Installed LARS today following the instructions in your GitHub README, using a Conda environment.
llama-server --version version: 3912 (edc26566)
built with cc (Ubuntu 11.4.0-1ubuntu1~22.04) 11.4.0 for x86_64-linux-gnu
Added it to PATH in .bashrc.
The model is in the models directory.
If I run llama-server -m /home/asus/builds/LARS/lars_storage/models/dolphin-2.9.3-mistral-7B-32k-Q8_0.gguf -c 2048 from anywhere, the server starts and I can chat with it:
main: server is listening on 127.0.0.1:8080 - starting the main loop
srv update_slots: all slots are idle
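For reference, this is roughly how I'd confirm llama-server is reachable from the shell LARS is launched from (the port and paths are just from my setup above, and I'm assuming the venv is already activated):

```bash
# Confirm the llama-server binary is on PATH for this shell
which llama-server
llama-server --version

# Confirm the manually started server responds
# (llama.cpp's server exposes a /health endpoint, as far as I know)
curl http://127.0.0.1:8080/health
```

Both checks pass when I run the server by hand, which is why I suspect the problem is in how LARS launches it rather than in llama.cpp itself.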
Removed the LARS folder and the Conda env. Created a venv per your README and reinstalled everything. Same issue as before.
I can run llama-server standalone. I can also run Ollama, LM Studio, and Open WebUI. So I'm not sure where the disconnect is.
Your app works except that it will not start either server.
Tried various browsers; no change.
Tried from another computer on the network; no change.
Attached files:
config.json
lars_server_log.log
llama_cpp_server_output_log.txt
The hf_waitress output log is blank, and no JSON was created for it.
What else do I need to give you to help troubleshoot?