Starting the server with

```
python -m llama_cpp.server --model path.gguf
```

and sending a request to `/v1/chat/completions` returns a 500 Internal Server Error with this traceback:
```
Exception: 'coroutine' object is not callable
Traceback (most recent call last):
  File "/azureml-envs/minimal/lib/python3.11/site-packages/llama_cpp/server/errors.py", line 173, in custom_route_handler
    response = await original_route_handler(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/azureml-envs/minimal/lib/python3.11/site-packages/fastapi/routing.py", line 301, in app
    raw_response = await run_endpoint_function(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/azureml-envs/minimal/lib/python3.11/site-packages/fastapi/routing.py", line 212, in run_endpoint_function
    return await dependant.call(**values)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/azureml-envs/minimal/lib/python3.11/site-packages/llama_cpp/server/app.py", line 491, in create_chat_completion
    llama = llama_proxy(body.model)
            ^^^^^^^^^^^^^^^^^^^^^^^
TypeError: 'coroutine' object is not callable
INFO:     ::1:42174 - "POST /v1/chat/completions HTTP/1.1" 500 Internal Server Error
```
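For context, this is the usual failure mode when an accessor becomes a coroutine function but a call site is not updated: the un-awaited call returns a coroutine object, and calling that object raises exactly this TypeError. A minimal illustration (the names here are ours, not the library's):

```python
async def get_proxy():
    return lambda model: f"loaded {model}"

proxy = get_proxy()  # missing await: proxy is a coroutine object, not the result
proxy("path.gguf")   # raises TypeError: 'coroutine' object is not callable
```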
I also hit this error; it was introduced in 0.3.3. As a temporary workaround, rolling back to 0.3.2 works:

```
pip install llama-cpp-python==0.3.2
```
The fix is to add the missing `await`; see #1858.
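For anyone still pinned to an affected version, the shape of the fix is simply awaiting the async accessor before calling its result. A self-contained sketch of the pattern, using hypothetical stand-in names (`LlamaProxy`, `get_llama_proxy`) rather than the actual #1858 diff:

```python
import asyncio


class LlamaProxy:
    """Hypothetical stand-in for the server's model proxy."""

    def __call__(self, model: str) -> str:
        return f"Llama(model={model!r})"


async def get_llama_proxy() -> LlamaProxy:
    # The accessor is async, so callers must await it to get the proxy.
    return LlamaProxy()


async def create_chat_completion(model: str) -> str:
    llama_proxy = await get_llama_proxy()  # the previously missing await
    return llama_proxy(model)              # the proxy itself is callable


print(asyncio.run(create_chat_completion("path.gguf")))
```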
Fixed in version 0.3.5:

```
pip install 'llama-cpp-python>=0.3.5'
```
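To confirm which version is actually installed (useful when several environments are in play, as with the AzureML paths in the traceback above), the standard library can report it:

```python
from importlib.metadata import version

# Should print 0.3.5 or later once the fixed release is installed.
print(version("llama-cpp-python"))
```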