ggml_new_tensor_impl
The solution is simple. As per llama.cpp (https://github.com/ggerganov/llama.cpp/blob/f76cb3a34d6a6b03afb96650e39495f201eac042/llama.cpp#L933), set ctx.no_alloc to true.
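For illustration, here is a minimal sketch, assuming the standard ggml C API, of creating a context with no_alloc enabled so that ggml_new_tensor_* calls only record tensor metadata instead of reserving data buffers in the context's memory pool (the pool size below is an illustrative value, not taken from the issue):

```c
#include <stdbool.h>
#include <stddef.h>
#include "ggml.h"

// Sketch: with no_alloc = true, tensors created through this context
// carry metadata only; their data is expected to live elsewhere
// (e.g. in an mmap'd file, as llama.cpp does).
static struct ggml_context * make_no_alloc_ctx(void) {
    struct ggml_init_params params = {
        /*.mem_size   =*/ 16u * 1024 * 1024,  // illustrative size, metadata only
        /*.mem_buffer =*/ NULL,
        /*.no_alloc   =*/ true,
    };
    return ggml_init(params);
}
```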
Oops, wrong repo