Releases: fraxy-v/llama.cpp
b2873
b2872
Update llama.cpp
Co-authored-by: slaren <slarengh@gmail.com>
b2871
post review
b2870
fix: vsnprintf null-terminates its output, so the string was not being used correctly
b2869
fix compile error
b2867
llama : rename jina tokenizers to v2 (#7249)
* refactor: rename jina tokenizers to v2
* refactor: keep refactoring non-breaking
b2524
ggml : support AVX512VNNI (#6280) This change causes some quants (e.g. Q4_0, Q8_0) to go faster on some architectures (e.g. AMD Zen 4).
b2505
Merge branch 'master' into master