fix hooks score computation
old results were just the per-iteration mean, so they need to be multiplied by the iteration count to give the correct totals
slobentanzer committed Jul 25, 2024
1 parent 07bff66 commit 9a7d2b5
Showing 32 changed files with 592 additions and 582 deletions.
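As the commit message notes, the earlier `Score achieved` values were per-iteration means; the corrected files store the total, i.e. the mean multiplied by the number of iterations. A minimal sketch of that recomputation (function names here are illustrative, not the repository's actual hook code):

```python
def corrected_score(mean_score: float, iterations: int) -> float:
    """Recover the total score from a stored per-iteration mean."""
    return mean_score * iterations


def accuracy(score_achieved: float, score_possible: float) -> float:
    """Fraction of the possible score that was actually achieved."""
    return score_achieved / score_possible


# Row from end_to_end_query_generation.csv (gpt-3.5-turbo-0125):
# the old mean of 27.8 over 5 iterations becomes a total of 139.0.
total = corrected_score(27.8, 5)
print(total)                   # 139.0
print(accuracy(total, 150.0))  # 0.9266666666666666
```

This matches the diffs below: for example, gpt-3.5-turbo-0125's end-to-end row changes from 27.8 to 27.8 × 5 = 139.0 out of 150.0, consistent with the accuracy column of 0.9266666666666666.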
8 changes: 4 additions & 4 deletions benchmark/results/processed/correlations.txt
@@ -1,4 +1,4 @@
-Size vs accuracy Pearson correlation: 0.22253111740340267
-Size vs accuracy Pearson correlation p-value: 1.0851737884181684e-08
-Quantisation vs accuracy Pearson correlation: 0.24295283470580717
-Quantisation vs accuracy Pearson correlation p-value: 3.923401309641152e-10
+Size vs accuracy Pearson correlation: 0.17210401537258263
+Size vs accuracy Pearson correlation p-value: 1.0891261530140134e-05
+Quantisation vs accuracy Pearson correlation: 0.20302475924649943
+Quantisation vs accuracy Pearson correlation p-value: 1.9453161287669737e-07
6 changes: 3 additions & 3 deletions benchmark/results/processed/end_to_end_query_generation.csv
@@ -1,7 +1,7 @@
Full model name,Score achieved,Score possible,Score SD,Accuracy,Iterations
-gpt-3.5-turbo-0125,27.8,150.0,0,0.18533333333333335,5
-gpt-4-0613,26.4,150.0,0,0.176,5
-gpt-3.5-turbo-0613,25.0,150.0,0,0.16666666666666666,5
+gpt-3.5-turbo-0125,139.0,150.0,0,0.9266666666666666,5
+gpt-4-0613,132.0,150.0,0,0.88,5
+gpt-3.5-turbo-0613,125.0,150.0,0,0.8333333333333334,5
chatglm3:6:ggmlv3:q4_0,0.0,150.0,0,0.0,5
llama-2-chat:70:ggufv2:Q5_K_M,0.0,150.0,0,0.0,5
llama-2-chat:7:ggufv2:Q3_K_M,0.0,150.0,0,0.0,5
86 changes: 43 additions & 43 deletions benchmark/results/processed/entity_selection.csv
@@ -1,47 +1,47 @@
Full model name,Score achieved,Score possible,Score SD,Accuracy,Iterations
-gpt-3.5-turbo-0125,8.0,40.0,0,0.2,5
-openhermes-2.5:7:ggufv2:Q6_K,8.0,40.0,0,0.2,5
-openhermes-2.5:7:ggufv2:Q3_K_M,9.0,45.0,0,0.2,5
-gpt-4o-2024-05-13,8.0,40.0,0,0.2,5
-llama-3-instruct:8:ggufv2:Q4_K_M,7.0,36.0,0,0.19444444444444445,5
-openhermes-2.5:7:ggufv2:Q8_0,8.0,45.0,0,0.17777777777777778,5
-openhermes-2.5:7:ggufv2:Q5_K_M,8.0,45.0,0,0.17777777777777778,5
-openhermes-2.5:7:ggufv2:Q4_K_M,8.0,45.0,0,0.17777777777777778,5
-gpt-4-0613,8.0,45.0,0,0.17777777777777778,5
-gpt-3.5-turbo-0613,8.0,45.0,0,0.17777777777777778,5
-llama-3-instruct:8:ggufv2:Q8_0,7.0,40.0,0,0.175,5
-llama-3-instruct:8:ggufv2:Q6_K,7.0,40.0,0,0.175,5
-llama-3-instruct:8:ggufv2:Q5_K_M,7.0,40.0,0,0.175,5
-gpt-4-0125-preview,7.0,45.0,0,0.15555555555555556,5
-chatglm3:6:ggmlv3:q4_0,6.0,40.0,0,0.15,5
-openhermes-2.5:7:ggufv2:Q2_K,5.0,45.0,0,0.1111111111111111,5
-code-llama-instruct:7:ggufv2:Q3_K_M,4.0,40.0,0,0.1,5
-mistral-instruct-v0.2:7:ggufv2:Q6_K,4.0,40.0,0,0.1,5
-mixtral-instruct-v0.1:46_7:ggufv2:Q6_K,3.8,40.0,0,0.095,5
-code-llama-instruct:13:ggufv2:Q3_K_M,3.6,40.0,0,0.09,5
-llama-2-chat:70:ggufv2:Q5_K_M,4.0,45.0,0,0.08888888888888889,5
-llama-2-chat:7:ggufv2:Q8_0,4.0,45.0,0,0.08888888888888889,5
-llama-2-chat:70:ggufv2:Q4_K_M,4.0,45.0,0,0.08888888888888889,5
-llama-2-chat:7:ggufv2:Q4_K_M,4.0,45.0,0,0.08888888888888889,5
-llama-2-chat:7:ggufv2:Q5_K_M,4.0,45.0,0,0.08888888888888889,5
-mistral-instruct-v0.2:7:ggufv2:Q5_K_M,4.0,45.0,0,0.08888888888888889,5
-mixtral-instruct-v0.1:46_7:ggufv2:Q5_K_M,3.8,45.0,0,0.08444444444444445,5
-llama-2-chat:7:ggufv2:Q6_K,3.0,40.0,0,0.075,5
-mistral-instruct-v0.2:7:ggufv2:Q3_K_M,3.0,45.0,0,0.06666666666666667,5
-mistral-instruct-v0.2:7:ggufv2:Q4_K_M,3.0,45.0,0,0.06666666666666667,5
-llama-2-chat:7:ggufv2:Q3_K_M,3.0,45.0,0,0.06666666666666667,5
-mixtral-instruct-v0.1:46_7:ggufv2:Q3_K_M,3.0,45.0,0,0.06666666666666667,5
-mixtral-instruct-v0.1:46_7:ggufv2:Q4_K_M,3.0,45.0,0,0.06666666666666667,5
-mistral-instruct-v0.2:7:ggufv2:Q8_0,3.0,45.0,0,0.06666666666666667,5
-code-llama-instruct:7:ggufv2:Q4_K_M,3.0,45.0,0,0.06666666666666667,5
-llama-2-chat:70:ggufv2:Q3_K_M,3.0,45.0,0,0.06666666666666667,5
-mixtral-instruct-v0.1:46_7:ggufv2:Q8_0,2.8,45.0,0,0.06222222222222222,5
-code-llama-instruct:7:ggufv2:Q2_K,2.0,40.0,0,0.05,5
-code-llama-instruct:34:ggufv2:Q8_0,2.0,40.0,0,0.05,5
-mistral-instruct-v0.2:7:ggufv2:Q2_K,2.0,45.0,0,0.044444444444444446,5
-code-llama-instruct:34:ggufv2:Q6_K,1.0,40.0,0,0.025,5
-code-llama-instruct:34:ggufv2:Q5_K_M,1.0,40.0,0,0.025,5
-code-llama-instruct:7:ggufv2:Q5_K_M,1.0,45.0,0,0.022222222222222223,5
+gpt-3.5-turbo-0125,40.0,40.0,0,1.0,5
+openhermes-2.5:7:ggufv2:Q6_K,40.0,40.0,0,1.0,5
+openhermes-2.5:7:ggufv2:Q3_K_M,45.0,45.0,0,1.0,5
+gpt-4o-2024-05-13,40.0,40.0,0,1.0,5
+openhermes-2.5:7:ggufv2:Q8_0,40.0,45.0,0,0.8888888888888888,5
+openhermes-2.5:7:ggufv2:Q5_K_M,40.0,45.0,0,0.8888888888888888,5
+openhermes-2.5:7:ggufv2:Q4_K_M,40.0,45.0,0,0.8888888888888888,5
+gpt-4-0613,40.0,45.0,0,0.8888888888888888,5
+gpt-3.5-turbo-0613,40.0,45.0,0,0.8888888888888888,5
+llama-3-instruct:8:ggufv2:Q8_0,35.0,40.0,0,0.875,5
+llama-3-instruct:8:ggufv2:Q6_K,35.0,40.0,0,0.875,5
+llama-3-instruct:8:ggufv2:Q5_K_M,35.0,40.0,0,0.875,5
+llama-3-instruct:8:ggufv2:Q4_K_M,31.0,36.0,0,0.8611111111111112,5
+gpt-4-0125-preview,35.0,45.0,0,0.7777777777777778,5
+chatglm3:6:ggmlv3:q4_0,30.0,40.0,0,0.75,5
+openhermes-2.5:7:ggufv2:Q2_K,25.0,45.0,0,0.5555555555555556,5
+code-llama-instruct:7:ggufv2:Q3_K_M,20.0,40.0,0,0.5,5
+mistral-instruct-v0.2:7:ggufv2:Q6_K,20.0,40.0,0,0.5,5
+mixtral-instruct-v0.1:46_7:ggufv2:Q6_K,19.0,40.0,0,0.475,5
+code-llama-instruct:13:ggufv2:Q3_K_M,18.0,40.0,0,0.45,5
+llama-2-chat:70:ggufv2:Q5_K_M,20.0,45.0,0,0.4444444444444444,5
+llama-2-chat:7:ggufv2:Q8_0,20.0,45.0,0,0.4444444444444444,5
+llama-2-chat:70:ggufv2:Q4_K_M,20.0,45.0,0,0.4444444444444444,5
+llama-2-chat:7:ggufv2:Q4_K_M,20.0,45.0,0,0.4444444444444444,5
+llama-2-chat:7:ggufv2:Q5_K_M,20.0,45.0,0,0.4444444444444444,5
+mistral-instruct-v0.2:7:ggufv2:Q5_K_M,20.0,45.0,0,0.4444444444444444,5
+mixtral-instruct-v0.1:46_7:ggufv2:Q5_K_M,19.0,45.0,0,0.4222222222222222,5
+llama-2-chat:7:ggufv2:Q6_K,15.0,40.0,0,0.375,5
+mistral-instruct-v0.2:7:ggufv2:Q3_K_M,15.0,45.0,0,0.3333333333333333,5
+mistral-instruct-v0.2:7:ggufv2:Q4_K_M,15.0,45.0,0,0.3333333333333333,5
+llama-2-chat:7:ggufv2:Q3_K_M,15.0,45.0,0,0.3333333333333333,5
+mixtral-instruct-v0.1:46_7:ggufv2:Q3_K_M,15.0,45.0,0,0.3333333333333333,5
+mixtral-instruct-v0.1:46_7:ggufv2:Q4_K_M,15.0,45.0,0,0.3333333333333333,5
+mistral-instruct-v0.2:7:ggufv2:Q8_0,15.0,45.0,0,0.3333333333333333,5
+code-llama-instruct:7:ggufv2:Q4_K_M,15.0,45.0,0,0.3333333333333333,5
+llama-2-chat:70:ggufv2:Q3_K_M,15.0,45.0,0,0.3333333333333333,5
+mixtral-instruct-v0.1:46_7:ggufv2:Q8_0,14.0,45.0,0,0.3111111111111111,5
+code-llama-instruct:7:ggufv2:Q2_K,10.0,40.0,0,0.25,5
+code-llama-instruct:34:ggufv2:Q8_0,10.0,40.0,0,0.25,5
+mistral-instruct-v0.2:7:ggufv2:Q2_K,10.0,45.0,0,0.2222222222222222,5
+code-llama-instruct:34:ggufv2:Q6_K,5.0,40.0,0,0.125,5
+code-llama-instruct:34:ggufv2:Q5_K_M,5.0,40.0,0,0.125,5
+code-llama-instruct:7:ggufv2:Q5_K_M,5.0,45.0,0,0.1111111111111111,5
mixtral-instruct-v0.1:46_7:ggufv2:Q2_K,0.0,45.0,0,0.0,5
code-llama-instruct:7:ggufv2:Q6_K,0.0,40.0,0,0.0,5
code-llama-instruct:13:ggufv2:Q4_K_M,0.0,40.0,0,0.0,5
116 changes: 58 additions & 58 deletions benchmark/results/processed/explicit_relevance_of_single_fragments.csv
@@ -1,62 +1,62 @@
Full model name,Score achieved,Score possible,Score SD,Accuracy,Iterations
-llama-2-chat:70:ggufv2:Q3_K_M,6.0,30.0,0,0.2,5
-llama-3-instruct:8:ggufv2:Q6_K,6.0,30.0,0,0.2,5
-llama-2-chat:13:ggufv2:Q8_0,6.0,30.0,0,0.2,5
-llama-2-chat:70:ggufv2:Q2_K,6.0,30.0,0,0.2,5
-llama-2-chat:70:ggufv2:Q4_K_M,6.0,30.0,0,0.2,5
-llama-2-chat:70:ggufv2:Q5_K_M,6.0,30.0,0,0.2,5
-llama-2-chat:7:ggufv2:Q3_K_M,6.0,30.0,0,0.2,5
-llama-2-chat:7:ggufv2:Q4_K_M,6.0,30.0,0,0.2,5
-llama-2-chat:7:ggufv2:Q5_K_M,6.0,30.0,0,0.2,5
-llama-2-chat:7:ggufv2:Q6_K,6.0,30.0,0,0.2,5
-llama-2-chat:7:ggufv2:Q8_0,6.0,30.0,0,0.2,5
-llama-3-instruct:8:ggufv2:Q4_K_M,6.0,30.0,0,0.2,5
-llama-3-instruct:8:ggufv2:Q5_K_M,6.0,30.0,0,0.2,5
-llama-3-instruct:8:ggufv2:Q8_0,6.0,30.0,0,0.2,5
-llama-2-chat:13:ggufv2:Q5_K_M,6.0,30.0,0,0.2,5
-mistral-instruct-v0.2:7:ggufv2:Q2_K,6.0,30.0,0,0.2,5
-mistral-instruct-v0.2:7:ggufv2:Q3_K_M,6.0,30.0,0,0.2,5
-mistral-instruct-v0.2:7:ggufv2:Q4_K_M,6.0,30.0,0,0.2,5
-mistral-instruct-v0.2:7:ggufv2:Q5_K_M,6.0,30.0,0,0.2,5
-mistral-instruct-v0.2:7:ggufv2:Q6_K,6.0,30.0,0,0.2,5
-mistral-instruct-v0.2:7:ggufv2:Q8_0,6.0,30.0,0,0.2,5
-openhermes-2.5:7:ggufv2:Q2_K,6.0,30.0,0,0.2,5
-openhermes-2.5:7:ggufv2:Q3_K_M,6.0,30.0,0,0.2,5
-openhermes-2.5:7:ggufv2:Q4_K_M,6.0,30.0,0,0.2,5
-openhermes-2.5:7:ggufv2:Q5_K_M,6.0,30.0,0,0.2,5
-openhermes-2.5:7:ggufv2:Q6_K,6.0,30.0,0,0.2,5
-llama-2-chat:13:ggufv2:Q6_K,6.0,30.0,0,0.2,5
-openhermes-2.5:7:ggufv2:Q8_0,6.0,30.0,0,0.2,5
-llama-2-chat:13:ggufv2:Q4_K_M,6.0,30.0,0,0.2,5
-llama-2-chat:13:ggufv2:Q3_K_M,6.0,30.0,0,0.2,5
-llama-2-chat:13:ggufv2:Q2_K,6.0,30.0,0,0.2,5
-gpt-4o-2024-05-13,6.0,30.0,0,0.2,5
-gpt-4-0613,6.0,30.0,0,0.2,5
-gpt-4-0125-preview,6.0,30.0,0,0.2,5
-gpt-3.5-turbo-0613,6.0,30.0,0,0.2,5
-gpt-3.5-turbo-0125,6.0,30.0,0,0.2,5
-code-llama-instruct:7:ggufv2:Q8_0,6.0,30.0,0,0.2,5
-code-llama-instruct:7:ggufv2:Q4_K_M,6.0,30.0,0,0.2,5
-code-llama-instruct:13:ggufv2:Q6_K,5.0,30.0,0,0.16666666666666666,5
-llama-2-chat:7:ggufv2:Q2_K,5.0,30.0,0,0.16666666666666666,5
-code-llama-instruct:7:ggufv2:Q6_K,5.0,30.0,0,0.16666666666666666,5
-code-llama-instruct:7:ggufv2:Q5_K_M,5.0,30.0,0,0.16666666666666666,5
-code-llama-instruct:7:ggufv2:Q3_K_M,5.0,30.0,0,0.16666666666666666,5
-code-llama-instruct:13:ggufv2:Q8_0,5.0,30.0,0,0.16666666666666666,5
-chatglm3:6:ggmlv3:q4_0,4.4,30.0,0,0.14666666666666667,5
-code-llama-instruct:13:ggufv2:Q5_K_M,4.0,30.0,0,0.13333333333333333,5
-code-llama-instruct:34:ggufv2:Q4_K_M,3.0,30.0,0,0.1,5
-code-llama-instruct:34:ggufv2:Q3_K_M,3.0,30.0,0,0.1,5
-code-llama-instruct:34:ggufv2:Q2_K,3.0,30.0,0,0.1,5
-code-llama-instruct:34:ggufv2:Q8_0,2.0,30.0,0,0.06666666666666667,5
-code-llama-instruct:34:ggufv2:Q5_K_M,2.0,30.0,0,0.06666666666666667,5
-code-llama-instruct:13:ggufv2:Q4_K_M,2.0,30.0,0,0.06666666666666667,5
-code-llama-instruct:7:ggufv2:Q2_K,2.0,30.0,0,0.06666666666666667,5
-code-llama-instruct:34:ggufv2:Q6_K,2.0,30.0,0,0.06666666666666667,5
-mixtral-instruct-v0.1:46_7:ggufv2:Q2_K,2.0,30.0,0,0.06666666666666667,5
-mixtral-instruct-v0.1:46_7:ggufv2:Q4_K_M,1.0,30.0,0,0.03333333333333333,5
-mixtral-instruct-v0.1:46_7:ggufv2:Q8_0,0.8,30.0,0,0.02666666666666667,5
-code-llama-instruct:13:ggufv2:Q2_K,0.2,30.0,0,0.006666666666666667,5
+llama-2-chat:70:ggufv2:Q3_K_M,30.0,30.0,0,1.0,5
+llama-3-instruct:8:ggufv2:Q6_K,30.0,30.0,0,1.0,5
+llama-2-chat:13:ggufv2:Q8_0,30.0,30.0,0,1.0,5
+llama-2-chat:70:ggufv2:Q2_K,30.0,30.0,0,1.0,5
+llama-2-chat:70:ggufv2:Q4_K_M,30.0,30.0,0,1.0,5
+llama-2-chat:70:ggufv2:Q5_K_M,30.0,30.0,0,1.0,5
+llama-2-chat:7:ggufv2:Q3_K_M,30.0,30.0,0,1.0,5
+llama-2-chat:7:ggufv2:Q4_K_M,30.0,30.0,0,1.0,5
+llama-2-chat:7:ggufv2:Q5_K_M,30.0,30.0,0,1.0,5
+llama-2-chat:7:ggufv2:Q6_K,30.0,30.0,0,1.0,5
+llama-2-chat:7:ggufv2:Q8_0,30.0,30.0,0,1.0,5
+llama-3-instruct:8:ggufv2:Q4_K_M,30.0,30.0,0,1.0,5
+llama-3-instruct:8:ggufv2:Q5_K_M,30.0,30.0,0,1.0,5
+llama-3-instruct:8:ggufv2:Q8_0,30.0,30.0,0,1.0,5
+llama-2-chat:13:ggufv2:Q5_K_M,30.0,30.0,0,1.0,5
+mistral-instruct-v0.2:7:ggufv2:Q2_K,30.0,30.0,0,1.0,5
+mistral-instruct-v0.2:7:ggufv2:Q3_K_M,30.0,30.0,0,1.0,5
+mistral-instruct-v0.2:7:ggufv2:Q4_K_M,30.0,30.0,0,1.0,5
+mistral-instruct-v0.2:7:ggufv2:Q5_K_M,30.0,30.0,0,1.0,5
+mistral-instruct-v0.2:7:ggufv2:Q6_K,30.0,30.0,0,1.0,5
+mistral-instruct-v0.2:7:ggufv2:Q8_0,30.0,30.0,0,1.0,5
+openhermes-2.5:7:ggufv2:Q2_K,30.0,30.0,0,1.0,5
+openhermes-2.5:7:ggufv2:Q3_K_M,30.0,30.0,0,1.0,5
+openhermes-2.5:7:ggufv2:Q4_K_M,30.0,30.0,0,1.0,5
+openhermes-2.5:7:ggufv2:Q5_K_M,30.0,30.0,0,1.0,5
+openhermes-2.5:7:ggufv2:Q6_K,30.0,30.0,0,1.0,5
+llama-2-chat:13:ggufv2:Q6_K,30.0,30.0,0,1.0,5
+openhermes-2.5:7:ggufv2:Q8_0,30.0,30.0,0,1.0,5
+llama-2-chat:13:ggufv2:Q4_K_M,30.0,30.0,0,1.0,5
+llama-2-chat:13:ggufv2:Q3_K_M,30.0,30.0,0,1.0,5
+llama-2-chat:13:ggufv2:Q2_K,30.0,30.0,0,1.0,5
+gpt-4o-2024-05-13,30.0,30.0,0,1.0,5
+gpt-4-0613,30.0,30.0,0,1.0,5
+gpt-4-0125-preview,30.0,30.0,0,1.0,5
+gpt-3.5-turbo-0613,30.0,30.0,0,1.0,5
+gpt-3.5-turbo-0125,30.0,30.0,0,1.0,5
+code-llama-instruct:7:ggufv2:Q8_0,30.0,30.0,0,1.0,5
+code-llama-instruct:7:ggufv2:Q4_K_M,30.0,30.0,0,1.0,5
+code-llama-instruct:13:ggufv2:Q6_K,25.0,30.0,0,0.8333333333333334,5
+llama-2-chat:7:ggufv2:Q2_K,25.0,30.0,0,0.8333333333333334,5
+code-llama-instruct:7:ggufv2:Q6_K,25.0,30.0,0,0.8333333333333334,5
+code-llama-instruct:7:ggufv2:Q5_K_M,25.0,30.0,0,0.8333333333333334,5
+code-llama-instruct:7:ggufv2:Q3_K_M,25.0,30.0,0,0.8333333333333334,5
+code-llama-instruct:13:ggufv2:Q8_0,25.0,30.0,0,0.8333333333333334,5
+chatglm3:6:ggmlv3:q4_0,22.0,30.0,0,0.7333333333333333,5
+code-llama-instruct:13:ggufv2:Q5_K_M,20.0,30.0,0,0.6666666666666666,5
+code-llama-instruct:34:ggufv2:Q4_K_M,15.0,30.0,0,0.5,5
+code-llama-instruct:34:ggufv2:Q3_K_M,15.0,30.0,0,0.5,5
+code-llama-instruct:34:ggufv2:Q2_K,15.0,30.0,0,0.5,5
+code-llama-instruct:34:ggufv2:Q8_0,10.0,30.0,0,0.3333333333333333,5
+code-llama-instruct:34:ggufv2:Q5_K_M,10.0,30.0,0,0.3333333333333333,5
+code-llama-instruct:13:ggufv2:Q4_K_M,10.0,30.0,0,0.3333333333333333,5
+code-llama-instruct:7:ggufv2:Q2_K,10.0,30.0,0,0.3333333333333333,5
+code-llama-instruct:34:ggufv2:Q6_K,10.0,30.0,0,0.3333333333333333,5
+mixtral-instruct-v0.1:46_7:ggufv2:Q2_K,10.0,30.0,0,0.3333333333333333,5
+mixtral-instruct-v0.1:46_7:ggufv2:Q4_K_M,5.0,30.0,0,0.16666666666666666,5
+mixtral-instruct-v0.1:46_7:ggufv2:Q8_0,4.0,30.0,0,0.13333333333333333,5
+code-llama-instruct:13:ggufv2:Q2_K,1.0,30.0,0,0.03333333333333333,5
mixtral-instruct-v0.1:46_7:ggufv2:Q6_K,0.0,30.0,0,0.0,5
mixtral-instruct-v0.1:46_7:ggufv2:Q3_K_M,0.0,30.0,0,0.0,5
code-llama-instruct:13:ggufv2:Q3_K_M,0.0,30.0,0,0.0,5
@@ -1,63 +1,63 @@
Full model name,Score achieved,Score possible,Score SD,Accuracy,Iterations
-chatglm3:6:ggmlv3:q4_0,2.0,10.0,0,0.2,5
-mistral-instruct-v0.2:7:ggufv2:Q3_K_M,2.0,10.0,0,0.2,5
-llama-2-chat:70:ggufv2:Q4_K_M,2.0,10.0,0,0.2,5
-llama-2-chat:7:ggufv2:Q2_K,2.0,10.0,0,0.2,5
-llama-2-chat:7:ggufv2:Q3_K_M,2.0,10.0,0,0.2,5
-llama-3-instruct:8:ggufv2:Q4_K_M,2.0,10.0,0,0.2,5
-llama-3-instruct:8:ggufv2:Q5_K_M,2.0,10.0,0,0.2,5
-llama-3-instruct:8:ggufv2:Q6_K,2.0,10.0,0,0.2,5
-llama-3-instruct:8:ggufv2:Q8_0,2.0,10.0,0,0.2,5
-mistral-instruct-v0.2:7:ggufv2:Q4_K_M,2.0,10.0,0,0.2,5
-gpt-3.5-turbo-0613,2.0,10.0,0,0.2,5
-mistral-instruct-v0.2:7:ggufv2:Q5_K_M,2.0,10.0,0,0.2,5
-mistral-instruct-v0.2:7:ggufv2:Q6_K,2.0,10.0,0,0.2,5
-mixtral-instruct-v0.1:46_7:ggufv2:Q4_K_M,2.0,10.0,0,0.2,5
-mixtral-instruct-v0.1:46_7:ggufv2:Q5_K_M,2.0,10.0,0,0.2,5
-openhermes-2.5:7:ggufv2:Q4_K_M,2.0,10.0,0,0.2,5
-openhermes-2.5:7:ggufv2:Q5_K_M,2.0,10.0,0,0.2,5
-openhermes-2.5:7:ggufv2:Q6_K,2.0,10.0,0,0.2,5
-gpt-4-0613,2.0,10.0,0,0.2,5
-openhermes-2.5:7:ggufv2:Q8_0,2.0,10.0,0,0.2,5
-code-llama-instruct:34:ggufv2:Q2_K,2.0,10.0,0,0.2,5
-code-llama-instruct:34:ggufv2:Q5_K_M,2.0,10.0,0,0.2,5
-code-llama-instruct:7:ggufv2:Q4_K_M,2.0,10.0,0,0.2,5
-gpt-3.5-turbo-0125,1.8,10.0,0,0.18,5
-code-llama-instruct:7:ggufv2:Q6_K,1.8,10.0,0,0.18,5
-mistral-instruct-v0.2:7:ggufv2:Q8_0,1.8,10.0,0,0.18,5
-code-llama-instruct:34:ggufv2:Q6_K,1.8,10.0,0,0.18,5
-code-llama-instruct:34:ggufv2:Q8_0,1.8,10.0,0,0.18,5
-llama-2-chat:70:ggufv2:Q5_K_M,1.8,10.0,0,0.18,5
-mixtral-instruct-v0.1:46_7:ggufv2:Q6_K,1.4,10.0,0,0.13999999999999999,5
-code-llama-instruct:7:ggufv2:Q3_K_M,1.4,10.0,0,0.13999999999999999,5
-code-llama-instruct:7:ggufv2:Q2_K,1.4,10.0,0,0.13999999999999999,5
-gpt-4o-2024-05-13,1.4,10.0,0,0.13999999999999999,5
-llama-2-chat:7:ggufv2:Q5_K_M,1.2,10.0,0,0.12,5
-mixtral-instruct-v0.1:46_7:ggufv2:Q8_0,1.2,10.0,0,0.12,5
-mixtral-instruct-v0.1:46_7:ggufv2:Q2_K,1.2,10.0,0,0.12,5
-code-llama-instruct:34:ggufv2:Q3_K_M,1.0,10.0,0,0.1,5
-llama-2-chat:70:ggufv2:Q2_K,1.0,10.0,0,0.1,5
-code-llama-instruct:13:ggufv2:Q4_K_M,1.0,10.0,0,0.1,5
-code-llama-instruct:13:ggufv2:Q5_K_M,1.0,10.0,0,0.1,5
-openhermes-2.5:7:ggufv2:Q3_K_M,1.0,10.0,0,0.1,5
-openhermes-2.5:7:ggufv2:Q2_K,1.0,10.0,0,0.1,5
-code-llama-instruct:7:ggufv2:Q8_0,1.0,10.0,0,0.1,5
-code-llama-instruct:13:ggufv2:Q6_K,1.0,10.0,0,0.1,5
-code-llama-instruct:13:ggufv2:Q8_0,1.0,10.0,0,0.1,5
-mixtral-instruct-v0.1:46_7:ggufv2:Q3_K_M,1.0,10.0,0,0.1,5
-llama-2-chat:13:ggufv2:Q2_K,1.0,10.0,0,0.1,5
-llama-2-chat:70:ggufv2:Q3_K_M,1.0,10.0,0,0.1,5
-llama-2-chat:13:ggufv2:Q3_K_M,1.0,10.0,0,0.1,5
-mistral-instruct-v0.2:7:ggufv2:Q2_K,1.0,10.0,0,0.1,5
-llama-2-chat:13:ggufv2:Q4_K_M,1.0,10.0,0,0.1,5
-llama-2-chat:13:ggufv2:Q5_K_M,1.0,10.0,0,0.1,5
-gpt-4-0125-preview,1.0,10.0,0,0.1,5
-llama-2-chat:13:ggufv2:Q6_K,1.0,10.0,0,0.1,5
-llama-2-chat:7:ggufv2:Q8_0,1.0,10.0,0,0.1,5
-llama-2-chat:7:ggufv2:Q6_K,1.0,10.0,0,0.1,5
-llama-2-chat:7:ggufv2:Q4_K_M,1.0,10.0,0,0.1,5
-llama-2-chat:13:ggufv2:Q8_0,1.0,10.0,0,0.1,5
-code-llama-instruct:7:ggufv2:Q5_K_M,1.0,10.0,0,0.1,5
-code-llama-instruct:34:ggufv2:Q4_K_M,0.8,10.0,0,0.08,5
-code-llama-instruct:13:ggufv2:Q2_K,0.8,10.0,0,0.08,5
+chatglm3:6:ggmlv3:q4_0,10.0,10.0,0,1.0,5
+mistral-instruct-v0.2:7:ggufv2:Q3_K_M,10.0,10.0,0,1.0,5
+llama-2-chat:70:ggufv2:Q4_K_M,10.0,10.0,0,1.0,5
+llama-2-chat:7:ggufv2:Q2_K,10.0,10.0,0,1.0,5
+llama-2-chat:7:ggufv2:Q3_K_M,10.0,10.0,0,1.0,5
+llama-3-instruct:8:ggufv2:Q4_K_M,10.0,10.0,0,1.0,5
+llama-3-instruct:8:ggufv2:Q5_K_M,10.0,10.0,0,1.0,5
+llama-3-instruct:8:ggufv2:Q6_K,10.0,10.0,0,1.0,5
+llama-3-instruct:8:ggufv2:Q8_0,10.0,10.0,0,1.0,5
+mistral-instruct-v0.2:7:ggufv2:Q4_K_M,10.0,10.0,0,1.0,5
+gpt-3.5-turbo-0613,10.0,10.0,0,1.0,5
+mistral-instruct-v0.2:7:ggufv2:Q5_K_M,10.0,10.0,0,1.0,5
+mistral-instruct-v0.2:7:ggufv2:Q6_K,10.0,10.0,0,1.0,5
+mixtral-instruct-v0.1:46_7:ggufv2:Q4_K_M,10.0,10.0,0,1.0,5
+mixtral-instruct-v0.1:46_7:ggufv2:Q5_K_M,10.0,10.0,0,1.0,5
+openhermes-2.5:7:ggufv2:Q4_K_M,10.0,10.0,0,1.0,5
+openhermes-2.5:7:ggufv2:Q5_K_M,10.0,10.0,0,1.0,5
+openhermes-2.5:7:ggufv2:Q6_K,10.0,10.0,0,1.0,5
+gpt-4-0613,10.0,10.0,0,1.0,5
+openhermes-2.5:7:ggufv2:Q8_0,10.0,10.0,0,1.0,5
+code-llama-instruct:34:ggufv2:Q2_K,10.0,10.0,0,1.0,5
+code-llama-instruct:34:ggufv2:Q5_K_M,10.0,10.0,0,1.0,5
+code-llama-instruct:7:ggufv2:Q4_K_M,10.0,10.0,0,1.0,5
+gpt-3.5-turbo-0125,9.0,10.0,0,0.9,5
+code-llama-instruct:7:ggufv2:Q6_K,9.0,10.0,0,0.9,5
+mistral-instruct-v0.2:7:ggufv2:Q8_0,9.0,10.0,0,0.9,5
+code-llama-instruct:34:ggufv2:Q6_K,9.0,10.0,0,0.9,5
+code-llama-instruct:34:ggufv2:Q8_0,9.0,10.0,0,0.9,5
+llama-2-chat:70:ggufv2:Q5_K_M,9.0,10.0,0,0.9,5
+mixtral-instruct-v0.1:46_7:ggufv2:Q6_K,7.0,10.0,0,0.7,5
+code-llama-instruct:7:ggufv2:Q3_K_M,7.0,10.0,0,0.7,5
+code-llama-instruct:7:ggufv2:Q2_K,7.0,10.0,0,0.7,5
+gpt-4o-2024-05-13,7.0,10.0,0,0.7,5
+llama-2-chat:7:ggufv2:Q5_K_M,6.0,10.0,0,0.6,5
+mixtral-instruct-v0.1:46_7:ggufv2:Q8_0,6.0,10.0,0,0.6,5
+mixtral-instruct-v0.1:46_7:ggufv2:Q2_K,6.0,10.0,0,0.6,5
+code-llama-instruct:34:ggufv2:Q3_K_M,5.0,10.0,0,0.5,5
+llama-2-chat:70:ggufv2:Q2_K,5.0,10.0,0,0.5,5
+code-llama-instruct:13:ggufv2:Q4_K_M,5.0,10.0,0,0.5,5
+code-llama-instruct:13:ggufv2:Q5_K_M,5.0,10.0,0,0.5,5
+openhermes-2.5:7:ggufv2:Q3_K_M,5.0,10.0,0,0.5,5
+openhermes-2.5:7:ggufv2:Q2_K,5.0,10.0,0,0.5,5
+code-llama-instruct:7:ggufv2:Q8_0,5.0,10.0,0,0.5,5
+code-llama-instruct:13:ggufv2:Q6_K,5.0,10.0,0,0.5,5
+code-llama-instruct:13:ggufv2:Q8_0,5.0,10.0,0,0.5,5
+mixtral-instruct-v0.1:46_7:ggufv2:Q3_K_M,5.0,10.0,0,0.5,5
+llama-2-chat:13:ggufv2:Q2_K,5.0,10.0,0,0.5,5
+llama-2-chat:70:ggufv2:Q3_K_M,5.0,10.0,0,0.5,5
+llama-2-chat:13:ggufv2:Q3_K_M,5.0,10.0,0,0.5,5
+mistral-instruct-v0.2:7:ggufv2:Q2_K,5.0,10.0,0,0.5,5
+llama-2-chat:13:ggufv2:Q4_K_M,5.0,10.0,0,0.5,5
+llama-2-chat:13:ggufv2:Q5_K_M,5.0,10.0,0,0.5,5
+gpt-4-0125-preview,5.0,10.0,0,0.5,5
+llama-2-chat:13:ggufv2:Q6_K,5.0,10.0,0,0.5,5
+llama-2-chat:7:ggufv2:Q8_0,5.0,10.0,0,0.5,5
+llama-2-chat:7:ggufv2:Q6_K,5.0,10.0,0,0.5,5
+llama-2-chat:7:ggufv2:Q4_K_M,5.0,10.0,0,0.5,5
+llama-2-chat:13:ggufv2:Q8_0,5.0,10.0,0,0.5,5
+code-llama-instruct:7:ggufv2:Q5_K_M,5.0,10.0,0,0.5,5
+code-llama-instruct:34:ggufv2:Q4_K_M,4.0,10.0,0,0.4,5
+code-llama-instruct:13:ggufv2:Q2_K,4.0,10.0,0,0.4,5
code-llama-instruct:13:ggufv2:Q3_K_M,0.0,10.0,0,0.0,5