Layer skipping/self-speculation demo #3565

Draft
wants to merge 6 commits into base: master

Conversation

KerfuffleV2
Collaborator

This is a demo of skipping (or potentially repeating/reordering) layer evaluation when running a model.

This might sound like a weird thing to want to do, but it's mainly to enable research. For example, k-quants try to devote more bits to some layers than to others, since quantizing some layers affects the accuracy of the output more than quantizing others.

Another potential use is self-speculative decoding (#3435) - basically speculative decoding where the helper model isn't a separate smaller model but the same one run with some layers skipped. But the first thing you need to figure out to be able to do that is which layers you can skip and still get results accurate enough to be used for speculation.

From the llama.cpp side, this is just a sketch of what the API could look like. It's also only implemented for LLaMA models right now. The list of layer indexes to run is supplied in the batch. If it's NULL, all layers run like normal. When it's set, there aren't really any restrictions on what you can put in there, so you could use it to run the first layer 10 times if you wanted, or whatever.
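
Conceptually (this is just a Python sketch of the idea, not the actual llama.cpp C API), evaluation driven by such a list looks something like this, with `layers` and `hidden` as hypothetical stand-ins for the per-layer forward functions and the hidden state:

```python
def eval_with_layer_list(layers, hidden, run_layers=None):
    """Run the model's layers according to an explicit index list."""
    # None mirrors the NULL case above: run every layer once, in normal order.
    if run_layers is None:
        run_layers = range(len(layers))
    for idx in run_layers:
        # The indices are free-form, so they can skip, repeat, or reorder layers.
        hidden = layers[idx](hidden)
    return hidden

# e.g. run the first layer 10 times: eval_with_layer_list(layers, h, [0] * 10)
```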

Also included is a hacked version of perplexity that just runs the first 10 chunks, then skips a layer. I.e. the first 10 chunks skip no layers, then it repeats with layer 0 skipped, then with layer 1 skipped, etc. Apps can't query how many layers exist in a model yet as far as I know, so this is hardcoded to 26 (the number of layers in the model I was testing with). If you want to try it with a different model, just set n_layers to the correct value.
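
The sweep itself is roughly the following (a minimal Python sketch; `eval_ppl` is a hypothetical stand-in for the real C++ perplexity loop, taking the layer list and the number of chunks):

```python
def layer_skip_sweep(eval_ppl, n_layers=26, n_chunks=10):
    """Baseline pass with all layers, then one pass per layer with it skipped."""
    results = {None: eval_ppl(None, n_chunks)}  # baseline: no layers skipped
    for skip in range(n_layers):
        print(f"SKIPPING: {skip}")
        run_layers = [i for i in range(n_layers) if i != skip]
        results[skip] = eval_ppl(run_layers, n_chunks)
    return results
```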

Example output with a 3B OpenOrca model:

perplexity: 0.46 seconds per pass - ETA 4.72 minutes
[1]5.8199,[2]9.7619,[3]8.8527,[4]10.3179,[5]10.3419,[6]11.4852,[7]12.3442,[8]12.9492,[9]13.5509,[10]14.0479,[11]14.2477,[12]14.3440,[13]14.6437,[14]15.7930,[15]14.9222,[16]14.3629,[17]14.1856,[18]13.2702,[19]13.2204,[20]12.9117,
SKIPPING: 0
[1]25.2163,[2]51.9355,[3]56.3117,[4]52.2362,[5]42.7974,[6]44.8002,[7]46.5999,[8]44.7453,[9]46.3201,[10]49.1452,[11]46.8407,[12]45.6636,[13]47.8135,[14]50.3035,[15]47.6079,[16]45.8225,[17]46.6676,[18]42.8317,[19]44.0725,[20]44.3559,
SKIPPING: 1
[1]7.0817,[2]11.2318,[3]10.6045,[4]12.3341,[5]12.0341,[6]13.2317,[7]14.3178,[8]14.7812,[9]15.4714,[10]16.2621,[11]16.3338,[12]16.5096,[13]16.8677,[14]18.1778,[15]17.0847,[16]16.3890,[17]16.3478,[18]15.2399,[19]15.1692,[20]14.7721,
SKIPPING: 2
[1]8.6231,[2]14.9235,[3]15.4307,[4]18.6430,[5]19.6522,[6]21.7925,[7]23.3335,[8]23.3390,[9]24.2896,[10]26.4260,[11]26.1471,[12]25.9729,[13]26.3351,[14]28.1945,[15]26.2176,[16]25.0092,[17]24.8536,[18]22.9061,[19]22.6791,[20]22.1606,
SKIPPING: 3
[1]6.3139,[2]10.2253,[3]9.5598,[4]10.8663,[5]10.8262,[6]11.9944,[7]13.0495,[8]13.4974,[9]14.0970,[10]14.6634,[11]14.7197,[12]14.8650,[13]15.2139,[14]16.3909,[15]15.4614,[16]14.8665,[17]14.8036,[18]13.8644,[19]13.8512,[20]13.5506,
SKIPPING: 4
[1]6.3025,[2]10.6130,[3]9.9635,[4]11.3564,[5]11.2259,[6]12.3626,[7]13.1142,[8]13.6573,[9]14.1933,[10]14.7734,[11]14.8495,[12]14.9516,[13]15.2244,[14]16.3395,[15]15.4234,[16]14.8690,[17]14.7760,[18]13.7837,[19]13.7449,[20]13.4942,
SKIPPING: 5
[1]6.2125,[2]10.0932,[3]9.3480,[4]10.7340,[5]10.6881,[6]11.8967,[7]12.8075,[8]13.2789,[9]13.9388,[10]14.5423,[11]14.7062,[12]14.8974,[13]15.2723,[14]16.4320,[15]15.5900,[16]15.0547,[17]14.9314,[18]13.9600,[19]13.9331,[20]13.6405,
SKIPPING: 6
[1]5.9851,[2]10.5281,[3]9.9157,[4]11.6017,[5]11.6682,[6]12.8976,[7]13.6495,[8]14.0625,[9]14.7934,[10]15.4462,[11]15.5003,[12]15.5742,[13]15.9931,[14]17.1831,[15]16.3347,[16]15.7159,[17]15.5338,[18]14.6000,[19]14.5295,[20]14.2197,
SKIPPING: 7
[1]6.1259,[2]9.9460,[3]9.2572,[4]10.8360,[5]11.0077,[6]12.0282,[7]12.9301,[8]13.4200,[9]14.2473,[10]14.7883,[11]14.9213,[12]15.0634,[13]15.4026,[14]16.5222,[15]15.6139,[16]15.0328,[17]14.9715,[18]14.0204,[19]14.0256,[20]13.6860,
SKIPPING: 8
[1]5.9687,[2]10.0450,[3]9.2705,[4]10.8358,[5]10.9145,[6]12.0899,[7]12.8708,[8]13.1910,[9]13.8345,[10]14.4637,[11]14.5892,[12]14.7250,[13]14.9323,[14]16.0804,[15]15.2191,[16]14.6529,[17]14.5418,[18]13.6767,[19]13.7031,[20]13.4121,
SKIPPING: 9
[1]6.9007,[2]10.6571,[3]9.9226,[4]11.9619,[5]12.0004,[6]12.9418,[7]13.7968,[8]14.1802,[9]15.0244,[10]15.5724,[11]15.7054,[12]15.7443,[13]16.2122,[14]17.4440,[15]16.5232,[16]15.9458,[17]15.8428,[18]14.9867,[19]14.9257,[20]14.6624,
SKIPPING: 10
[1]6.7959,[2]10.6265,[3]9.7182,[4]11.3150,[5]11.3993,[6]12.5715,[7]13.3607,[8]13.7556,[9]14.4518,[10]14.9865,[11]15.0569,[12]15.2553,[13]15.6216,[14]16.7313,[15]15.8374,[16]15.2221,[17]15.0922,[18]14.2173,[19]14.2001,[20]13.8791,
SKIPPING: 11
[1]6.1300,[2]9.9418,[3]9.3706,[4]10.9860,[5]10.9859,[6]12.1049,[7]13.0305,[8]13.5025,[9]14.2297,[10]14.7582,[11]14.9002,[12]15.0006,[13]15.2878,[14]16.4456,[15]15.6193,[16]14.9547,[17]14.8007,[18]13.8861,[19]13.8148,[20]13.4797,
SKIPPING: 12
[1]6.2256,[2]10.3569,[3]9.5679,[4]11.1311,[5]11.1166,[6]12.3105,[7]13.0932,[8]13.6650,[9]14.3377,[10]14.8129,[11]14.9030,[12]15.0153,[13]15.2636,[14]16.4425,[15]15.5658,[16]14.9760,[17]14.7997,[18]13.9013,[19]13.8328,[20]13.5045,
SKIPPING: 13
[1]7.0302,[2]11.2369,[3]10.1856,[4]12.0675,[5]12.0623,[6]13.4295,[7]14.3983,[8]14.9713,[9]15.6726,[10]16.2047,[11]16.2730,[12]16.4301,[13]16.7189,[14]17.9696,[15]17.0045,[16]16.2982,[17]16.0179,[18]14.9335,[19]14.8596,[20]14.4760,
SKIPPING: 14
[1]5.9154,[2]9.4330,[3]8.7158,[4]10.0449,[5]10.1686,[6]11.2391,[7]12.0962,[8]12.5760,[9]13.1994,[10]13.7971,[11]13.9700,[12]14.0707,[13]14.3820,[14]15.4446,[15]14.6040,[16]14.0533,[17]13.9758,[18]13.0747,[19]13.0260,[20]12.7179,
SKIPPING: 15
[1]6.0744,[2]9.9528,[3]9.1602,[4]10.9263,[5]10.9065,[6]11.9588,[7]12.7723,[8]13.3218,[9]13.9382,[10]14.5620,[11]14.7441,[12]14.8494,[13]15.1715,[14]16.3524,[15]15.4373,[16]14.8253,[17]14.7113,[18]13.7264,[19]13.7037,[20]13.4134,
SKIPPING: 16
[1]7.3893,[2]12.0937,[3]10.9386,[4]12.3974,[5]12.2207,[6]13.6213,[7]14.3570,[8]14.9508,[9]15.6288,[10]16.0710,[11]16.1968,[12]16.4038,[13]16.8972,[14]18.2231,[15]17.2456,[16]16.4961,[17]16.3655,[18]15.3156,[19]15.2063,[20]14.9209,
SKIPPING: 17
[1]6.0448,[2]10.6920,[3]9.9614,[4]11.8648,[5]11.7425,[6]12.9460,[7]13.8928,[8]14.4277,[9]15.1083,[10]15.6226,[11]15.8729,[12]15.9144,[13]16.2883,[14]17.5557,[15]16.5954,[16]15.9410,[17]15.7618,[18]14.6961,[19]14.6226,[20]14.2913,
SKIPPING: 18
[1]6.2872,[2]10.4239,[3]9.5514,[4]11.1965,[5]11.2634,[6]12.5786,[7]13.6169,[8]14.2559,[9]14.8788,[10]15.4762,[11]15.6122,[12]15.7212,[13]16.0137,[14]17.2914,[15]16.3324,[16]15.6442,[17]15.5229,[18]14.5262,[19]14.4437,[20]14.1009,
SKIPPING: 19
[1]6.8102,[2]10.9868,[3]10.0126,[4]11.3141,[5]11.2235,[6]12.1969,[7]13.0746,[8]13.7327,[9]14.2543,[10]14.7410,[11]14.8813,[12]15.0299,[13]15.4559,[14]16.7808,[15]15.9159,[16]15.2932,[17]15.1479,[18]14.2357,[19]14.2339,[20]13.9115,
SKIPPING: 20
[1]6.3342,[2]11.3291,[3]10.1610,[4]11.7932,[5]11.6527,[6]12.9673,[7]13.7810,[8]14.4919,[9]15.1551,[10]15.7327,[11]16.0265,[12]16.1517,[13]16.4890,[14]17.7187,[15]16.7280,[16]16.1573,[17]16.0293,[18]15.0224,[19]14.9706,[20]14.7113,
SKIPPING: 21
[1]6.3091,[2]10.8477,[3]10.0105,[4]11.5558,[5]11.4888,[6]12.7616,[7]13.7980,[8]14.2681,[9]14.8824,[10]15.4924,[11]15.6655,[12]15.7857,[13]16.2194,[14]17.4920,[15]16.5037,[16]15.8636,[17]15.7023,[18]14.6775,[19]14.5632,[20]14.1849,
SKIPPING: 22
[1]6.3004,[2]10.6902,[3]9.8396,[4]11.2673,[5]11.2273,[6]12.5295,[7]13.4752,[8]14.1604,[9]14.6935,[10]15.1948,[11]15.4120,[12]15.4805,[13]15.7743,[14]16.9816,[15]16.0597,[16]15.4504,[17]15.2674,[18]14.3224,[19]14.2894,[20]13.9125,
SKIPPING: 23
[1]6.3650,[2]10.2259,[3]9.9501,[4]11.2861,[5]11.2926,[6]12.2607,[7]13.0683,[8]13.6316,[9]14.1852,[10]14.7976,[11]14.9753,[12]15.0440,[13]15.3573,[14]16.5402,[15]15.6355,[16]15.0586,[17]14.9775,[18]14.0236,[19]13.9416,[20]13.6134,
SKIPPING: 24
[1]6.3187,[2]10.4594,[3]9.6496,[4]11.2578,[5]11.3070,[6]12.5010,[7]13.4481,[8]14.1364,[9]14.7356,[10]15.3568,[11]15.5718,[12]15.7672,[13]16.2686,[14]17.6082,[15]16.6596,[16]16.0976,[17]15.9703,[18]14.9279,[19]14.8169,[20]14.4634,
SKIPPING: 25
[1]10.1270,[2]18.9880,[3]16.8708,[4]19.6677,[5]19.2827,[6]21.7927,[7]23.4253,[8]24.9245,[9]25.8550,[10]26.8885,[11]27.4006,[12]27.1863,[13]28.2399,[14]30.2889,[15]28.3443,[16]26.9676,[17]26.7916,[18]24.7348,[19]24.7016,[20]24.2101,

KerfuffleV2 added the research 🔬 and demo (Demonstrate some concept or idea, not intended to be merged) labels on Oct 10, 2023
@KerfuffleV2
Collaborator Author

Here's a bit more information about the results (the 12.9117 below is the chunk-20 perplexity of the baseline run with no layers skipped):

>>> x = ((0,44.3559),(1,14.7721),(2,22.1606),(3,13.5506),(4,13.4942),(5,13.6405),(6,14.2197),(7,13.6860),(8,13.4121),(9,14.6624),(10,13.8791),(11,13.4797),(12,13.5045),(13,14.4760),(14,12.7179),(15,13.4134),(16,14.9209),(17,14.2913),(18,14.1009),(19,13.9115),(20,14.7113),(21,14.1849),(22,13.9125),(23,13.6134),(24,14.4634),(25,24.2101))
>>> tuple(z[0] for z in sorted(x, key = lambda z: z[1]))
(14, 8, 15, 11, 4, 12, 3, 23, 5, 7, 10, 19, 22, 18, 21, 6, 17, 24, 13, 9, 20, 1, 16, 2, 25, 0)
>>> tuple('{0:.3}'.format(z[1] - 12.9117) for z in sorted(x, key = lambda z: z[1]))
('-0.194', '0.5', '0.502', '0.568', '0.582', '0.593', '0.639', '0.702', '0.729', '0.774', '0.967', '1.0', '1.0', '1.19', '1.27', '1.31', '1.38', '1.55', '1.56', '1.75', '1.8', '1.86', '2.01', '9.25', '11.3', '31.4')
>>> print('\n'.join(tuple('{0:2}|{1:.3}'.format(z[0], z[1] - 12.9117) for z in sorted(x, key = lambda z: z[1]))))

In friendlier table format:

Skipped layer  Ppl change
14             -0.194
8               0.5
15              0.502
11              0.568
4               0.582
12              0.593
3               0.639
23              0.702
5               0.729
7               0.774
10              0.967
19              1.0
22              1.0
18              1.19
21              1.27
6               1.31
17              1.38
24              1.55
13              1.56
9               1.75
20              1.8
1               1.86
16              2.01
2               9.25
25             11.3
0              31.4

If we were planning on skipping half the layers, skipping the ones in the list up to and including 22 looks promising. I haven't done any experiments with skipping multiple layers yet. Also, this is just a sketch of the process, not necessarily usable results, since it only runs 20 chunks of perplexity to come up with the numbers, which isn't necessarily representative.
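
For concreteness, that half-layer candidate list falls straight out of the ranking (reusing the (layer, ppl-when-skipped) pairs from the snippet above):

```python
x = ((0, 44.3559), (1, 14.7721), (2, 22.1606), (3, 13.5506), (4, 13.4942),
     (5, 13.6405), (6, 14.2197), (7, 13.6860), (8, 13.4121), (9, 14.6624),
     (10, 13.8791), (11, 13.4797), (12, 13.5045), (13, 14.4760), (14, 12.7179),
     (15, 13.4134), (16, 14.9209), (17, 14.2913), (18, 14.1009), (19, 13.9115),
     (20, 14.7113), (21, 14.1849), (22, 13.9125), (23, 13.6134), (24, 14.4634),
     (25, 24.2101))

# Rank layers by how little skipping them hurts, then take the better half.
ranked = [layer for layer, _ in sorted(x, key=lambda z: z[1])]
half_skip = ranked[:len(x) // 2]
# -> [14, 8, 15, 11, 4, 12, 3, 23, 5, 7, 10, 19, 22]
#    i.e. everything in the table up to and including layer 22.
```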

@ggerganov
Owner

I am very interested in generating the same table for LLaMA v1 7B and LLaMA v2 7B.
Want to see if this analysis will detect layer 1 as having the biggest impact for LLaMA v2 7B (#2421)

@KerfuffleV2
Collaborator Author

KerfuffleV2 commented Oct 10, 2023

I asked the authors of the self-speculation paper which layers they found to be best for skipping (that's with a 13B LLaMA2 model with 80 layers). This is what they said:

"[...] You can use attention layers [3, 5, 6, 8, 10, 11, 14, 15, 18, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 33, 34, 35, 36, 37] and MLP layers [6, 9, 10, 11, 15, 24, 25, 27, 28, 35] as a starting point [...]"

Talking about skipping attention vs MLP layers separately is probably a good indication that my current approach isn't fine-grained enough. I don't really know how to skip those parts separately rather than just skipping the entire layer. It doesn't seem like the skip-everything approach can get anything close to reasonable results when skipping more than 10-20% of the layers.
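
To make the distinction concrete, here's a toy Python sketch (not llama.cpp code, and the pre-attention/pre-FFN normalization is omitted) of what per-sublayer skipping would mean compared to dropping a whole layer:

```python
def decoder_layer(hidden, attn, mlp, skip_attn=False, skip_mlp=False):
    """Toy transformer layer: attention and MLP, each in its own residual branch."""
    # attn/mlp are stand-in callables for the sublayer computations.
    if not skip_attn:
        hidden = hidden + attn(hidden)  # residual around attention
    if not skip_mlp:
        hidden = hidden + mlp(hidden)   # residual around the MLP/FFN
    return hidden

# Skipping a whole layer is skip_attn=True and skip_mlp=True at once;
# the paper's scheme toggles the two independently per layer.
```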

I am very interested in generating the same table for LLaMA v1 7B and LLaMA v2 7B.

This isn't exactly what you asked for, but I ran some tests with a 7B Mistral OpenOrca model (32 layers).

Skip#  Last skip  Ppl diff
0      -          +0.0000 (base ppl ~9.1)
1      9          +0.2705
2      14         +0.8020
3      13         +1.3737
4      25         +2.3158
5      22         +3.6185
6      10         +5.1147
7      27         +7.1322
8      24         +9.1707
9      23         +12.1527
10     8          +15.9593
11     20         +22.4803
12     15         +33.8737
13     12         +51.6857

The first column is the number of skipped layers, the second is the last skipped layer added, and the last column is the ppl difference compared to not skipping any layers. This is also only running 15 chunks. Just for example, at Skip# 3 that means layers 9, 14, 13 were skipped.

I made some changes to the perplexity example to iterate over the layers and find the one which results in the lowest ppl, then add that as a permanent skip and then repeat the process. The code is horrendous, but I'll push it in case anyone wants to try it out.
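
Roughly, the search looks like this (a minimal Python sketch; `eval_ppl(skips)` is an assumed stand-in for running the 15-chunk perplexity pass with those layers skipped, not code from this PR):

```python
def greedy_skip_search(eval_ppl, n_layers, max_skips):
    skips = []    # layers permanently skipped so far
    history = []  # (last skip added, ppl diff vs. no skips), like the table above
    baseline = eval_ppl([])
    for _ in range(max_skips):
        # Try each remaining layer on top of the current skip set...
        scores = {l: eval_ppl(skips + [l])
                  for l in range(n_layers) if l not in skips}
        # ...and permanently keep the one that hurts perplexity the least.
        best = min(scores, key=scores.get)
        skips.append(best)
        history.append((best, scores[best] - baseline))
    return skips, history
```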


edit:

Want to see if this analysis will detect layer 1 as having the biggest impact

If you mean the first layer (my stuff prints out the layer index, so that would be layer 0 here) then yes, definitely. Skipping it increases perplexity by like 3000. The layers at indexes 0 and 1 have a massive impact (especially 0). The last layer doesn't seem very skippable either; it'll basically double perplexity.

@KerfuffleV2
Collaborator Author

Here's the full output for the first pass through the Mistral 7B model:

[1]4.5560,[2]5.4661,[3]6.2772,[4]7.5924,[5]7.6146,[6]7.6872,[7]7.9912,[8]8.1924,[9]8.4290,[10]8.8039,[11]9.1807,[12]9.1607,[13]9.2799,[14]9.2800,[15]9.1242,
SKIP   0 + [] - len:   1, best:( -1: 0.000)
[1]1571.1876,[2]3386.7620,[3]2810.8186,[4]3871.2885,[5]3236.7270,[6]2859.0309,[7]2576.3551,[8]2687.2615,[9]2762.2076,[10]2967.1324,[11]3098.3210,[12]2941.1502,[13]3072.2781,[14]3272.7208,[15]3226.6815,
SKIP   1 + [] - len:   1, best:(  0: 3217.557)
[1]98.1369,[2]148.4032,[3]153.1096,[4]184.7184,[5]185.0628,[6]186.4683,[7]184.9066,[8]187.2498,[9]206.0430,[10]228.1347,[11]240.4059,[12]235.5293,[13]240.6097,[14]240.1005,[15]247.8182,
SKIP   2 + [] - len:   1, best:(  1: 238.694)
[1]5.2162,[2]6.2640,[3]7.4478,[4]9.0383,[5]8.7124,[6]8.7353,[7]9.1002,[8]9.2899,[9]9.6851,[10]10.2463,[11]10.6458,[12]10.5753,[13]10.7060,[14]10.6431,[15]10.5116,
SKIP   3 + [] - len:   1, best:(  2: 1.387)
[1]4.7446,[2]5.8788,[3]6.6885,[4]7.9392,[5]7.9364,[6]7.9339,[7]8.2131,[8]8.4421,[9]8.7396,[10]9.1717,[11]9.5760,[12]9.5431,[13]9.6631,[14]9.7524,[15]9.6272,
SKIP   4 + [] - len:   1, best:(  3: 0.503)
[1]4.8223,[2]5.8301,[3]6.7629,[4]8.0221,[5]8.1903,[6]8.2384,[7]8.4826,[8]8.7170,[9]8.9832,[10]9.4339,[11]9.8143,[12]9.7113,[13]9.9347,[14]9.9365,[15]9.8375,
SKIP   5 + [] - len:   1, best:(  3: 0.503)
[1]4.7560,[2]5.6872,[3]6.5117,[4]7.9505,[5]8.0796,[6]8.0997,[7]8.4300,[8]8.6248,[9]8.9526,[10]9.4171,[11]9.8881,[12]9.8217,[13]9.9925,[14]10.0375,[15]9.9104,
SKIP   6 + [] - len:   1, best:(  3: 0.503)
[1]4.7269,[2]5.7618,[3]6.6713,[4]8.1161,[5]8.1800,[6]8.2209,[7]8.6348,[8]8.8874,[9]9.1808,[10]9.5889,[11]9.9829,[12]9.9702,[13]10.0973,[14]10.1133,[15]9.9967,
SKIP   7 + [] - len:   1, best:(  3: 0.503)
[1]4.7369,[2]5.6750,[3]6.5589,[4]7.8237,[5]7.8841,[6]8.0079,[7]8.4151,[8]8.5944,[9]8.9217,[10]9.2969,[11]9.6531,[12]9.6263,[13]9.7461,[14]9.7301,[15]9.6351,
SKIP   8 + [] - len:   1, best:(  3: 0.503)
[1]4.9085,[2]5.8590,[3]6.7498,[4]8.1333,[5]8.0202,[6]8.0731,[7]8.3852,[8]8.5672,[9]8.8911,[10]9.3058,[11]9.6699,[12]9.6395,[13]9.7315,[14]9.6822,[15]9.5403,
SKIP   9 + [] - len:   1, best:(  8: 0.416)
[1]4.8458,[2]5.8132,[3]6.6628,[4]7.9854,[5]7.9229,[6]7.9393,[7]8.2711,[8]8.4423,[9]8.6845,[10]9.0942,[11]9.4458,[12]9.3938,[13]9.5357,[14]9.5474,[15]9.3947,
SKIP  10 + [] - len:   1, best:(  9: 0.271)
[1]4.6858,[2]5.7956,[3]6.6840,[4]8.0777,[5]8.0772,[6]8.0979,[7]8.4181,[8]8.6640,[9]8.9657,[10]9.3621,[11]9.7824,[12]9.7380,[13]9.8820,[14]9.8684,[15]9.7765,
SKIP  11 + [] - len:   1, best:(  9: 0.271)
[1]4.8449,[2]5.8188,[3]6.8288,[4]8.2155,[5]8.1955,[6]8.2520,[7]8.6248,[8]8.8352,[9]9.1510,[10]9.5748,[11]9.9520,[12]9.9381,[13]10.0380,[14]10.0218,[15]9.8927,
SKIP  12 + [] - len:   1, best:(  9: 0.271)
[1]4.9824,[2]6.0310,[3]6.9717,[4]8.2145,[5]8.2158,[6]8.2464,[7]8.5373,[8]8.7336,[9]9.0237,[10]9.4073,[11]9.8479,[12]9.7953,[13]9.9127,[14]9.8922,[15]9.8004,
SKIP  13 + [] - len:   1, best:(  9: 0.271)
[1]4.5633,[2]5.6209,[3]6.5623,[4]7.8673,[5]7.8152,[6]7.8238,[7]8.1821,[8]8.3592,[9]8.6304,[10]9.0495,[11]9.4398,[12]9.4489,[13]9.5557,[14]9.5669,[15]9.4107,
SKIP  14 + [] - len:   1, best:(  9: 0.271)
[1]4.6766,[2]5.6530,[3]6.5334,[4]7.7677,[5]7.6948,[6]7.8004,[7]8.1327,[8]8.3340,[9]8.6436,[10]9.0597,[11]9.4348,[12]9.4552,[13]9.6001,[14]9.5957,[15]9.4549,
SKIP  15 + [] - len:   1, best:(  9: 0.271)
[1]4.5461,[2]5.7137,[3]6.5667,[4]7.9436,[5]7.9152,[6]7.9573,[7]8.2673,[8]8.4875,[9]8.7391,[10]9.1194,[11]9.5132,[12]9.5219,[13]9.6589,[14]9.6758,[15]9.5186,
SKIP  16 + [] - len:   1, best:(  9: 0.271)
[1]4.7976,[2]5.7183,[3]6.5924,[4]8.0302,[5]8.0381,[6]8.0229,[7]8.4062,[8]8.5144,[9]8.7318,[10]9.1274,[11]9.4955,[12]9.5074,[13]9.5908,[14]9.5718,[15]9.4805,
SKIP  17 + [] - len:   1, best:(  9: 0.271)
[1]5.4546,[2]6.7222,[3]7.4250,[4]8.8026,[5]8.6021,[6]8.5869,[7]8.8872,[8]9.0804,[9]9.2383,[10]9.6423,[11]10.0127,[12]9.9790,[13]10.1158,[14]10.1016,[15]9.9581,
SKIP  18 + [] - len:   1, best:(  9: 0.271)
[1]5.5202,[2]6.9656,[3]7.6466,[4]8.9527,[5]8.8399,[6]8.8517,[7]9.0270,[8]9.2116,[9]9.4440,[10]9.8859,[11]10.2626,[12]10.1573,[13]10.3209,[14]10.3210,[15]10.2789,
SKIP  19 + [] - len:   1, best:(  9: 0.271)
[1]5.8760,[2]7.2104,[3]7.8481,[4]9.3019,[5]9.2630,[6]9.3456,[7]9.6077,[8]9.7740,[9]9.9441,[10]10.2476,[11]10.6340,[12]10.5956,[13]10.8722,[14]10.9034,[15]10.8772,
SKIP  20 + [] - len:   1, best:(  9: 0.271)
[1]5.3984,[2]6.6595,[3]7.3168,[4]8.6544,[5]8.4835,[6]8.4971,[7]8.8408,[8]9.0054,[9]9.2302,[10]9.6328,[11]10.0304,[12]9.9973,[13]10.2040,[14]10.2183,[15]10.2049,
SKIP  21 + [] - len:   1, best:(  9: 0.271)
[1]5.6234,[2]6.5923,[3]7.1903,[4]8.5960,[5]8.4949,[6]8.4669,[7]8.7559,[8]8.9582,[9]9.1227,[10]9.5142,[11]9.8822,[12]9.8122,[13]10.0267,[14]10.0104,[15]10.0064,
SKIP  22 + [] - len:   1, best:(  9: 0.271)
[1]5.0010,[2]6.2361,[3]7.0282,[4]8.5469,[5]8.5398,[6]8.5806,[7]8.9169,[8]9.1300,[9]9.3297,[10]9.6817,[11]10.0695,[12]10.0689,[13]10.3327,[14]10.3087,[15]10.1664,
SKIP  23 + [] - len:   1, best:(  9: 0.271)
[1]4.9697,[2]6.0309,[3]6.9005,[4]8.2507,[5]8.1943,[6]8.3459,[7]8.6792,[8]8.8401,[9]9.1174,[10]9.5209,[11]9.9100,[12]9.9139,[13]10.1099,[14]10.0882,[15]9.9698,
SKIP  24 + [] - len:   1, best:(  9: 0.271)
[1]5.1171,[2]6.3094,[3]7.0250,[4]8.2905,[5]8.3084,[6]8.3706,[7]8.7057,[8]8.9241,[9]9.1891,[10]9.5757,[11]9.9878,[12]9.9786,[13]10.1781,[14]10.1395,[15]10.0020,
SKIP  25 + [] - len:   1, best:(  9: 0.271)
[1]5.1529,[2]5.9587,[3]6.7920,[4]8.2383,[5]8.2406,[6]8.2317,[7]8.5468,[8]8.7335,[9]8.9775,[10]9.3395,[11]9.7088,[12]9.7238,[13]9.8541,[14]9.8407,[15]9.7596,
SKIP  26 + [] - len:   1, best:(  9: 0.271)
[1]5.1326,[2]6.5176,[3]7.2211,[4]8.7373,[5]8.7891,[6]8.8381,[7]9.1640,[8]9.3175,[9]9.5454,[10]9.9396,[11]10.3506,[12]10.3644,[13]10.5291,[14]10.4967,[15]10.4026,
SKIP  27 + [] - len:   1, best:(  9: 0.271)
[1]5.3876,[2]6.4461,[3]7.2376,[4]8.5590,[5]8.4854,[6]8.5356,[7]8.7921,[8]8.9672,[9]9.1980,[10]9.5681,[11]9.9096,[12]9.9131,[13]10.0768,[14]10.1304,[15]10.0756,
SKIP  28 + [] - len:   1, best:(  9: 0.271)
[1]5.6165,[2]6.8424,[3]7.6686,[4]8.9055,[5]8.9364,[6]9.0072,[7]9.3671,[8]9.5748,[9]9.7960,[10]10.2010,[11]10.5971,[12]10.5876,[13]10.7243,[14]10.7202,[15]10.6022,
SKIP  29 + [] - len:   1, best:(  9: 0.271)
[1]4.8609,[2]6.1496,[3]6.9781,[4]8.4739,[5]8.6657,[6]8.8468,[7]9.2004,[8]9.4971,[9]9.6824,[10]10.1350,[11]10.6054,[12]10.6026,[13]10.7781,[14]10.7432,[15]10.5089,
SKIP  30 + [] - len:   1, best:(  9: 0.271)
[1]6.3768,[2]7.3526,[3]8.1487,[4]9.5199,[5]9.3846,[6]9.4655,[7]9.7911,[8]10.0463,[9]10.1724,[10]10.6269,[11]11.0575,[12]11.0653,[13]11.3298,[14]11.3196,[15]11.2386,
SKIP  31 + [] - len:   1, best:(  9: 0.271)
[1]6.8947,[2]8.1065,[3]9.0903,[4]10.8750,[5]10.6093,[6]10.4532,[7]10.6539,[8]10.8622,[9]11.1284,[10]11.5710,[11]11.9825,[12]12.0459,[13]12.4071,[14]12.4002,[15]12.3001,

Interestingly, the first pass ranks the layers 9, 13, 14, 16, 15, 8, 3, 7, 25, 10, 12, 4, 11, 5, 17, 23, 6, 24, 21, 27, 22, 20, 18, 26, 29, 2, 28, 19, 30, 31, 1, 0 from least impactful to most. However, incrementally skipping the least impactful layer results in the order 9, 14, 13, 25, 22, 10, 27, 24, 23, 8, 20, 15, 12. So the layers you've already skipped can affect how much impact the remaining layers have.

@Galunid
Collaborator

Galunid commented Oct 10, 2023

Talking about skipping attention vs MLP layers separately is probably a good indication that my current approach isn't fine-grained enough. I really don't know how to skip those parts separately, rather than just passing the entire layer. It doesn't seem like the skip everything approach can get anything close to reasonable results when skipping more than 10-20% of layers.

It seems pretty interesting. According to the Locating and Editing Factual Associations in GPT paper, MLPs seem to store factual information and act as a sort of key-value store, so it would make sense for skipping them to cause a perplexity hit. If we look at attention as "recall" of the data stored in MLPs, then it seems reasonable that losing just attention wouldn't matter as much. The knowledge would still be there; it would just contain more noise that could be "filtered" by later layers, as opposed to what happens when we skip whole layers.

@BarfingLemurs
Contributor

Any guidance on running this? I only see ETA message output when running ./perplexity -m llama-2-Q4_K_M.gguf -f wiki.test.raw.

For 70B Q2_K, is n_layers 80? Maybe this model can fit inside my 24GB GPU when some layers are skipped.

@KerfuffleV2
Collaborator Author

Any guidance on running this?

Sure, I can try to help. Just keep in mind that right now I'm a lot less confident that the approach of just skipping whole layers is going to work. You probably can skip a few (also, to actually use this for inference or whatever, you'd have to hack that in yourself). Basically, this isn't really useful for anything except research right now.

80 should be the right number of layers for LLaMA2 70B. If you scroll down a bit past where you changed n_layers, you'll see a chunk of code commented out, around lines 456-464. You can re-enable that if you want to see more progress information.

I only see ETA message output

This is probably because running perplexity on a 70B model is going to be quite slow, and right now the progress output is disabled. So you won't see anything until it's completed 15 chunks (you can also try adjusting that if you want; it's called test_count). Odds are running a chunk is going to take 10-15 sec, so you wouldn't see anything for several minutes.

Q2_K already impacts the model quality a decent amount, so it's possible even if you can skip some layers you might just be better off going down to a 30B. You're certainly welcome to try (and I'd be interested in the output).

Hmm, also, this actually won't immediately help with memory issues because all it does is skip evaluating those layers; they still get loaded. They also still get used for some stuff, like if a RoPE shift occurs.

@KerfuffleV2
Collaborator Author

If we look at attention as "recall" of data stored in MLPs, then it seems reasonable why losing just attention wouldn't matter as much.

Makes sense, not that I'm really qualified to have an opinion. I also found this: https://www.lesswrong.com/posts/j84JhErNezMxyK4dH/llm-modularity-the-separability-of-capabilities-in-large

@BarfingLemurs
Contributor

BarfingLemurs commented Oct 11, 2023

I'm currently running the 70B test on the GPU (no layers offloaded); the numbers are a bit different from the CPU. But I ran the early stage twice and the numbers are deterministic on my system.

GPU
perplexity: tokenizing the input ..
perplexity: tokenization took 711.232 ms
perplexity: calculating perplexity over 655 chunks, batch_size=512
perplexity: 8.47 seconds per pass - ETA 1 hours 32.50 minutes
[1]3.3389,[2]3.5790,[3]4.1117,[4]4.0914,[5]3.9023,[6]3.7787,[7]3.8544,[8]3.8760,[9]3.9032,[10]3.9697,[11]3.9299,[12]4.0449,[13]4.1232,[14]4.2691,[15]4.4698,
SKIP   0 + [] - len:   1, best:( -1: 0.000)
[1]1384.3191,[2]2945.6732,[3]535.7742,[4]723.2406,[5]739.3806,[6]455.2552,[7]334.7692,[8]371.5663,[9]419.9812,[10]323.8006,[11]339.5898,
 
 
CPU
perplexity: tokenizing the input ..
perplexity: tokenization took 626.615 ms
perplexity: calculating perplexity over 655 chunks, batch_size=512
perplexity: 79.66 seconds per pass - ETA 14 hours 29.65 minutes
[1]3.3295,[2]3.5790,[3]4.1072,

@KerfuffleV2
Collaborator Author

KerfuffleV2 commented Oct 11, 2023

Even on GPU it's going to take a loooooong time. 8.47sec*15*80 is about 3 hours per pass through the layers. It'll get a little better each time since a layer is permanently removed but I'd guess it's going to take over a day. Since I'm going to be changing stuff soon (hopefully, assuming I can actually figure it out) you probably don't need to stress about generating that data if it's inconvenient, and it's not 100% clear how it could be used. Might be interesting to just let it run one pass through the layers though and see how the results relate to the smaller models.

edit: By the way, you might be able to speed it up some by offloading layers. Even though it's purely prompt processing, offloading layers still seems to help. Also if you have most of the model offloaded, you'll usually want to set the number of threads fairly low.

@FNsi
Contributor

FNsi commented Oct 11, 2023

I just had an idea: if you offload specific layers to the GPU, say the attention layers only, would that be quicker than just loading them in sequence?

@KerfuffleV2
Collaborator Author

say the attention layers only, would that be quicker than just loading them in sequence?

I'm not completely sure what you mean. Generally speaking, the more stuff you can put on the GPU the better performance will be. Also, the result from running attention is needed for the MLP/FFN layer so if you run only one of MLP or attention on the GPU you'll have to arrange to copy the data back and forth.

So I would guess you probably don't want to be in a situation where you only put the attention layers on GPU but not MLP or vice versa.

@BarfingLemurs
Contributor

@KerfuffleV2 oops. I think I botched recording the full output. It's still running though, btw. The output has exceeded the terminal window, and I'm not sure I added > output.txt to the last command.

But here is my recent output with https://huggingface.co/firelzrd/Xwin-LM-70B-V0.1-GGUF/blob/main/Xwin-LM-70B-V0.1.Q2_K.gguf : https://pastebin.com/zYcGydkP

@KerfuffleV2
Collaborator Author

I think I botched recording the full output. It's still running though, btw.

Thanks! That includes at least the first few passes through the layers; I'd say the first 1-2 are probably what's most interesting here. You can probably just stop it now. :)

I'm looking into how to skip attention and MLP separately, but I got kind of distracted trying to refactor the graph building to be more understandable. I might just fall into the rabbit hole of trying to close #3382.

@FNsi
Contributor

FNsi commented Oct 12, 2023

say the attention layers only, would that be quicker than just loading them in sequence?

So I would guess you probably don't want to be in a situation where you only put the attention layers on GPU but not MLP or vice versa.

Sorry, I have to confess that's exactly my naive thinking 😂. Changing the loading sequence with limited VRAM is what I mainly had in mind (like starting at layer 2 instead of the beginning).

@KerfuffleV2
Collaborator Author

Changing the loading sequence with limited VRAM is what I mainly had in mind (like starting at layer 2 instead of the beginning)

That's pretty much how it works, except the attention/FFN layers are handled together (rather than being able to skip one of those parts). When you set -ngl, it runs layers on the CPU first and the remaining layers (according to what -ngl was set to) get run on the GPU. That's so the data required for the final calculations is conveniently already on the GPU. Doing it the other way around would require copying data to the GPU and then back out again.
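
As a tiny illustration of that split (assumed example numbers, not actual llama.cpp code):

```python
def split_layers(n_layers, ngl):
    """With -ngl, the last `ngl` repeating layers go to the GPU, the rest stay on the CPU."""
    cpu_layers = list(range(n_layers - ngl))
    gpu_layers = list(range(n_layers - ngl, n_layers))
    return cpu_layers, gpu_layers

# e.g. split_layers(80, 20) -> CPU runs layers 0-59, GPU runs layers 60-79,
# so the output computed at the end is already on the GPU.
```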

@BarfingLemurs
Contributor

BarfingLemurs commented Oct 12, 2023

looking into how to skip attention and MLP separately

Memory question: if you do replicate the draft scheme they showed, are you left with a model only 57.5% the size of the original f16 (34 of 80 skipped for the 13B)?
In the paper, they also said the 70B has more redundant layers, so their optimizer probably skipped more.

I reran the test a bit with https://huggingface.co/TheBloke/Llama-2-70B-GGUF/blob/main/llama-2-70b.Q2_K.gguf so we can see the numbers with the base model:

[1]3.1639,[2]3.4754,[3]4.0250,[4]3.8090,[5]3.6399,[6]3.5090,[7]3.5630,[8]3.5419,[9]3.5190,[10]3.4937,[11]3.4496,[12]3.5332,[13]3.6063,[14]3.7551,[15]3.9163,
SKIP   0 + [] - len:   1, best:( -1: 0.000)
[1]1462.3870,[2]3280.1250,[3]732.5060,[4]914.2485,[5]878.1707,[6]550.5528,[7]409.0272,[8]415.6872,[9]493.2747,[10]382.0834,[11]391.0192,[12]464.9062,[13]501.1722,[14]448.7793,[15]488.9007,
SKIP   1 + [] - len:   1, best:(  0: 484.984)
[1]3.9036,[2]4.3790,[3]5.0800,[4]5.0249,[5]4.7755,[6]4.6037,[7]4.6228,[8]4.6937,[9]4.7231,[10]4.8783,[11]4.9603,[12]5.0304,[13]5.1197,[14]5.3222,[15]5.5095,
SKIP   2 + [] - len:   1, best:(  1: 1.593)
[1]839.8436,[2]1024.5593,[3]1004.6455,[4]1010.8268,[5]901.3454,[6]884.6457,[7]885.0874,[8]898.0362,[9]915.6009,[10]939.3151,[11]944.1636,[12]958.8118,[13]973.8382,[14]1002.5948,[15]1039.0967,
SKIP   3 + [] - len:   1, best:(  1: 1.593)
[1]3.2495,[2]3.6108,[3]4.1711,[4]4.0842,[5]3.9257,[6]3.7492,[7]3.8162,[8]3.8320,[9]3.8736,[10]3.8742,[11]3.8487,[12]3.9316,[13]4.0345,[14]4.1757,[15]4.3473,
SKIP   4 + [] - len:   1, best:(  3: 0.431)
[1]3.1809,[2]3.5237,[3]4.0688,[4]3.8928,[5]3.7546,[6]3.6219,[7]3.6843,[8]3.6939,[9]3.7161,[10]3.7086,[11]3.6853,[12]3.7707,[13]3.8532,[14]3.9973,[15]4.1590,
SKIP   5 + [] - len:   1, best:(  4: 0.243)
[1]3.1864,[2]3.5631,[3]4.0909,[4]3.8841,[5]3.7402,[6]3.6336,[7]3.6823,[8]3.6752,[9]3.6811,[10]3.6861,[11]3.6652,[12]3.7468,[13]3.8158,[14]3.9548,[15]4.1239,
SKIP   6 + [] - len:   1, best:(  5: 0.208)
[1]3.2621,[2]3.5673,[3]4.1199,[4]3.9661,[5]3.8007,[6]3.6741,[7]3.7129,[8]3.6867,[9]3.6816,[10]3.6780,[11]3.6427,[12]3.7266,[13]3.8006,[14]3.9432,[15]4.1191,
SKIP   7 + [] - len:   1, best:(  6: 0.203)
[1]3.2388,[2]3.5318,[3]4.0826,[4]3.9570,[5]3.7926,[6]3.6846,[7]3.7364,[8]3.6957,[9]3.7102,[10]3.6845,[11]3.6542,[12]3.7334,[13]3.8141,[14]3.9553,[15]4.1308,
SKIP   8 + [] - len:   1, best:(  6: 0.203)
[1]8.6022,[2]10.5272,[3]11.0301,[4]14.5153,[5]16.8427,[6]17.0697,[7]17.8719,[8]17.0929,[9]18.0672,[10]18.9638,[11]20.9256,[12]21.2242,[13]21.9703,[14]22.5669,[15]24.0980,
SKIP   9 + [] - len:   1, best:(  6: 0.203)
[1]3.1473,[2]3.4990,[3]4.0815,[4]3.9225,[5]3.7582,[6]3.6560,[7]3.7186,[8]3.7039,[9]3.7302,[10]3.7294,[11]3.7041,[12]3.7926,[13]3.8632,[14]3.9986,[15]4.1621,
SKIP  10 + [] - len:   1, best:(  6: 0.203)
[1]3.1805,[2]3.4881,[3]4.0631,[4]3.8501,[5]3.7122,[6]3.5982,[7]3.6566,[8]3.6402,[9]3.6477,[10]3.6425,[11]3.6261,[12]3.7018,[13]3.7720,[14]3.9244,[15]4.0893,
SKIP  11 + [] - len:   1, best:( 10: 0.173)
[1]3.1362,[2]3.4702,[3]4.0423,[4]3.8388,[5]3.6808,[6]3.5750,[7]3.6429,[8]3.6334,[9]3.6180,[10]3.6093,[11]3.5993,[12]3.6648,[13]3.7320,[14]3.8807,[15]4.0474,
SKIP  12 + [] - len:   1, best:( 11: 0.131)
[1]3.3096,[2]3.5603,[3]4.1299,[4]3.9648,[5]3.8146,[6]3.6639,[7]3.6914,[8]3.6930,[9]3.6971,[10]3.7014,[11]3.6834,[12]3.7534,[13]3.8300,[14]3.9666,[15]4.1374,
SKIP  13 + [] - len:   1, best:( 11: 0.131)
[1]3.1917,[2]3.4938,[3]4.0609,[4]3.9088,[5]3.7594,[6]3.6372,[7]3.6939,[8]3.6720,[9]3.6644,[10]3.6318,[11]3.5936,[12]3.6935,[13]3.7713,[14]3.9222,[15]4.0915,
SKIP  14 + [] - len:   1, best:( 11: 0.131)
[1]3.1966,[2]3.4982,[3]4.0898,[4]3.8736,[5]3.7332,[6]3.6061,[7]3.6632,[8]3.6420,[9]3.6305,[10]3.6287,[11]3.5862,[12]3.6722,[13]3.7371,[14]3.8853,[15]4.0553,
SKIP  15 + [] - len:   1, best:( 11: 0.131)
[1]3.2079,[2]3.5124,[3]4.0691,[4]3.8761,[5]3.7128,[6]3.5991,[7]3.6516,[8]3.6454,[9]3.6538,[10]3.6340,[11]3.5978,[12]3.6846,[13]3.7538,[14]3.8996,[15]4.0670,
SKIP  16 + [] - len:   1, best:( 11: 0.131)
[1]3.1996,[2]3.4942,[3]4.0542,[4]3.8908,[5]3.7494,[6]3.6420,[7]3.6914,[8]3.6795,[9]3.6658,[10]3.6479,[11]3.6316,[12]3.7103,[13]3.7799,[14]3.9293,[15]4.0936,
SKIP  17 + [] - len:   1, best:( 11: 0.131)
[1]3.1940,[2]3.4999,[3]4.0522,[4]3.8711,[5]3.7608,[6]3.6434,[7]3.7254,[8]3.7229,[9]3.7156,[10]3.7097,[11]3.6825,[12]3.7603,[13]3.8337,[14]3.9950,[15]4.1641,
SKIP  18 + [] - len:   1, best:( 11: 0.131)
[1]3.1978,[2]3.5349,[3]4.1047,[4]3.9754,[5]3.8337,[6]3.7311,[7]3.7724,[8]3.7527,[9]3.7287,[10]3.7147,[11]3.6663,[12]3.7741,[13]3.8495,[14]3.9927,[15]4.1569,
SKIP  19 + [] - len:   1, best:( 11: 0.131)
[1]3.1948,[2]3.4838,[3]4.0570,[4]3.9187,[5]3.8139,[6]3.6829,[7]3.7679,[8]3.7402,[9]3.7347,[10]3.7610,[11]3.7392,[12]3.8139,[13]3.8778,[14]4.0168,[15]4.1874,
SKIP  20 + [] - len:   1, best:( 11: 0.131)
[1]3.1625,[2]3.5059,[3]4.0758,[4]3.9259,[5]3.7693,[6]3.6638,[7]3.7221,[8]3.6978,[9]3.6730,[10]3.6671,[11]3.6289,[12]3.7029,[13]3.7703,[14]3.9231,[15]4.0915,
SKIP  21 + [] - len:   1, best:( 11: 0.131)
[1]3.1975,[2]3.5121,[3]4.0775,[4]3.8914,[5]3.7461,[6]3.6218,[7]3.6953,[8]3.6829,[9]3.6609,[10]3.6630,[11]3.6305,[12]3.7083,[13]3.7802,[14]3.9236,[15]4.0959,
SKIP  22 + [] - len:   1, best:( 11: 0.131)
[1]3.1655,[2]3.4968,[3]4.0460,[4]3.8810,[5]3.7495,[6]3.6256,[7]3.6870,[8]3.6651,[9]3.6548,[10]3.6133,[11]3.5716,[12]3.6518,[13]3.7180,[14]3.8668,[15]4.0377,
SKIP  23 + [] - len:   1, best:( 22: 0.121)
[1]3.1457,[2]3.5002,[3]4.0820,[4]3.8798,[5]3.7515,[6]3.6131,[7]3.6817,[8]3.6715,[9]3.6437,[10]3.6144,[11]3.5816,[12]3.6725,[13]3.7459,[14]3.8901,[15]4.0564,
SKIP  24 + [] - len:   1, best:( 22: 0.121)
[1]3.2222,[2]3.5191,[3]4.0650,[4]3.9489,[5]3.7815,[6]3.6498,[7]3.7204,[8]3.7094,[9]3.6970,[10]3.6869,[11]3.6351,[12]3.7169,[13]3.7856,[14]3.9305,[15]4.0962,
SKIP  25 + [] - len:   1, best:( 22: 0.121)
[1]3.1924,[2]3.5519,[3]4.1027,[4]3.9609,[5]3.7987,[6]3.6898,[7]3.7781,[8]3.7625,[9]3.7719,[10]3.7911,[11]3.7746,[12]3.8653,[13]3.9461,[14]4.0832,[15]4.2420,
SKIP  26 + [] - len:   1, best:( 22: 0.121)
[1]3.1752,[2]3.5110,[3]4.0673,[4]3.9187,[5]3.7651,[6]3.6421,[7]3.7244,[8]3.7204,[9]3.7155,[10]3.7027,[11]3.6597,[12]3.7316,[13]3.8038,[14]3.9447,[15]4.1109,
SKIP  27 + [] - len:   1, best:( 22: 0.121)
[1]3.2912,[2]3.5872,[3]4.1053,[4]3.9330,[5]3.7758,[6]3.6261,[7]3.6757,[8]3.6673,[9]3.6615,[10]3.6586,[11]3.6167,[12]3.7079,[13]3.7823,[14]3.9382,[15]4.1052,
SKIP  28 + [] - len:   1, best:( 22: 0.121)
[1]3.2757,[2]3.5882,[3]4.1326,[4]3.9924,[5]3.8422,[6]3.7077,[7]3.7647,[8]3.7254,[9]3.7197,[10]3.7163,[11]3.6596,[12]3.7510,[13]3.8130,[14]3.9593,[15]4.1296,
SKIP  29 + [] - len:   1, best:( 22: 0.121)
[1]3.2301,[2]3.5591,[3]4.1059,[4]3.8970,[5]3.7730,[6]3.6549,[7]3.7285,[8]3.7342,[9]3.7150,[10]3.7130,[11]3.6836,[12]3.7629,[13]3.8315,[14]3.9773,[15]4.1363,
SKIP  30 + [] - len:   1, best:( 22: 0.121)
[1]3.1733,[2]3.4947,[3]4.0589,[4]3.8553,[5]3.7397,[6]3.6276,[7]3.7061,[8]3.6945,[9]3.6906,[10]3.6731,[11]3.6382,[12]3.7288,[13]3.7943,[14]3.9450,[15]4.1038,
SKIP  31 + [] - len:   1, best:( 22: 0.121)
[1]3.1980,[2]3.4878,[3]4.0598,[4]3.9049,[5]3.7227,[6]3.5874,[7]3.6585,[8]3.6598,[9]3.6427,[10]3.6575,[11]3.6188,[12]3.7093,[13]3.7740,[14]3.9189,[15]4.0838,
SKIP  32 + [] - len:   1, best:( 22: 0.121)
[1]3.1967,[2]3.5264,[3]4.0799,[4]3.9222,[5]3.7641,[6]3.6085,[7]3.6826,[8]3.6749,[9]3.6569,[10]3.6395,[11]3.5974,[12]3.6886,[13]3.7533,[14]3.9090,[15]4.0698,
SKIP  33 + [] - len:   1, best:( 22: 0.121)
[1]3.2536,[2]3.5311,[3]4.0928,[4]3.9691,[5]3.7964,[6]3.6660,[7]3.7280,[8]3.7235,[9]3.7372,[10]3.7326,[11]3.6970,[12]3.7678,[13]3.8316,[14]3.9719,[15]4.1357,
SKIP  34 + [] - len:   1, best:( 22: 0.121)
[1]3.2113,[2]3.5517,[3]4.0945,[4]3.9117,[5]3.7731,[6]3.6422,[7]3.6993,[8]3.6980,[9]3.6880,[10]3.6853,[11]3.6658,[12]3.7479,[13]3.8082,[14]3.9541,[15]4.1212,
SKIP  35 + [] - len:   1, best:( 22: 0.121)
[1]3.1322,[2]3.4865,[3]4.0575,[4]3.8821,[5]3.7384,[6]3.6031,[7]3.6865,[8]3.6646,[9]3.6530,[10]3.6435,[11]3.6173,[12]3.6971,[13]3.7675,[14]3.9158,[15]4.0816,
SKIP  36 + [] - len:   1, best:( 22: 0.121)
[1]3.2517,[2]3.5650,[3]4.1361,[4]3.9662,[5]3.8003,[6]3.6861,[7]3.7610,[8]3.7675,[9]3.7539,[10]3.7559,[11]3.7227,[12]3.7867,[13]3.8533,[14]4.0028,[15]4.1713,
SKIP  37 + [] - len:   1, best:( 22: 0.121)
[1]3.2112,[2]3.5160,[3]4.0763,[4]3.9181,[5]3.7601,[6]3.6315,[7]3.7076,[8]3.6798,[9]3.6632,[10]3.6778,[11]3.6452,[12]3.7246,[13]3.7869,[14]3.9419,[15]4.1045,
SKIP  38 + [] - len:   1, best:( 22: 0.121)
[1]3.1831,[2]3.5295,[3]4.0800,[4]3.9805,[5]3.8514,[6]3.7333,[7]3.7721,[8]3.7462,[9]3.7211,[10]3.7399,[11]3.7050,[12]3.7951,[13]3.8602,[14]4.0110,[15]4.1692,
SKIP  39 + [] - len:   1, best:( 22: 0.121)
[1]3.2736,[2]3.5687,[3]4.1442,[4]4.0032,[5]3.8460,[6]3.7117,[7]3.7643,[8]3.7869,[9]3.8090,[10]3.8200,[11]3.7862,[12]3.8460,[13]3.9105,[14]4.0616,[15]4.2274,
SKIP  40 + [] - len:   1, best:( 22: 0.121)
[1]3.2705,[2]3.5562,[3]4.1205,[4]3.9892,[5]3.8775,[6]3.7712,[7]3.8344,[8]3.8085,[9]3.8116,[10]3.8267,[11]3.7961,[12]3.8867,[13]3.9643,[14]4.1132,[15]4.2774,
SKIP  41 + [] - len:   1, best:( 22: 0.121)
[1]3.2943,[2]3.5680,[3]4.1164,[4]4.0244,[5]3.9293,[6]3.8314,[7]3.8997,[8]3.8762,[9]3.8543,[10]3.8482,[11]3.8331,[12]3.9209,[13]4.0013,[14]4.1430,[15]4.3076,
SKIP  42 + [] - len:   1, best:( 22: 0.121)
[1]3.3232,[2]3.6431,[3]4.1754,[4]3.9957,[5]3.8752,[6]3.7469,[7]3.8056,[8]3.7962,[9]3.7857,[10]3.7800,[11]3.7583,[12]3.8538,[13]3.9093,[14]4.0552,[15]4.2101,
SKIP  43 + [] - len:   1, best:( 22: 0.121)
[1]3.2278,[2]3.5241,[3]4.0784,[4]3.9260,[5]3.7932,[6]3.6958,[7]3.7935,[8]3.7844,[9]3.7537,[10]3.7314,[11]3.7036,[12]3.7989,[13]3.8649,[14]4.0130,[15]4.1816,
SKIP  44 + [] - len:   1, best:( 22: 0.121)
[1]3.3428,[2]3.7901,[3]4.3273,[4]4.1790,[5]4.0263,[6]3.8892,[7]3.9691,[8]3.9468,[9]3.8985,[10]3.8947,[11]3.8583,[12]3.9356,[13]4.0101,[14]4.1620,[15]4.3310,
SKIP  45 + [] - len:   1, best:( 22: 0.121)
[1]3.2186,[2]3.5711,[3]4.1001,[4]3.9103,[5]3.7877,[6]3.6680,[7]3.7421,[8]3.7365,[9]3.7289,[10]3.7098,[11]3.6809,[12]3.7642,[13]3.8343,[14]3.9857,[15]4.1448,
SKIP  46 + [] - len:   1, best:( 22: 0.121)
[1]3.2406,[2]3.5263,[3]4.0969,[4]3.9633,[5]3.8338,[6]3.7297,[7]3.8096,[8]3.7945,[9]3.7748,[10]3.7657,[11]3.7258,[12]3.8105,[13]3.8830,[14]4.0395,[15]4.2010,
SKIP  47 + [] - len:   1, best:( 22: 0.121)
[1]3.3148,[2]3.6030,[3]4.1613,[4]4.0098,[5]3.9214,[6]3.8153,[7]3.8868,[8]3.8528,[9]3.8291,[10]3.8127,[11]3.7818,[12]3.8722,[13]3.9409,[14]4.0889,[15]4.2498,
SKIP  48 + [] - len:   1, best:( 22: 0.121)
[1]3.2179,[2]3.5309,[3]4.0858,[4]3.9678,[5]3.8217,[6]3.6870,[7]3.7719,[8]3.7560,[9]3.7405,[10]3.7362,[11]3.7052,[12]3.7861,[13]3.8484,[14]4.0040,[15]4.1594,
SKIP  49 + [] - len:   1, best:( 22: 0.121)
[1]3.2426,[2]3.5363,[3]4.0959,[4]3.9581,[5]3.7886,[6]3.6679,[7]3.7154,[8]3.7027,[9]3.6906,[10]3.6726,[11]3.6351,[12]3.7066,[13]3.7814,[14]3.9314,[15]4.0946,
SKIP  50 + [] - len:   1, best:( 22: 0.121)
[1]3.1341,[2]3.4693,[3]4.0553,[4]3.8958,[5]3.7516,[6]3.6396,[7]3.7118,[8]3.7052,[9]3.6880,[10]3.6720,[11]3.6417,[12]3.7155,[13]3.7699,[14]3.9079,[15]4.0717,
SKIP  51 + [] - len:   1, best:( 22: 0.121)
[1]3.2456,[2]3.5711,[3]4.1263,[4]3.9340,[5]3.8055,[6]3.6947,[7]3.7477,[8]3.7404,[9]3.7228,[10]3.7121,[11]3.6679,[12]3.7520,[13]3.8242,[14]3.9720,[15]4.1337,
SKIP  52 + [] - len:   1, best:( 22: 0.121)
[1]3.2062,[2]3.5198,[3]4.0777,[4]3.9540,[5]3.8162,[6]3.6623,[7]3.7235,[8]3.7008,[9]3.7091,[10]3.7026,[11]3.6790,[12]3.7677,[13]3.8367,[14]3.9864,[15]4.1480,
SKIP  53 + [] - len:   1, best:( 22: 0.121)
[1]3.2149,[2]3.5343,[3]4.0771,[4]3.9298,[5]3.7743,[6]3.6587,[7]3.7255,[8]3.7039,[9]3.6956,[10]3.6921,[11]3.6599,[12]3.7588,[13]3.8293,[14]3.9760,[15]4.1392,
SKIP  54 + [] - len:   1, best:( 22: 0.121)
[1]3.1973,[2]3.5587,[3]4.0960,[4]3.9044,[5]3.7485,[6]3.6502,[7]3.6978,[8]3.6777,[9]3.6647,[10]3.6619,[11]3.6385,[12]3.7312,[13]3.7994,[14]3.9480,[15]4.1094,
SKIP  55 + [] - len:   1, best:( 22: 0.121)
[1]3.1872,[2]3.5284,[3]4.1003,[4]3.9430,[5]3.8131,[6]3.6959,[7]3.7554,[8]3.7458,[9]3.7285,[10]3.7072,[11]3.6837,[12]3.7709,[13]3.8418,[14]3.9878,[15]4.1441,
SKIP  56 + [] - len:   1, best:( 22: 0.121)
[1]3.2613,[2]3.5337,[3]4.0836,[4]3.9158,[5]3.7918,[6]3.6560,[7]3.7123,[8]3.6919,[9]3.6841,[10]3.6752,[11]3.6312,[12]3.7057,[13]3.7727,[14]3.9128,[15]4.0738,
SKIP  57 + [] - len:   1, best:( 22: 0.121)
[1]3.2052,[2]3.5078,[3]4.0640,[4]3.9025,[5]3.7653,[6]3.6413,[7]3.7001,[8]3.6678,[9]3.6470,[10]3.6380,[11]3.6197,[12]3.7172,[13]3.7864,[14]3.9275,[15]4.0937,
SKIP  58 + [] - len:   1, best:( 22: 0.121)
[1]3.1970,[2]3.5160,[3]4.0949,[4]3.9378,[5]3.7683,[6]3.6519,[7]3.7135,[8]3.7201,[9]3.7225,[10]3.7026,[11]3.6659,[12]3.7576,[13]3.8226,[14]3.9711,[15]4.1282,
SKIP  59 + [] - len:   1, best:( 22: 0.121)
[1]3.2180,[2]3.5418,[3]4.1025,[4]3.9615,[5]3.7867,[6]3.6612,[7]3.7195,[8]3.6980,[9]3.6762,[10]3.6712,[11]3.6250,[12]3.7136,[13]3.7913,[14]3.9412,[15]4.1020,
SKIP  60 + [] - len:   1, best:( 22: 0.121)
[1]3.1952,[2]3.5274,[3]4.0837,[4]3.9142,[5]3.7583,[6]3.6336,[7]3.6852,[8]3.6766,[9]3.6563,[10]3.6469,[11]3.6073,[12]3.6936,[13]3.7639,[14]3.9149,[15]4.0779,
SKIP  61 + [] - len:   1, best:( 22: 0.121)
[1]3.2491,[2]3.5274,[3]4.0690,[4]3.9159,[5]3.8063,[6]3.6555,[7]3.7129,[8]3.7210,[9]3.6997,[10]3.6779,[11]3.6524,[12]3.7388,[13]3.8075,[14]3.9504,[15]4.1165,
SKIP  62 + [] - len:   1, best:( 22: 0.121)
[1]3.1579,[2]3.4799,[3]4.0225,[4]3.8885,[5]3.7300,[6]3.6061,[7]3.6677,[8]3.6520,[9]3.6396,[10]3.6252,[11]3.5937,[12]3.6767,[13]3.7540,[14]3.9015,[15]4.0633,
SKIP  63 + [] - len:   1, best:( 22: 0.121)
[1]3.2163,[2]3.5225,[3]4.0776,[4]3.8820,[5]3.7108,[6]3.5829,[7]3.6613,[8]3.6349,[9]3.6466,[10]3.6417,[11]3.5971,[12]3.6723,[13]3.7540,[14]3.9021,[15]4.0652,
SKIP  64 + [] - len:   1, best:( 22: 0.121)
[1]3.1973,[2]3.5476,[3]4.0995,[4]3.9270,[5]3.7872,[6]3.6425,[7]3.7037,[8]3.7014,[9]3.6684,[10]3.6664,[11]3.6431,[12]3.7297,[13]3.7946,[14]3.9371,[15]4.0949,
SKIP  65 + [] - len:   1, best:( 22: 0.121)
[1]3.1740,[2]3.4874,[3]4.0337,[4]3.8787,[5]3.7483,[6]3.6361,[7]3.7028,[8]3.6861,[9]3.6617,[10]3.6629,[11]3.6339,[12]3.7143,[13]3.7756,[14]3.9251,[15]4.0855,
SKIP  66 + [] - len:   1, best:( 22: 0.121)
[1]3.1995,[2]3.5128,[3]4.0618,[4]3.9108,[5]3.7699,[6]3.6292,[7]3.6938,[8]3.6863,[9]3.6730,[10]3.6648,[11]3.6315,[12]3.7134,[13]3.7830,[14]3.9278,[15]4.0875,
SKIP  67 + [] - len:   1, best:( 22: 0.121)
[1]3.2755,[2]3.5865,[3]4.1190,[4]3.9671,[5]3.8602,[6]3.7458,[7]3.8170,[8]3.7899,[9]3.7924,[10]3.7937,[11]3.7746,[12]3.8573,[13]3.9164,[14]4.0690,[15]4.2266,
SKIP  68 + [] - len:   1, best:( 22: 0.121)
[1]3.2708,[2]3.5640,[3]4.1202,[4]3.9274,[5]3.7804,[6]3.6487,[7]3.7068,[8]3.6966,[9]3.6906,[10]3.6695,[11]3.6362,[12]3.7206,[13]3.7870,[14]3.9338,[15]4.0983,
SKIP  69 + [] - len:   1, best:( 22: 0.121)
[1]3.1875,[2]3.5131,[3]4.0680,[4]3.8650,[5]3.7459,[6]3.6174,[7]3.6742,[8]3.6700,[9]3.6629,[10]3.6288,[11]3.6024,[12]3.6916,[13]3.7596,[14]3.9011,[15]4.0633,
SKIP  70 + [] - len:   1, best:( 22: 0.121)
[1]3.2348,[2]3.5475,[3]4.1038,[4]3.9617,[5]3.8128,[6]3.6907,[7]3.7293,[8]3.7149,[9]3.7219,[10]3.7052,[11]3.6741,[12]3.7531,[13]3.8239,[14]3.9754,[15]4.1342,
SKIP  71 + [] - len:   1, best:( 22: 0.121)
[1]3.2327,[2]3.5579,[3]4.1098,[4]3.9957,[5]3.8739,[6]3.7195,[7]3.7689,[8]3.7507,[9]3.7302,[10]3.7139,[11]3.6868,[12]3.7737,[13]3.8296,[14]3.9755,[15]4.1440,
SKIP  72 + [] - len:   1, best:( 22: 0.121)
[1]3.1949,[2]3.5480,[3]4.0906,[4]3.9777,[5]3.8409,[6]3.7239,[7]3.7759,[8]3.7398,[9]3.7420,[10]3.7382,[11]3.6918,[12]3.7702,[13]3.8413,[14]3.9878,[15]4.1509,
SKIP  73 + [] - len:   1, best:( 22: 0.121)
[1]3.2464,[2]3.5382,[3]4.0750,[4]3.9336,[5]3.7926,[6]3.6756,[7]3.7241,[8]3.6919,[9]3.6708,[10]3.6624,[11]3.6382,[12]3.7214,[13]3.7797,[14]3.9274,[15]4.0958,
SKIP  74 + [] - len:   1, best:( 22: 0.121)
[1]3.1928,[2]3.5414,[3]4.0931,[4]3.9412,[5]3.8102,[6]3.6901,[7]3.7456,[8]3.7342,[9]3.7218,[10]3.7298,[11]3.7117,[12]3.7937,[13]3.8628,[14]4.0171,[15]4.1779,
SKIP  75 + [] - len:   1, best:( 22: 0.121)
[1]3.2646,[2]3.5975,[3]4.1450,[4]4.0090,[5]3.9237,[6]3.7960,[7]3.8736,[8]3.8383,[9]3.8235,[10]3.8170,[11]3.8064,[12]3.8891,[13]3.9525,[14]4.0990,[15]4.2665,
SKIP  76 + [] - len:   1, best:( 22: 0.121)
[1]3.2759,[2]3.6337,[3]4.1884,[4]4.0413,[5]3.9266,[6]3.7914,[7]3.8767,[8]3.8561,[9]3.8690,[10]3.8871,[11]3.8750,[12]3.9497,[13]4.0068,[14]4.1578,[15]4.3217,
SKIP  77 + [] - len:   1, best:( 22: 0.121)
[1]3.3876,[2]3.7384,[3]4.3271,[4]4.1935,[5]4.1491,[6]4.0635,[7]4.1219,[8]4.0769,[9]4.0750,[10]4.0830,[11]4.0751,[12]4.1693,[13]4.2297,[14]4.3762,[15]4.5502,
SKIP  78 + [] - len:   1, best:( 22: 0.121)
[1]3.4148,[2]3.7716,[3]4.3314,[4]4.2104,[5]4.1185,[6]4.0263,[7]4.0859,[8]4.0983,[9]4.1037,[10]4.1076,[11]4.0947,[12]4.1823,[13]4.2374,[14]4.3749,[15]4.5477,
SKIP  79 + [] - len:   1, best:( 22: 0.121)
[1]3.7952,[2]4.0147,[3]4.9901,[4]4.7875,[5]4.5759,[6]4.4439,[7]4.4801,[8]4.4309,[9]4.4891,[10]4.4547,[11]4.4115,[12]4.5128,[13]4.6090,[14]4.7766,[15]4.9920,

ADD SKIP  22 - ppl vs ref 0.1214
SKIP   0 + [22,] - len:   2, best:( -1: 0.000)
[1]1480.9020,[2]3293.8020,[3]740.0384,[4]925.1013,[5]889.2303,[6]569.9517,[7]417.2760,[8]420.9340,[9]498.4346,[10]385.9437,[11]391.5106,[12]466.3840,[13]501.4208,[14]448.5266,[15]488.3674,
SKIP   1 + [22,] - len:   2, best:(  0: 484.451)
[1]3.8510,[2]4.3297,[3]4.9972,[4]4.9870,[5]4.8028,[6]4.6596,[7]4.6867,[8]4.7214,[9]4.7739,[10]4.8965,[11]4.9578,[12]5.0297,[13]5.1141,[14]5.2957,[15]5.4806,
SKIP   2 + [22,] - len:   2, best:(  1: 1.564)
[1]952.4069,[2]1097.5228,[3]1092.7360,[4]1097.5092,[5]973.0649,[6]950.8042,[7]952.7026,[8]961.6376,[9]975.5414,[10]999.9030,[11]1007.0205,[12]1021.5660,[13]1031.8743,[14]1061.7915,[15]1096.3306,
SKIP   3 + [22,] - len:   2, best:(  1: 1.564)
[1]3.2197,[2]3.6191,[3]4.1684,[4]4.1280,[5]4.0132,[6]3.8693,[7]3.9440,[8]3.9606,[9]4.0228,[10]3.9958,[11]3.9769,[12]4.0611,[13]4.1594,[14]4.3026,[15]4.4817,
SKIP   4 + [22,] - len:   2, best:(  3: 0.565)
[1]3.1893,[2]3.5608,[3]4.0940,[4]3.9734,[5]3.8680,[6]3.7471,[7]3.8261,[8]3.8385,[9]3.8761,[10]3.8384,[11]3.8265,[12]3.9045,[13]3.9803,[14]4.1222,[15]4.2922,
SKIP   5 + [22,] - len:   2, best:(  4: 0.376)
[1]3.1881,[2]3.5937,[3]4.1239,[4]3.9709,[5]3.8468,[6]3.7350,[7]3.7867,[8]3.7824,[9]3.8126,[10]3.8055,[11]3.7994,[12]3.8766,[13]3.9432,[14]4.0789,[15]4.2536,
SKIP   6 + [22,] - len:   2, best:(  5: 0.337)
[1]3.2511,[2]3.5858,[3]4.1325,[4]3.9852,[5]3.8713,[6]3.7701,[7]3.8234,[8]3.8021,[9]3.8135,[10]3.7882,[11]3.7686,[12]3.8403,[13]3.9178,[14]4.0555,[15]4.2435,
SKIP   7 + [22,] - len:   2, best:(  6: 0.327)
[1]3.2462,[2]3.5560,[3]4.1085,[4]4.0280,[5]3.9046,[6]3.7891,[7]3.8459,[8]3.8108,[9]3.8329,[10]3.7904,[11]3.7699,[12]3.8485,[13]3.9256,[14]4.0643,[15]4.2460,
SKIP   8 + [22,] - len:   2, best:(  6: 0.327)
[1]8.6456,[2]9.9541,[3]11.2313,[4]14.8561,[5]17.1303,[6]16.7375,[7]17.8476,[8]17.8939,[9]19.0741,[10]20.2857,[11]21.8564,[12]22.4361,[13]23.2286,[14]24.4904,[15]25.8727,
SKIP   9 + [22,] - len:   2, best:(  6: 0.327)
[1]3.1526,[2]3.5250,[3]4.1026,[4]4.0207,[5]3.8804,[6]3.7768,[7]3.8542,[8]3.8356,[9]3.8708,[10]3.8499,[11]3.8274,[12]3.9106,[13]3.9782,[14]4.1117,[15]4.2813,
SKIP  10 + [22,] - len:   2, best:(  6: 0.327)
[1]3.1728,[2]3.5252,[3]4.0958,[4]3.8949,[5]3.8110,[6]3.7089,[7]3.7761,[8]3.7557,[9]3.7790,[10]3.7564,[11]3.7347,[12]3.8058,[13]3.8721,[14]4.0200,[15]4.1951,
SKIP  11 + [22,] - len:   2, best:( 10: 0.279)
[1]3.1575,[2]3.5095,[3]4.0906,[4]3.9568,[5]3.8357,[6]3.7202,[7]3.7847,[8]3.7727,[9]3.7668,[10]3.7507,[11]3.7388,[12]3.8028,[13]3.8601,[14]4.0063,[15]4.1812,
SKIP  12 + [22,] - len:   2, best:( 11: 0.265)
[1]3.3138,[2]3.5882,[3]4.1570,[4]4.0227,[5]3.9132,[6]3.7781,[7]3.8219,[8]3.8210,[9]3.8381,[10]3.8250,[11]3.8157,[12]3.8844,[13]3.9521,[14]4.0865,[15]4.2637,
SKIP  13 + [22,] - len:   2, best:( 11: 0.265)
[1]3.1876,[2]3.5307,[3]4.0828,[4]3.9656,[5]3.8528,[6]3.7385,[7]3.8022,[8]3.7800,[9]3.7917,[10]3.7569,[11]3.7265,[12]3.8199,[13]3.8874,[14]4.0418,[15]4.2192,
SKIP  14 + [22,] - len:   2, best:( 11: 0.265)
[1]3.2077,[2]3.5364,[3]4.1047,[4]3.9637,[5]3.8578,[6]3.7289,[7]3.7949,[8]3.7767,[9]3.7694,[10]3.7544,[11]3.7092,[12]3.7890,[13]3.8479,[14]3.9924,[15]4.1683,
SKIP  15 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2304,[2]3.5509,[3]4.1204,[4]3.9876,[5]3.8412,[6]3.7365,[7]3.7953,[8]3.7856,[9]3.8086,[10]3.7748,[11]3.7382,[12]3.8227,[13]3.8869,[14]4.0298,[15]4.1991,
SKIP  16 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2027,[2]3.5299,[3]4.0784,[4]3.9745,[5]3.8780,[6]3.7770,[7]3.8306,[8]3.8155,[9]3.8114,[10]3.7781,[11]3.7706,[12]3.8458,[13]3.9094,[14]4.0547,[15]4.2234,
SKIP  17 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2158,[2]3.5330,[3]4.0823,[4]3.9672,[5]3.9023,[6]3.7970,[7]3.8806,[8]3.8707,[9]3.8753,[10]3.8497,[11]3.8227,[12]3.8980,[13]3.9608,[14]4.1129,[15]4.2898,
SKIP  18 + [22,] - len:   2, best:( 14: 0.252)
[1]3.1903,[2]3.5648,[3]4.1339,[4]4.0492,[5]3.9489,[6]3.8504,[7]3.8995,[8]3.8832,[9]3.8880,[10]3.8707,[11]3.8140,[12]3.9167,[13]3.9848,[14]4.1278,[15]4.2988,
SKIP  19 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2050,[2]3.5135,[3]4.0818,[4]4.0058,[5]3.9387,[6]3.8157,[7]3.9120,[8]3.8722,[9]3.8879,[10]3.8887,[11]3.8659,[12]3.9441,[13]4.0041,[14]4.1338,[15]4.3124,
SKIP  20 + [22,] - len:   2, best:( 14: 0.252)
[1]3.1757,[2]3.5390,[3]4.0880,[4]3.9992,[5]3.8831,[6]3.7828,[7]3.8510,[8]3.8357,[9]3.8391,[10]3.8195,[11]3.7826,[12]3.8557,[13]3.9112,[14]4.0611,[15]4.2400,
SKIP  21 + [22,] - len:   2, best:( 14: 0.252)
[1]3.1858,[2]3.5367,[3]4.0893,[4]3.9376,[5]3.8262,[6]3.7069,[7]3.7911,[8]3.7866,[9]3.7879,[10]3.7806,[11]3.7418,[12]3.8173,[13]3.8795,[14]4.0260,[15]4.2044,
SKIP  23 + [22,] - len:   2, best:( 14: 0.252)
[1]3.1763,[2]3.5433,[3]4.1067,[4]3.9703,[5]3.8700,[6]3.7497,[7]3.8144,[8]3.8033,[9]3.7882,[10]3.7532,[11]3.7281,[12]3.8168,[13]3.8829,[14]4.0209,[15]4.1960,
SKIP  24 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2320,[2]3.5426,[3]4.0766,[4]3.9808,[5]3.8600,[6]3.7343,[7]3.8156,[8]3.8003,[9]3.7974,[10]3.7779,[11]3.7447,[12]3.8222,[13]3.8918,[14]4.0364,[15]4.2108,
SKIP  25 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2078,[2]3.5898,[3]4.1284,[4]4.0335,[5]3.9070,[6]3.8180,[7]3.9064,[8]3.8899,[9]3.9199,[10]3.9197,[11]3.8949,[12]3.9774,[13]4.0567,[14]4.1935,[15]4.3592,
SKIP  26 + [22,] - len:   2, best:( 14: 0.252)
[1]3.1791,[2]3.5328,[3]4.0839,[4]3.9891,[5]3.8720,[6]3.7634,[7]3.8482,[8]3.8458,[9]3.8687,[10]3.8464,[11]3.8107,[12]3.8766,[13]3.9432,[14]4.0803,[15]4.2540,
SKIP  27 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2580,[2]3.6062,[3]4.1330,[4]3.9888,[5]3.8799,[6]3.7465,[7]3.7979,[8]3.7869,[9]3.7959,[10]3.7823,[11]3.7496,[12]3.8343,[13]3.9012,[14]4.0541,[15]4.2333,
SKIP  28 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2562,[2]3.5897,[3]4.1394,[4]4.0230,[5]3.9188,[6]3.8023,[7]3.8716,[8]3.8288,[9]3.8484,[10]3.8371,[11]3.8010,[12]3.8825,[13]3.9412,[14]4.0813,[15]4.2588,
SKIP  29 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2595,[2]3.5938,[3]4.1501,[4]4.0077,[5]3.9132,[6]3.7923,[7]3.8709,[8]3.8596,[9]3.8530,[10]3.8355,[11]3.8025,[12]3.8810,[13]3.9483,[14]4.0905,[15]4.2610,
SKIP  30 + [22,] - len:   2, best:( 14: 0.252)
[1]3.1904,[2]3.5347,[3]4.0997,[4]3.9243,[5]3.8447,[6]3.7453,[7]3.8284,[8]3.8200,[9]3.8314,[10]3.8085,[11]3.7762,[12]3.8574,[13]3.9133,[14]4.0606,[15]4.2266,
SKIP  31 + [22,] - len:   2, best:( 14: 0.252)
[1]3.1871,[2]3.5162,[3]4.0900,[4]3.9998,[5]3.8512,[6]3.7215,[7]3.7917,[8]3.7834,[9]3.7824,[10]3.7822,[11]3.7499,[12]3.8384,[13]3.8986,[14]4.0427,[15]4.2183,
SKIP  32 + [22,] - len:   2, best:( 14: 0.252)
[1]3.1979,[2]3.5677,[3]4.1174,[4]4.0128,[5]3.8836,[6]3.7308,[7]3.8057,[8]3.8000,[9]3.7957,[10]3.7646,[11]3.7230,[12]3.8178,[13]3.8763,[14]4.0307,[15]4.2004,
SKIP  33 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2551,[2]3.5680,[3]4.1347,[4]4.0560,[5]3.9257,[6]3.8131,[7]3.8767,[8]3.8654,[9]3.8825,[10]3.8731,[11]3.8466,[12]3.9128,[13]3.9669,[14]4.1040,[15]4.2770,
SKIP  34 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2078,[2]3.5771,[3]4.1311,[4]3.9881,[5]3.8915,[6]3.7635,[7]3.8235,[8]3.8192,[9]3.8210,[10]3.8024,[11]3.7882,[12]3.8651,[13]3.9183,[14]4.0605,[15]4.2395,
SKIP  35 + [22,] - len:   2, best:( 14: 0.252)
[1]3.1401,[2]3.5086,[3]4.0857,[4]3.9648,[5]3.8505,[6]3.7225,[7]3.8143,[8]3.7903,[9]3.7920,[10]3.7728,[11]3.7428,[12]3.8201,[13]3.8823,[14]4.0271,[15]4.2003,
SKIP  36 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2352,[2]3.5817,[3]4.1605,[4]4.0275,[5]3.9114,[6]3.8025,[7]3.8736,[8]3.8732,[9]3.8812,[10]3.8657,[11]3.8316,[12]3.9008,[13]3.9625,[14]4.1141,[15]4.2929,
SKIP  37 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2074,[2]3.5518,[3]4.0989,[4]3.9983,[5]3.8694,[6]3.7561,[7]3.8374,[8]3.8149,[9]3.8202,[10]3.8133,[11]3.7879,[12]3.8702,[13]3.9213,[14]4.0744,[15]4.2471,
SKIP  38 + [22,] - len:   2, best:( 14: 0.252)
[1]3.1766,[2]3.5453,[3]4.0935,[4]4.0726,[5]3.9878,[6]3.8737,[7]3.9096,[8]3.8790,[9]3.8519,[10]3.8573,[11]3.8281,[12]3.9183,[13]3.9738,[14]4.1220,[15]4.2917,
SKIP  39 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2778,[2]3.5990,[3]4.1662,[4]4.0674,[5]3.9595,[6]3.8379,[7]3.8940,[8]3.9195,[9]3.9422,[10]3.9412,[11]3.9161,[12]3.9780,[13]4.0305,[14]4.1769,[15]4.3537,
SKIP  40 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2723,[2]3.5811,[3]4.1395,[4]4.0783,[5]3.9881,[6]3.8884,[7]3.9609,[8]3.9382,[9]3.9532,[10]3.9538,[11]3.9349,[12]4.0154,[13]4.0827,[14]4.2293,[15]4.4032,
SKIP  41 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2965,[2]3.5954,[3]4.1328,[4]4.1022,[5]4.0350,[6]3.9459,[7]4.0195,[8]3.9943,[9]3.9881,[10]3.9840,[11]3.9763,[12]4.0618,[13]4.1315,[14]4.2727,[15]4.4464,
SKIP  42 + [22,] - len:   2, best:( 14: 0.252)
[1]3.3171,[2]3.6603,[3]4.1951,[4]4.0899,[5]3.9906,[6]3.8753,[7]3.9378,[8]3.9288,[9]3.9313,[10]3.9030,[11]3.8931,[12]3.9914,[13]4.0353,[14]4.1791,[15]4.3432,
SKIP  43 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2639,[2]3.5689,[3]4.1137,[4]4.0076,[5]3.9229,[6]3.8347,[7]3.9294,[8]3.9112,[9]3.9022,[10]3.8685,[11]3.8445,[12]3.9356,[13]3.9929,[14]4.1429,[15]4.3216,
SKIP  44 + [22,] - len:   2, best:( 14: 0.252)
[1]3.3433,[2]3.7981,[3]4.3539,[4]4.2873,[5]4.1575,[6]4.0234,[7]4.1042,[8]4.0782,[9]4.0504,[10]4.0378,[11]4.0053,[12]4.0782,[13]4.1406,[14]4.2934,[15]4.4730,
SKIP  45 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2420,[2]3.6017,[3]4.1280,[4]3.9868,[5]3.9008,[6]3.7896,[7]3.8659,[8]3.8575,[9]3.8631,[10]3.8359,[11]3.8036,[12]3.8850,[13]3.9492,[14]4.0964,[15]4.2661,
SKIP  46 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2399,[2]3.5432,[3]4.1141,[4]4.0484,[5]3.9432,[6]3.8342,[7]3.9215,[8]3.9041,[9]3.9077,[10]3.8877,[11]3.8529,[12]3.9349,[13]3.9994,[14]4.1529,[15]4.3238,
SKIP  47 + [22,] - len:   2, best:( 14: 0.252)
[1]3.3097,[2]3.6171,[3]4.1730,[4]4.0996,[5]4.0436,[6]3.9353,[7]4.0186,[8]3.9821,[9]3.9680,[10]3.9403,[11]3.9144,[12]4.0027,[13]4.0623,[14]4.2097,[15]4.3816,
SKIP  48 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2152,[2]3.5518,[3]4.1090,[4]4.0345,[5]3.9299,[6]3.8063,[7]3.9026,[8]3.8837,[9]3.8907,[10]3.8670,[11]3.8383,[12]3.9100,[13]3.9655,[14]4.1194,[15]4.2843,
SKIP  49 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2451,[2]3.5612,[3]4.1103,[4]4.0178,[5]3.8920,[6]3.7793,[7]3.8314,[8]3.8227,[9]3.8249,[10]3.7953,[11]3.7630,[12]3.8261,[13]3.8960,[14]4.0452,[15]4.2181,
SKIP  50 + [22,] - len:   2, best:( 14: 0.252)
[1]3.1353,[2]3.4888,[3]4.0791,[4]3.9508,[5]3.8539,[6]3.7477,[7]3.8197,[8]3.8135,[9]3.8143,[10]3.7755,[11]3.7508,[12]3.8225,[13]3.8727,[14]4.0103,[15]4.1842,
SKIP  51 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2467,[2]3.5945,[3]4.1450,[4]3.9836,[5]3.8971,[6]3.7904,[7]3.8525,[8]3.8463,[9]3.8469,[10]3.8204,[11]3.7795,[12]3.8615,[13]3.9287,[14]4.0764,[15]4.2484,
SKIP  52 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2216,[2]3.5477,[3]4.0989,[4]4.0256,[5]3.9183,[6]3.7839,[7]3.8495,[8]3.8240,[9]3.8466,[10]3.8276,[11]3.8028,[12]3.8860,[13]3.9461,[14]4.0937,[15]4.2681,
SKIP  53 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2215,[2]3.5667,[3]4.1027,[4]4.0046,[5]3.8729,[6]3.7596,[7]3.8329,[8]3.8149,[9]3.8289,[10]3.8110,[11]3.7839,[12]3.8737,[13]3.9343,[14]4.0814,[15]4.2539,
SKIP  54 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2009,[2]3.5851,[3]4.1182,[4]3.9907,[5]3.8742,[6]3.7745,[7]3.8355,[8]3.8183,[9]3.8161,[10]3.7917,[11]3.7677,[12]3.8544,[13]3.9136,[14]4.0616,[15]4.2335,
SKIP  55 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2000,[2]3.5515,[3]4.1196,[4]4.0100,[5]3.9199,[6]3.8088,[7]3.8697,[8]3.8603,[9]3.8582,[10]3.8160,[11]3.7938,[12]3.8747,[13]3.9388,[14]4.0853,[15]4.2503,
SKIP  56 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2680,[2]3.5564,[3]4.1034,[4]3.9997,[5]3.9034,[6]3.7784,[7]3.8388,[8]3.8117,[9]3.8199,[10]3.7896,[11]3.7452,[12]3.8185,[13]3.8793,[14]4.0188,[15]4.1910,
SKIP  57 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2144,[2]3.5330,[3]4.0866,[4]3.9875,[5]3.8774,[6]3.7657,[7]3.8361,[8]3.8065,[9]3.8024,[10]3.7721,[11]3.7573,[12]3.8505,[13]3.9118,[14]4.0515,[15]4.2291,
SKIP  58 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2058,[2]3.5378,[3]4.1160,[4]4.0261,[5]3.8845,[6]3.7781,[7]3.8372,[8]3.8382,[9]3.8475,[10]3.8128,[11]3.7764,[12]3.8598,[13]3.9193,[14]4.0698,[15]4.2380,
SKIP  59 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2181,[2]3.5610,[3]4.1194,[4]4.0249,[5]3.8981,[6]3.7795,[7]3.8452,[8]3.8274,[9]3.8164,[10]3.8001,[11]3.7569,[12]3.8366,[13]3.9017,[14]4.0494,[15]4.2200,
SKIP  60 + [22,] - len:   2, best:( 14: 0.252)
[1]3.1969,[2]3.5500,[3]4.1088,[4]4.0084,[5]3.8872,[6]3.7636,[7]3.8217,[8]3.8116,[9]3.8094,[10]3.7749,[11]3.7378,[12]3.8227,[13]3.8848,[14]4.0321,[15]4.2055,
SKIP  61 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2482,[2]3.5475,[3]4.0873,[4]3.9997,[5]3.9157,[6]3.7786,[7]3.8422,[8]3.8463,[9]3.8497,[10]3.8065,[11]3.7746,[12]3.8574,[13]3.9208,[14]4.0634,[15]4.2390,
SKIP  62 + [22,] - len:   2, best:( 14: 0.252)
[1]3.1571,[2]3.5029,[3]4.0465,[4]3.9692,[5]3.8492,[6]3.7291,[7]3.7982,[8]3.7830,[9]3.7886,[10]3.7546,[11]3.7257,[12]3.7995,[13]3.8676,[14]4.0135,[15]4.1858,
SKIP  63 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2221,[2]3.5464,[3]4.1050,[4]3.9534,[5]3.8205,[6]3.6956,[7]3.7840,[8]3.7584,[9]3.7920,[10]3.7726,[11]3.7283,[12]3.7979,[13]3.8748,[14]4.0235,[15]4.1958,
SKIP  64 + [22,] - len:   2, best:( 14: 0.252)
[1]3.1951,[2]3.5652,[3]4.1130,[4]3.9920,[5]3.8997,[6]3.7608,[7]3.8280,[8]3.8263,[9]3.8155,[10]3.7959,[11]3.7719,[12]3.8539,[13]3.9112,[14]4.0511,[15]4.2205,
SKIP  65 + [22,] - len:   2, best:( 14: 0.252)
[1]3.1864,[2]3.5155,[3]4.0612,[4]3.9696,[5]3.8684,[6]3.7595,[7]3.8337,[8]3.8138,[9]3.8111,[10]3.7919,[11]3.7671,[12]3.8403,[13]3.8944,[14]4.0411,[15]4.2117,
SKIP  66 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2025,[2]3.5321,[3]4.0843,[4]3.9925,[5]3.8874,[6]3.7505,[7]3.8147,[8]3.8102,[9]3.8064,[10]3.7792,[11]3.7439,[12]3.8269,[13]3.8860,[14]4.0309,[15]4.2002,
SKIP  67 + [22,] - len:   2, best:( 14: 0.252)
[1]3.3074,[2]3.6179,[3]4.1491,[4]4.0207,[5]3.9601,[6]3.8549,[7]3.9316,[8]3.9039,[9]3.9180,[10]3.9037,[11]3.8837,[12]3.9740,[13]4.0235,[14]4.1748,[15]4.3427,
SKIP  68 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2985,[2]3.5983,[3]4.1446,[4]3.9940,[5]3.8823,[6]3.7603,[7]3.8273,[8]3.8191,[9]3.8264,[10]3.7914,[11]3.7566,[12]3.8308,[13]3.8911,[14]4.0370,[15]4.2109,
SKIP  69 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2054,[2]3.5443,[3]4.0932,[4]3.9454,[5]3.8631,[6]3.7421,[7]3.8088,[8]3.8008,[9]3.8045,[10]3.7645,[11]3.7326,[12]3.8193,[13]3.8802,[14]4.0210,[15]4.1936,
SKIP  70 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2437,[2]3.5694,[3]4.1228,[4]4.0106,[5]3.9003,[6]3.7826,[7]3.8333,[8]3.8172,[9]3.8405,[10]3.8098,[11]3.7859,[12]3.8653,[13]3.9294,[14]4.0812,[15]4.2508,
SKIP  71 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2321,[2]3.5732,[3]4.1294,[4]4.0595,[5]3.9770,[6]3.8377,[7]3.8897,[8]3.8679,[9]3.8570,[10]3.8283,[11]3.7998,[12]3.8869,[13]3.9353,[14]4.0812,[15]4.2599,
SKIP  72 + [22,] - len:   2, best:( 14: 0.252)
[1]3.1984,[2]3.5712,[3]4.1167,[4]4.0719,[5]3.9758,[6]3.8630,[7]3.9260,[8]3.8881,[9]3.8976,[10]3.8740,[11]3.8295,[12]3.9062,[13]3.9667,[14]4.1113,[15]4.2838,
SKIP  73 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2374,[2]3.5515,[3]4.0886,[4]4.0203,[5]3.9114,[6]3.7939,[7]3.8506,[8]3.8107,[9]3.8084,[10]3.7864,[11]3.7609,[12]3.8399,[13]3.8906,[14]4.0377,[15]4.2141,
SKIP  74 + [22,] - len:   2, best:( 14: 0.252)
[1]3.1988,[2]3.5663,[3]4.1139,[4]4.0245,[5]3.9250,[6]3.8103,[7]3.8702,[8]3.8502,[9]3.8588,[10]3.8421,[11]3.8265,[12]3.9109,[13]3.9693,[14]4.1241,[15]4.2964,
SKIP  75 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2871,[2]3.6302,[3]4.1843,[4]4.0973,[5]4.0463,[6]3.9318,[7]4.0103,[8]3.9717,[9]3.9745,[10]3.9508,[11]3.9399,[12]4.0141,[13]4.0716,[14]4.2190,[15]4.3970,
SKIP  76 + [22,] - len:   2, best:( 14: 0.252)
[1]3.2827,[2]3.6530,[3]4.2071,[4]4.1064,[5]4.0351,[6]3.9079,[7]4.0010,[8]3.9760,[9]4.0103,[10]4.0084,[11]3.9969,[12]4.0718,[13]4.1230,[14]4.2727,[15]4.4451,
SKIP  77 + [22,] - len:   2, best:( 14: 0.252)
[1]3.3825,[2]3.7569,[3]4.3464,[4]4.2496,[5]4.2576,[6]4.1796,[7]4.2334,[8]4.1924,[9]4.1996,[10]4.1996,[11]4.1930,[12]4.2830,[13]4.3356,[14]4.4817,[15]4.6655,
SKIP  78 + [22,] - len:   2, best:( 14: 0.252)
[1]3.4235,[2]3.7912,[3]4.3473,[4]4.2685,[5]4.2033,[6]4.1170,[7]4.1941,[8]4.2092,[9]4.2220,[10]4.2197,[11]4.2077,[12]4.2920,[13]4.3429,[14]4.4776,[15]4.6634,
SKIP  79 + [22,] - len:   2, best:( 14: 0.252)
[1]3.7994,[2]4.0601,[3]5.0242,[4]4.8580,[5]4.6972,[6]4.5876,[7]4.6128,[8]4.5611,[9]4.6286,[10]4.5772,[11]4.5420,[12]4.6308,[13]4.7267,[14]4.8992,[15]5.1291,

ADD SKIP  14 - ppl vs ref 0.2519
SKIP   0 + [22,14,] - len:   3, best:( -1: 0.000)
[1]1431.6291,[2]3172.2087,[3]779.0677,[4]963.3765,[5]956.6645,[6]581.3501,[7]417.9766,[8]421.5114,[9]501.2120,[10]396.6861,[11]405.3641,[12]481.9605,[13]520.7281,[14]470.8351,[15]517.4556,
SKIP   1 + [22,14,] - len:   3, best:(  0: 513.539)
[1]3.7501,[2]4.2929,[3]4.9838,[4]5.0334,[5]4.8926,[6]4.7763,[7]4.8429,[8]4.9101,[9]4.9857,[10]5.1168,[11]5.1446,[12]5.2397,[13]5.3353,[14]5.5079,[15]5.7015,
SKIP   2 + [22,14,] - len:   3, best:(  1: 1.785)
[1]1290.4027,[2]1403.9919,[3]1393.0244,[4]1373.2305,[5]1217.1674,[6]1141.3133,[7]1140.6325,[8]1141.2967,[9]1140.0872,[10]1168.2436,[11]1162.2406,[12]1178.5551,[13]1176.3234,[14]1215.6206,[15]1238.6543,
SKIP   3 + [22,14,] - len:   3, best:(  1: 1.785)
[1]3.2663,[2]3.6365,[3]4.2145,[4]4.2207,[5]4.1262,[6]3.9785,[7]4.0700,[8]4.0875,[9]4.1625,[10]4.1748,[11]4.1558,[12]4.2390,[13]4.3368,[14]4.4835,[15]4.6682,
SKIP   4 + [22,14,] - len:   3, best:(  3: 0.752)
[1]3.2267,[2]3.5832,[3]4.1278,[4]4.0150,[5]3.9392,[6]3.8288,[7]3.9190,[8]3.9297,[9]3.9722,[10]3.9586,[11]3.9457,[12]4.0210,[13]4.1000,[14]4.2523,[15]4.4312,
SKIP   5 + [22,14,] - len:   3, best:(  4: 0.515)
[1]3.2133,[2]3.6062,[3]4.1649,[4]4.0417,[5]3.9420,[6]3.8396,[7]3.9007,[8]3.8957,[9]3.9366,[10]3.9586,[11]3.9456,[12]4.0228,[13]4.0847,[14]4.2330,[15]4.4124,
SKIP   6 + [22,14,] - len:   3, best:(  5: 0.496)
[1]3.3215,[2]3.6566,[3]4.2119,[4]4.1055,[5]4.0112,[6]3.9062,[7]3.9817,[8]3.9852,[9]4.0016,[10]3.9941,[11]3.9595,[12]4.0443,[13]4.1131,[14]4.2583,[15]4.4482,
SKIP   7 + [22,14,] - len:   3, best:(  5: 0.496)
[1]3.2703,[2]3.6078,[3]4.1847,[4]4.1728,[5]4.0587,[6]3.9174,[7]3.9827,[8]3.9651,[9]4.0005,[10]3.9971,[11]3.9639,[12]4.0405,[13]4.1090,[14]4.2579,[15]4.4415,
SKIP   8 + [22,14,] - len:   3, best:(  5: 0.496)
[1]10.1753,[2]12.2510,[3]13.5520,[4]17.8157,[5]19.8106,[6]19.8836,[7]20.7475,[8]20.1508,[9]22.0124,[10]23.6726,[11]25.4370,[12]27.2145,[13]28.4720,[14]28.8096,[15]30.9556,
SKIP   9 + [22,14,] - len:   3, best:(  5: 0.496)
[1]3.1747,[2]3.5442,[3]4.1421,[4]4.0876,[5]3.9793,[6]3.8803,[7]3.9711,[8]3.9557,[9]3.9770,[10]3.9957,[11]3.9589,[12]4.0356,[13]4.1012,[14]4.2410,[15]4.4212,
SKIP  10 + [22,14,] - len:   3, best:(  5: 0.496)
[1]3.2110,[2]3.5569,[3]4.1421,[4]3.9806,[5]3.9190,[6]3.8050,[7]3.8783,[8]3.8822,[9]3.9002,[10]3.8938,[11]3.8657,[12]3.9423,[13]4.0056,[14]4.1523,[15]4.3373,
SKIP  11 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.1792,[2]3.5619,[3]4.1515,[4]4.0317,[5]3.9441,[6]3.8285,[7]3.8925,[8]3.8905,[9]3.9086,[10]3.9301,[11]3.9150,[12]3.9796,[13]4.0382,[14]4.1837,[15]4.3583,
SKIP  12 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.3499,[2]3.6537,[3]4.2382,[4]4.0934,[5]4.0086,[6]3.8824,[7]3.9542,[8]3.9508,[9]3.9890,[10]3.9950,[11]3.9769,[12]4.0510,[13]4.1146,[14]4.2562,[15]4.4421,
SKIP  13 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.2639,[2]3.6140,[3]4.1815,[4]4.1293,[5]4.0379,[6]3.9073,[7]4.0101,[8]3.9902,[9]4.0119,[10]3.9972,[11]3.9523,[12]4.0470,[13]4.1090,[14]4.2643,[15]4.4507,
SKIP  15 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.2955,[2]3.6220,[3]4.2120,[4]4.1150,[5]3.9896,[6]3.8799,[7]3.9607,[8]3.9483,[9]3.9532,[10]3.9387,[11]3.9061,[12]3.9839,[13]4.0456,[14]4.1936,[15]4.3607,
SKIP  16 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.2647,[2]3.5964,[3]4.1447,[4]4.0814,[5]4.0119,[6]3.9285,[7]3.9986,[8]3.9855,[9]3.9880,[10]3.9744,[11]3.9655,[12]4.0445,[13]4.1073,[14]4.2464,[15]4.4222,
SKIP  17 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.2688,[2]3.6030,[3]4.1836,[4]4.0799,[5]4.0463,[6]3.9356,[7]4.0294,[8]4.0396,[9]4.0525,[10]4.0492,[11]4.0223,[12]4.0973,[13]4.1463,[14]4.2889,[15]4.4692,
SKIP  18 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.2316,[2]3.6127,[3]4.1919,[4]4.1423,[5]4.0860,[6]3.9705,[7]4.0330,[8]4.0202,[9]4.0437,[10]4.0564,[11]4.0101,[12]4.1094,[13]4.1666,[14]4.3124,[15]4.4874,
SKIP  19 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.2429,[2]3.5689,[3]4.1421,[4]4.1077,[5]4.0815,[6]3.9498,[7]4.0410,[8]4.0066,[9]4.0125,[10]4.0473,[11]4.0211,[12]4.0949,[13]4.1544,[14]4.2848,[15]4.4662,
SKIP  20 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.1939,[2]3.5707,[3]4.1390,[4]4.0832,[5]3.9934,[6]3.8846,[7]3.9583,[8]3.9546,[9]3.9597,[10]3.9592,[11]3.9268,[12]4.0033,[13]4.0470,[14]4.1984,[15]4.3860,
SKIP  21 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.2374,[2]3.5798,[3]4.1554,[4]4.0575,[5]3.9590,[6]3.8281,[7]3.9357,[8]3.9447,[9]3.9474,[10]3.9483,[11]3.9063,[12]3.9876,[13]4.0390,[14]4.1841,[15]4.3675,
SKIP  23 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.1990,[2]3.5792,[3]4.1538,[4]4.0382,[5]3.9732,[6]3.8647,[7]3.9382,[8]3.9260,[9]3.9296,[10]3.9202,[11]3.8885,[12]3.9865,[13]4.0400,[14]4.1749,[15]4.3549,
SKIP  24 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.2740,[2]3.5761,[3]4.1239,[4]4.0759,[5]3.9918,[6]3.8713,[7]3.9470,[8]3.9313,[9]3.9470,[10]3.9396,[11]3.9058,[12]3.9868,[13]4.0510,[14]4.1943,[15]4.3699,
SKIP  25 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.2350,[2]3.6120,[3]4.1795,[4]4.1433,[5]4.0480,[6]3.9434,[7]4.0359,[8]4.0249,[9]4.0654,[10]4.0847,[11]4.0601,[12]4.1467,[13]4.2149,[14]4.3540,[15]4.5213,
SKIP  26 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.2482,[2]3.6016,[3]4.1582,[4]4.0540,[5]3.9628,[6]3.8446,[7]3.9353,[8]3.9427,[9]3.9743,[10]3.9753,[11]3.9418,[12]4.0097,[13]4.0651,[14]4.1989,[15]4.3791,
SKIP  27 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.2933,[2]3.6461,[3]4.1797,[4]4.0921,[5]4.0032,[6]3.8658,[7]3.9255,[8]3.9140,[9]3.9305,[10]3.9256,[11]3.8909,[12]3.9809,[13]4.0364,[14]4.1845,[15]4.3724,
SKIP  28 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.2949,[2]3.6306,[3]4.1940,[4]4.0939,[5]4.0199,[6]3.8912,[7]3.9699,[8]3.9401,[9]3.9596,[10]3.9641,[11]3.9305,[12]4.0196,[13]4.0690,[14]4.2070,[15]4.3836,
SKIP  29 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.2856,[2]3.6388,[3]4.2034,[4]4.0856,[5]4.0098,[6]3.8796,[7]3.9659,[8]3.9607,[9]3.9786,[10]3.9949,[11]3.9607,[12]4.0447,[13]4.1018,[14]4.2383,[15]4.4113,
SKIP  30 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.2209,[2]3.5510,[3]4.1424,[4]4.0176,[5]3.9677,[6]3.8499,[7]3.9312,[8]3.9312,[9]3.9532,[10]3.9427,[11]3.9021,[12]3.9942,[13]4.0409,[14]4.1849,[15]4.3562,
SKIP  31 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.2019,[2]3.5399,[3]4.1236,[4]4.0871,[5]3.9745,[6]3.8326,[7]3.9095,[8]3.9217,[9]3.9228,[10]3.9466,[11]3.9202,[12]4.0032,[13]4.0554,[14]4.1983,[15]4.3777,
SKIP  32 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.2390,[2]3.6029,[3]4.1725,[4]4.1017,[5]4.0063,[6]3.8526,[7]3.9287,[8]3.9288,[9]3.9338,[10]3.9277,[11]3.8795,[12]3.9740,[13]4.0262,[14]4.1780,[15]4.3590,
SKIP  33 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.3115,[2]3.6272,[3]4.2087,[4]4.1498,[5]4.0454,[6]3.9119,[7]3.9791,[8]3.9700,[9]4.0151,[10]4.0192,[11]3.9926,[12]4.0601,[13]4.1029,[14]4.2350,[15]4.4127,
SKIP  34 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.2583,[2]3.6314,[3]4.1887,[4]4.1004,[5]4.0314,[6]3.8820,[7]3.9404,[8]3.9455,[9]3.9554,[10]3.9538,[11]3.9379,[12]4.0128,[13]4.0524,[14]4.1926,[15]4.3788,
SKIP  35 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.1798,[2]3.5738,[3]4.1633,[4]4.0873,[5]3.9814,[6]3.8421,[7]3.9296,[8]3.9136,[9]3.9384,[10]3.9439,[11]3.9135,[12]3.9856,[13]4.0396,[14]4.1826,[15]4.3610,
SKIP  36 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.2976,[2]3.6453,[3]4.2322,[4]4.1497,[5]4.0396,[6]3.9213,[7]3.9908,[8]4.0200,[9]4.0326,[10]4.0342,[11]3.9955,[12]4.0636,[13]4.1144,[14]4.2615,[15]4.4439,
SKIP  37 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.2579,[2]3.6025,[3]4.1781,[4]4.1382,[5]4.0365,[6]3.9163,[7]3.9957,[8]3.9793,[9]3.9723,[10]3.9839,[11]3.9496,[12]4.0274,[13]4.0713,[14]4.2227,[15]4.3986,
SKIP  38 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.2193,[2]3.5861,[3]4.1547,[4]4.1762,[5]4.1040,[6]3.9795,[7]4.0244,[8]3.9901,[9]3.9712,[10]4.0057,[11]3.9785,[12]4.0695,[13]4.1141,[14]4.2574,[15]4.4355,
SKIP  39 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.3205,[2]3.6548,[3]4.2391,[4]4.2002,[5]4.1272,[6]3.9831,[7]4.0352,[8]4.0607,[9]4.1103,[10]4.1215,[11]4.0919,[12]4.1588,[13]4.2026,[14]4.3465,[15]4.5299,
SKIP  40 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.3017,[2]3.6123,[3]4.1978,[4]4.1844,[5]4.1104,[6]4.0040,[7]4.0766,[8]4.0679,[9]4.0985,[10]4.1223,[11]4.1023,[12]4.1784,[13]4.2335,[14]4.3764,[15]4.5558,
SKIP  41 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.3408,[2]3.6379,[3]4.1817,[4]4.1645,[5]4.1347,[6]4.0349,[7]4.1160,[8]4.0876,[9]4.0940,[10]4.0978,[11]4.0883,[12]4.1762,[13]4.2400,[14]4.3799,[15]4.5599,
SKIP  42 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.3532,[2]3.7019,[3]4.2573,[4]4.1793,[5]4.1186,[6]3.9877,[7]4.0538,[8]4.0466,[9]4.0541,[10]4.0501,[11]4.0431,[12]4.1407,[13]4.1778,[14]4.3188,[15]4.4870,
SKIP  43 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.2914,[2]3.6010,[3]4.1680,[4]4.0928,[5]4.0300,[6]3.9429,[7]4.0555,[8]4.0451,[9]4.0393,[10]4.0154,[11]3.9906,[12]4.0792,[13]4.1230,[14]4.2690,[15]4.4523,
SKIP  44 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.3907,[2]3.7984,[3]4.3482,[4]4.3474,[5]4.2548,[6]4.1273,[7]4.2121,[8]4.1898,[9]4.1683,[10]4.1751,[11]4.1460,[12]4.2249,[13]4.2853,[14]4.4363,[15]4.6195,
SKIP  45 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.2757,[2]3.6362,[3]4.1773,[4]4.0703,[5]4.0132,[6]3.9022,[7]3.9845,[8]3.9838,[9]3.9939,[10]3.9833,[11]3.9451,[12]4.0269,[13]4.0832,[14]4.2269,[15]4.4012,
SKIP  46 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.2646,[2]3.5745,[3]4.1635,[4]4.1327,[5]4.0415,[6]3.9260,[7]4.0254,[8]4.0138,[9]4.0208,[10]4.0186,[11]3.9811,[12]4.0574,[13]4.1145,[14]4.2625,[15]4.4377,
SKIP  47 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.3677,[2]3.6706,[3]4.2338,[4]4.1645,[5]4.1385,[6]4.0238,[7]4.1042,[8]4.0758,[9]4.0728,[10]4.0689,[11]4.0366,[12]4.1233,[13]4.1749,[14]4.3213,[15]4.4972,
SKIP  48 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.2620,[2]3.5971,[3]4.1749,[4]4.1520,[5]4.0774,[6]3.9386,[7]4.0357,[8]4.0151,[9]4.0267,[10]4.0240,[11]3.9908,[12]4.0604,[13]4.1062,[14]4.2550,[15]4.4219,
SKIP  49 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.2757,[2]3.5927,[3]4.1619,[4]4.1309,[5]4.0135,[6]3.8871,[7]3.9444,[8]3.9363,[9]3.9446,[10]3.9333,[11]3.8996,[12]3.9633,[13]4.0240,[14]4.1714,[15]4.3490,
SKIP  50 + [22,14,] - len:   3, best:( 10: 0.421)
[1]3.1768,[2]3.5292,[3]4.1421,[4]4.0576,[5]3.9778,[6]3.8617,[7]3.9413,[8]3.9304,[9]3.9424,[10]3.9280,[11]3.9000,[12]3.9708,[13]4.0175,[14]4.1519,[15]4.3299,
SKIP  51 + [22,14,] - len:   3, best:( 50: 0.414)
[1]3.2872,[2]3.6394,[3]4.2051,[4]4.0976,[5]4.0313,[6]3.9230,[7]3.9905,[8]3.9884,[9]3.9866,[10]3.9778,[11]3.9397,[12]4.0209,[13]4.0796,[14]4.2220,[15]4.3984,
SKIP  52 + [22,14,] - len:   3, best:( 50: 0.414)
[1]3.2588,[2]3.5934,[3]4.1671,[4]4.0877,[5]4.0208,[6]3.8847,[7]3.9557,[8]3.9360,[9]3.9601,[10]3.9716,[11]3.9395,[12]4.0249,[13]4.0801,[14]4.2247,[15]4.4003,
SKIP  53 + [22,14,] - len:   3, best:( 50: 0.414)
[1]3.2465,[2]3.5941,[3]4.1542,[4]4.1059,[5]3.9944,[6]3.8780,[7]3.9514,[8]3.9404,[9]3.9592,[10]3.9636,[11]3.9370,[12]4.0242,[13]4.0766,[14]4.2213,[15]4.3970,
SKIP  54 + [22,14,] - len:   3, best:( 50: 0.414)
[1]3.2386,[2]3.6224,[3]4.1775,[4]4.0749,[5]3.9826,[6]3.8788,[7]3.9448,[8]3.9333,[9]3.9342,[10]3.9276,[11]3.9011,[12]3.9848,[13]4.0337,[14]4.1781,[15]4.3523,
SKIP  55 + [22,14,] - len:   3, best:( 50: 0.414)
[1]3.2417,[2]3.5948,[3]4.1786,[4]4.0921,[5]4.0261,[6]3.9076,[7]3.9793,[8]3.9752,[9]3.9732,[10]3.9663,[11]3.9447,[12]4.0247,[13]4.0795,[14]4.2233,[15]4.3935,
SKIP  56 + [22,14,] - len:   3, best:( 50: 0.414)
[1]3.3165,[2]3.5997,[3]4.1655,[4]4.0899,[5]4.0190,[6]3.8730,[7]3.9341,[8]3.9175,[9]3.9325,[10]3.9319,[11]3.8901,[12]3.9646,[13]4.0193,[14]4.1529,[15]4.3295,
SKIP  57 + [22,14,] - len:   3, best:( 56: 0.413)
[1]3.2541,[2]3.5685,[3]4.1456,[4]4.0528,[5]3.9704,[6]3.8656,[7]3.9369,[8]3.9120,[9]3.9108,[10]3.9099,[11]3.8939,[12]3.9948,[13]4.0456,[14]4.1800,[15]4.3631,
SKIP  58 + [22,14,] - len:   3, best:( 56: 0.413)
[1]3.2389,[2]3.5805,[3]4.1760,[4]4.1185,[5]4.0019,[6]3.8854,[7]3.9454,[8]3.9502,[9]3.9677,[10]3.9552,[11]3.9221,[12]4.0078,[13]4.0617,[14]4.2106,[15]4.3837,
SKIP  59 + [22,14,] - len:   3, best:( 56: 0.413)
[1]3.2553,[2]3.5991,[3]4.1784,[4]4.1053,[5]3.9933,[6]3.8693,[7]3.9328,[8]3.9216,[9]3.9273,[10]3.9292,[11]3.8923,[12]3.9709,[13]4.0313,[14]4.1744,[15]4.3520,
SKIP  60 + [22,14,] - len:   3, best:( 56: 0.413)
[1]3.2429,[2]3.5917,[3]4.1678,[4]4.0997,[5]4.0089,[6]3.8787,[7]3.9400,[8]3.9347,[9]3.9439,[10]3.9305,[11]3.8871,[12]3.9690,[13]4.0255,[14]4.1707,[15]4.3488,
SKIP  61 + [22,14,] - len:   3, best:( 56: 0.413)
[1]3.2834,[2]3.5842,[3]4.1474,[4]4.0893,[5]4.0240,[6]3.8730,[7]3.9424,[8]3.9485,[9]3.9606,[10]3.9403,[11]3.9081,[12]3.9916,[13]4.0490,[14]4.1891,[15]4.3696,
SKIP  62 + [22,14,] - len:   3, best:( 56: 0.413)
[1]3.1882,[2]3.5327,[3]4.0907,[4]4.0156,[5]3.9301,[6]3.8076,[7]3.8868,[8]3.8815,[9]3.8864,[10]3.8717,[11]3.8397,[12]3.9126,[13]3.9736,[14]4.1185,[15]4.2958,
SKIP  63 + [22,14,] - len:   3, best:( 62: 0.379)
[1]3.2593,[2]3.5887,[3]4.1640,[4]4.0642,[5]3.9549,[6]3.8173,[7]3.9005,[8]3.8812,[9]3.9110,[10]3.9214,[11]3.8725,[12]3.9393,[13]4.0103,[14]4.1552,[15]4.3310,
SKIP  64 + [22,14,] - len:   3, best:( 62: 0.379)
[1]3.2352,[2]3.6099,[3]4.1729,[4]4.0858,[5]4.0145,[6]3.8720,[7]3.9432,[8]3.9444,[9]3.9351,[10]3.9352,[11]3.9152,[12]3.9961,[13]4.0458,[14]4.1826,[15]4.3553,
SKIP  65 + [22,14,] - len:   3, best:( 62: 0.379)
[1]3.2244,[2]3.5550,[3]4.1215,[4]4.0589,[5]3.9819,[6]3.8641,[7]3.9417,[8]3.9307,[9]3.9301,[10]3.9266,[11]3.8993,[12]3.9787,[13]4.0259,[14]4.1700,[15]4.3434,
SKIP  66 + [22,14,] - len:   3, best:( 62: 0.379)
[1]3.2479,[2]3.5812,[3]4.1507,[4]4.0948,[5]4.0189,[6]3.8787,[7]3.9427,[8]3.9382,[9]3.9411,[10]3.9344,[11]3.8932,[12]3.9809,[13]4.0334,[14]4.1720,[15]4.3489,
SKIP  67 + [22,14,] - len:   3, best:( 62: 0.379)
[1]3.3387,[2]3.6636,[3]4.2071,[4]4.1142,[5]4.0702,[6]3.9616,[7]4.0553,[8]4.0304,[9]4.0470,[10]4.0511,[11]4.0303,[12]4.1193,[13]4.1636,[14]4.3098,[15]4.4822,
SKIP  68 + [22,14,] - len:   3, best:( 62: 0.379)
[1]3.3423,[2]3.6333,[3]4.2082,[4]4.0806,[5]3.9891,[6]3.8593,[7]3.9316,[8]3.9295,[9]3.9413,[10]3.9330,[11]3.8998,[12]3.9750,[13]4.0299,[14]4.1725,[15]4.3520,
SKIP  69 + [22,14,] - len:   3, best:( 62: 0.379)
[1]3.2508,[2]3.5866,[3]4.1499,[4]4.0283,[5]3.9658,[6]3.8368,[7]3.9078,[8]3.9109,[9]3.9181,[10]3.9040,[11]3.8710,[12]3.9557,[13]4.0065,[14]4.1420,[15]4.3192,
SKIP  70 + [22,14,] - len:   3, best:( 62: 0.379)
[1]3.2682,[2]3.5960,[3]4.1679,[4]4.1059,[5]4.0215,[6]3.9009,[7]3.9498,[8]3.9394,[9]3.9647,[10]3.9617,[11]3.9378,[12]4.0155,[13]4.0715,[14]4.2169,[15]4.3894,
SKIP  71 + [22,14,] - len:   3, best:( 62: 0.379)
[1]3.2622,[2]3.6078,[3]4.1901,[4]4.1554,[5]4.0918,[6]3.9414,[7]3.9991,[8]3.9790,[9]3.9783,[10]3.9750,[11]3.9505,[12]4.0355,[13]4.0791,[14]4.2208,[15]4.4020,
SKIP  72 + [22,14,] - len:   3, best:( 62: 0.379)
[1]3.2513,[2]3.6076,[3]4.1701,[4]4.1401,[5]4.0760,[6]3.9641,[7]4.0276,[8]3.9880,[9]4.0053,[10]4.0009,[11]3.9569,[12]4.0332,[13]4.0884,[14]4.2292,[15]4.4064,
SKIP  73 + [22,14,] - len:   3, best:( 62: 0.379)
[1]3.2859,[2]3.5959,[3]4.1497,[4]4.0930,[5]4.0158,[6]3.8898,[7]3.9493,[8]3.9195,[9]3.9194,[10]3.9113,[11]3.8870,[12]3.9697,[13]4.0142,[14]4.1557,[15]4.3368,
SKIP  74 + [22,14,] - len:   3, best:( 62: 0.379)
[1]3.2411,[2]3.6026,[3]4.1711,[4]4.1165,[5]4.0367,[6]3.9218,[7]3.9889,[8]3.9721,[9]3.9904,[10]3.9994,[11]3.9821,[12]4.0708,[13]4.1186,[14]4.2693,[15]4.4440,
SKIP  75 + [22,14,] - len:   3, best:( 62: 0.379)
[1]3.3264,[2]3.6712,[3]4.2471,[4]4.1964,[5]4.1585,[6]4.0247,[7]4.1057,[8]4.0770,[9]4.0958,[10]4.0903,[11]4.0761,[12]4.1485,[13]4.1980,[14]4.3415,[15]4.5233,
SKIP  76 + [22,14,] - len:   3, best:( 62: 0.379)
[1]3.3142,[2]3.6864,[3]4.2655,[4]4.2033,[5]4.1426,[6]4.0020,[7]4.1030,[8]4.0911,[9]4.1387,[10]4.1601,[11]4.1510,[12]4.2243,[13]4.2684,[14]4.4145,[15]4.5910,
SKIP  77 + [22,14,] - len:   3, best:( 62: 0.379)
[1]3.5910,[2]3.8988,[3]4.4864,[4]4.3919,[5]4.4015,[6]4.2767,[7]4.3382,[8]4.2945,[9]4.3320,[10]4.3610,[11]4.3531,[12]4.4343,[13]4.4820,[14]4.6225,[15]4.8101,
SKIP  78 + [22,14,] - len:   3, best:( 62: 0.379)
[1]3.4768,[2]3.8468,[3]4.4286,[4]4.3785,[5]4.3369,[6]4.2433,[7]4.3253,[8]4.3457,[9]4.3725,[10]4.3882,[11]4.3738,[12]4.4581,[13]4.5031,[14]4.6306,[15]4.8193,
SKIP  79 + [22,14,] - len:   3, best:( 62: 0.379)
[1]3.9200,[2]4.1257,[3]5.1031,[4]4.9309,[5]4.7900,[6]4.6561,[7]4.6898,[8]4.6370,[9]4.7301,[10]4.7127,[11]4.6827,[12]4.7756,[13]4.8707,[14]5.0342,[15]5.2726,

ADD SKIP  62 - ppl vs ref 0.3795
SKIP   0 + [22,14,62,] - len:   4, best:( -1: 0.000)
[1]1433.8901,[2]3163.5996,[3]778.9857,[4]960.9986,[5]952.8749,[6]580.2600,[7]418.3652,[8]422.5831,[9]499.7971,[10]395.2115,[11]403.5678,[12]478.9129,[13]517.7435,[14]468.9142,[15]515.3999,
SKIP   1 + [22,14,62,] - len:   4, best:(  0: 511.484)
[1]3.8108,[2]4.3694,[3]5.0679,[4]5.2001,[5]5.0421,[6]4.9113,[7]4.9953,[8]5.0654,[9]5.1530,[10]5.2794,[11]5.3177,[12]5.4134,[13]5.5078,[14]5.6863,[15]5.8794,
SKIP   2 + [22,14,62,] - len:   4, best:(  1: 1.963)
[1]1290.6413,[2]1402.4628,[3]1385.4694,[4]1361.2747,[5]1204.0446,[6]1127.5240,[7]1123.8550,[8]1126.1694,[9]1126.5991,[10]1154.4314,[11]1149.2012,[12]1164.5388,[13]1162.6463,[14]1201.2787,[15]1223.5674,
SKIP   3 + [22,14,62,] - len:   4, best:(  1: 1.963)
[1]3.2346,[2]3.6255,[3]4.2058,[4]4.2694,[5]4.1947,[6]4.0675,[7]4.1586,[8]4.1914,[9]4.2831,[10]4.2981,[11]4.2996,[12]4.3781,[13]4.4816,[14]4.6257,[15]4.8111,
SKIP   4 + [22,14,62,] - len:   4, best:(  3: 0.895)
[1]3.2137,[2]3.5892,[3]4.1285,[4]4.1083,[5]4.0496,[6]3.9527,[7]4.0444,[8]4.0662,[9]4.1100,[10]4.0973,[11]4.1007,[12]4.1674,[13]4.2426,[14]4.3921,[15]4.5723,
SKIP   5 + [22,14,62,] - len:   4, best:(  4: 0.656)
[1]3.2024,[2]3.6082,

@FNsi
Copy link
Contributor

FNsi commented Oct 12, 2023

That's pretty much how it works, except the attention/FFN layers are handled together (rather than being able to skip one of those parts). When you set -ngl it runs layers on the CPU first, and then the remaining layers (according to what -ngl was set to) get run on the GPU. That's so the data required for the final calculations is conveniently already on the GPU. Doing it the other way around would require copying data to the GPU and then back out again.

Thinking more about this, could those layers that are already on the GPU be used for the speculation process? Then there'd be no need for anything else but the one huge model, and a big speedup!

@KerfuffleV2 KerfuffleV2 marked this pull request as draft October 13, 2023 06:05
@KerfuffleV2
Copy link
Collaborator Author

@BarfingLemurs

Memory question: if you do replicate the draft scheme they showed, are you left with a model only 57.5% the size of original f16 (34 of 80 for the 13B)?

The first thing is that this is only about skipping evaluation of those layers; they still get loaded into memory and everything. In the self-speculation scheme they'd also eventually be used in cases where the draft doesn't get accepted. You could potentially skip even loading those layers and try to run the result as a standalone pared-down model, but that's not really what I'm doing here.

Also, they said: "[...] You can use attention layers [3, 5, 6, 8, 10, 11, 14, 15, 18, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 33, 34, 35, 36, 37] and MLP layers [6, 9, 10, 11, 15, 24, 25, 27, 28, 35] as a starting point [...]"

In a model, a "layer" is attention + MLP (AKA FFN). So skipping just attention or just MLP is basically only skipping half a layer (assuming the sizes of the tensors involved are about the same, which I don't know for a fact). If that assumption is correct, skipping a total of 34 MLP or attention layers would be more like skipping 17 complete layers.


@FNsi

Thinking more about this, could those layers that are already on the GPU be used for the speculation process? Then there'd be no need for anything else but the one huge model, and a big speedup!

You mean arrange so the layers used for drafting get loaded onto the GPU even if they're not contiguous in the full model? That's definitely an idea, but I don't know if the overhead of having to copy stuff around would be worth it. (Also, implementing that sort of stuff is beyond my ability at the moment.)

@FNsi
Copy link
Contributor

FNsi commented Oct 13, 2023

I don't know if the overhead of having to copy stuff around would be worth it. (Also, implementing that sort of stuff is beyond my ability at the moment.)

Then I guess there's an easier way: load the full model on the CPU, and load the draft layers on the GPU 😆

@BarfingLemurs
Copy link
Contributor

Full Q2_K 70B base model results: results.txt

Maybe it will be useful for chimera models or smaller quantizations later on.

@KerfuffleV2
Copy link
Collaborator Author

Looks like the people who wrote the self-speculation paper have now released code and also some examples of MLP/attention layers to skip: https://github.com/dilab-zju/self-speculative-decoding/blob/main/skip_layers.json (I haven't had a chance to look closely at the code yet.)

@KerfuffleV2
Copy link
Collaborator Author

I finally figured out how to skip MLP and attention layers separately. One weird thing: if MLP or attention (but not both) is skipped on the very last layer evaluated, it runs out of space unless I use my hack to force a skip on the last layer while alloc is in measure mode.

The way perplexity works right now isn't ideal: you have to choose what type of skip to use at compile time. It's okay as a proof of concept, though.

If I get a chance I want to see if I can implement self-speculation on top of GG's tree speculation stuff; I think it might not be too hard with this backend stuff in place. (If anyone else wants to try this, please don't hold back on my account. I'd love for someone to steal my thunder here.)

@KerfuffleV2 KerfuffleV2 force-pushed the feat-skip-layers branch 2 times, most recently from b707b43 to d3c08ea Compare October 17, 2023 17:47
@ggerganov
Copy link
Owner

I like the exploration spirit here :) It should be straightforward to demonstrate self-speculation with what you have by adapting the speculative example.

@KerfuffleV2
Copy link
Collaborator Author

Fun fact: Running this is now twice as slow. Yay.

This also confirms that skipping attention usually works better than skipping MLP layers. At least from what I've seen so far, even in a really small model like a 3B you can skip a bunch of attention layers before skipping an MLP layer becomes the better option.

I wanted to modify the speculative example to use the same context for both draft and target, but that's actually a bit tricky because of how it uses logits from decoding. I'm sure it's possible, and I actually started down that path, but I think you'd have to do something like save the logits so they don't get overwritten when the target or draft evaluates.

The simplest way to test that is just to load the same model twice and skip stuff in the draft copy; it's not memory efficient, but it should demonstrate self-speculation. Once I have some data about good layers to skip for small models like my 3B or a 7B, I'll see about testing that.

@ggerganov
Copy link
Owner

I wanted to modify the speculative example to use the same context for both draft and target but that's actually a bit tricky because of how it uses logits from decoding.

It should work with using 2 separate llama_context with a single model, or maybe I'm missing something?

@KerfuffleV2
Copy link
Collaborator Author

KerfuffleV2 commented Oct 18, 2023

I improved the logic for the perplexity skip layer searching stuff. It'll prune the worst results each pass. As far as I can see, the end result is the same except we get there much, much faster. It'll also abort a test early if the results are absurd.
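To make the SKIP / ADD SKIP output above a bit easier to follow, here's a rough, self-contained sketch of the kind of greedy search with pruning I mean. evaluate_ppl() is a stub standing in for running the test chunks, and the prune fraction is made up; the real code in the hacked perplexity example differs in the details:

#include <algorithm>
#include <cstdio>
#include <utility>
#include <vector>

// stub: in the real tool this runs the perplexity chunks with the given layers skipped
// (and aborts a test early once the numbers are clearly absurd)
static double evaluate_ppl(const std::vector<int> & skips) {
    (void) skips;
    return 0.0;
}

static std::vector<int> greedy_skip_search(int n_layers, int n_rounds) {
    std::vector<int> pool;                      // layers still considered for skipping
    for (int il = 0; il < n_layers; il++) {
        pool.push_back(il);
    }
    std::vector<int> skips;                     // layers committed to being skipped
    const double ref_ppl = evaluate_ppl(skips); // reference: nothing skipped

    for (int round = 0; round < n_rounds && !pool.empty(); round++) {
        // score every remaining candidate layer on top of the current skip set
        std::vector<std::pair<double, int>> scored; // (ppl, layer)
        for (int il : pool) {
            std::vector<int> cand = skips;
            cand.push_back(il);
            scored.emplace_back(evaluate_ppl(cand), il);
        }
        std::sort(scored.begin(), scored.end());    // lowest perplexity first

        const int best = scored.front().second;
        std::printf("ADD SKIP %3d - ppl vs ref %.4f\n", best, scored.front().first - ref_ppl);
        skips.push_back(best);
        pool.erase(std::remove(pool.begin(), pool.end(), best), pool.end());

        // prune: drop the worst-scoring quarter of the pool before the next pass
        const size_t n_prune = scored.size() / 4;
        for (size_t i = 0; i < n_prune; i++) {
            const int bad = scored[scored.size() - 1 - i].second;
            pool.erase(std::remove(pool.begin(), pool.end(), bad), pool.end());
        }
    }
    return skips;
}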

This last push also adds hacked-in support for skipping layers on the draft in the speculative example (I just took the self-speculative repo's 70B example and applied it to a random 70B model, StellarBright).

By the way, don't try to skip layers on the draft at prompt evaluation time unless you like seeing a 1% acceptance rate.

Results for just speculating against exactly the same model with no skips:

decoded  102 tokens in  167.775 seconds, speed:    0.608 t/s

n_draft   = 16
n_predict = 102
n_drafted = 102
n_accept  = 66
accept    = 64.706%

draft:
llama_print_timings:        eval time =  113915.24 ms /   137 runs   (  831.50 ms per token,     1.20 tokens per second)
llama_print_timings:       total time =  179696.93 ms

target:
llama_print_timings:        eval time =    9116.39 ms /    11 runs   (  828.76 ms per token,     1.21 tokens per second)
llama_print_timings:       total time =  181830.31 ms

Only 65% accepted for the exact same model is kind of disappointing. Even if we could cut out around half of the whole model and get exactly the same drafting accuracy, it would still barely break even. Can this really be right?

Anyway, with skipping:

decoded  101 tokens in  136.867 seconds, speed:    0.738 t/s

n_draft   = 2
n_predict = 101
n_drafted = 94
n_accept  = 38
accept    = 40.426%

draft:
llama_print_timings:        eval time =   66857.69 ms /   153 runs   (  436.98 ms per token,     2.29 tokens per second)
llama_print_timings:       total time =  148898.96 ms

target:
llama_print_timings:        eval time =   18263.53 ms /    22 runs   (  830.16 ms per token,     1.20 tokens per second)
llama_print_timings:       total time =  150981.31 ms

It actually does outperform running speculation with an identical draft model and no skips, but it still is worse than just not using speculation at all as far as I can see.

edit: By the way, if you want to try using the results at https://github.com/dilab-zju/self-speculative-decoding/blob/main/skip_layers.json you can generate the skips in Python:

a = tuple([10, 11, 13, 14, 16, 18, 19, 20, 21, 22, 25, 27, 28, 29, 30, 31, 35, 41, 43, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 60, 61, 62, 63, 65, 66, 67, 68, 69, 70, 74, 75, 78, 79])
m = tuple([2, 4, 9, 10, 13, 14, 16, 20, 21, 22, 24, 25, 26, 27, 28, 29, 31, 34, 37, 41, 47, 48, 49, 50, 53, 54, 55, 57, 58, 60, 62, 63, 66, 67, 68, 70, 76])
n_layers = 80
# a = attention layer indexes to skip, m = MLP layer indexes to skip (from skip_layers.json).
# Populate batch.run_layers with this: one flag per layer, where bit 0 is set if the layer
# is in the attention list and bit 1 is set if it is in the MLP list:
print(tuple((1 if x in a else 0) + (2 if x in m else 0) for x in range(n_layers)))

@KerfuffleV2
Copy link
Collaborator Author

@ggerganov

It should work with using 2 separate llama_context with a single model, or maybe I'm missing something?

Right, it will work, but it won't reuse GPU layers or anything. So ideally you could use the same context for both when doing self-speculation. It should be possible by just having the draft use a different sequence id, right? (But there's complexity relating to managing the logits after eval so they're available when they need to be.)

@BarfingLemurs
Copy link
Contributor

BarfingLemurs commented Oct 18, 2023

Only 65% accepted

Here's a comparison with 2 7B f16 base models (on master, using the same model as draft)

-n 128 --top-k 1

"Once upon a time," 66.142%
"Eating 6 burgers" 72.034%
"What is limbo?" 85.437%

If the draft model is Q4_0:
66.667%
81.553% (?)
81.818%

Q4_K_M:
68.103%
75.893%
85.455%

So, smaller (more heavily quantized) draft models can sometimes get higher acceptance rates?

@ggerganov
Copy link
Owner

Only 65% accepted for the exact same model is kind of disappointing.

What sampling parameters do you use? You will get the most benefit with greedy sampling, and in this scenario of using the same model for drafting it will result in a 100% acceptance rate.

@KerfuffleV2
Copy link
Collaborator Author

What sampling parameters do you use?

I actually was using greedy sampling; it seems the default repetition penalty settings mess it up, though. Setting --repeat-last-n 0 does result in 100% acceptance with greedy sampling, as expected.

I tweaked the speculation algorithm a bit:

// require the top candidate to beat the runner-up by a margin that grows the deeper we draft
const float skip_scale = 1.50f + std::min(2.0f, 0.75f * float(i)); // 61.76
if (cur_p.data[0].p < skip_scale*cur_p.data[1].p) {
    LOG("stopping drafting, probability too low: %.3f < %.3f * %.3f\n", cur_p.data[0].p, skip_scale, cur_p.data[1].p);
    break;
}

I was able to get the prediction rate up to 60% when skipping half of the 70B draft model's layers:

n_draft   = 6
n_predict = 102
n_drafted = 80
n_accept  = 48
accept    = 60.000%

Interestingly it only drops to 57.5% with the repetition penalty on.

@KerfuffleV2
Copy link
Collaborator Author

Oops, I actually didn't mean to add the speculative stuff yet (it is super, super WIP), but I think there's maybe the hint of an interesting idea in it. I've come to think what it's going to need is normalizing the logits before trying to assess whether they're fit for drafting. Something like just top-k 200 + softmax, then use that for statistics so there's a more objective way to compare them. Right now the values can vary a lot between the draft and target models.
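As a rough illustration of the kind of normalization I mean (just the usual top-K + softmax, not the exact code in this branch):

#include <algorithm>
#include <cmath>
#include <utility>
#include <vector>

// keep the k largest logits and softmax over just those, so draft and target
// probabilities end up on a comparable scale regardless of sampler settings
static std::vector<std::pair<int, float>> topk_softmax(const std::vector<float> & logits, int k) {
    std::vector<std::pair<int, float>> cand; // (token id, logit)
    cand.reserve(logits.size());
    for (int i = 0; i < (int) logits.size(); i++) {
        cand.emplace_back(i, logits[i]);
    }
    k = std::min<int>(k, (int) cand.size());
    std::partial_sort(cand.begin(), cand.begin() + k, cand.end(),
        [](const std::pair<int, float> & a, const std::pair<int, float> & b) { return a.second > b.second; });
    cand.resize(k);
    if (cand.empty()) {
        return cand;
    }
    const float max_logit = cand.front().second;
    float sum = 0.0f;
    for (auto & c : cand) {
        c.second = std::exp(c.second - max_logit); // subtract max for numerical stability
        sum += c.second;
    }
    for (auto & c : cand) {
        c.second /= sum;                           // now a probability over the top k
    }
    return cand;
}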

Anyway, as for layer skipping, I messed with the perplexity tool to allow an "anti mode": start with all but the first/last layers disabled and add them back gradually based on which ones seem most important. I haven't done much testing with this, but maybe it's a more time-efficient way to assess which layers matter most.

@KerfuffleV2 KerfuffleV2 changed the title Layer skipping demo Layer skipping/self-speculation demo Oct 20, 2023
@KerfuffleV2
Copy link
Collaborator Author

Here's something kind of interesting. When using a 70B and the recommended layer skips, where would you expect perplexity to end up? This is the reference without any skips:

[1]3.5301,[2]3.6891,[3]4.2059,[4]3.7590,[5]3.5566,[6]3.3557,[7]3.3996,[8]3.3961,[9]3.3747,[10]3.3778

If you said "Maybe double, or at most triple the reference?" you'd be thinking pretty much the same as I did. This is the actual result for running perplexity with those skips:

[1]2819.6070,[2]2992.0594,[3]2609.3665,[4]2545.7050,[5]2514.3705,[6]2315.6226,[7]2308.9518,[8]2329.1486,[9]2358.6234,[10]2454.2212

I find it really surprising that the model can still do anything usable when it's so severely compromised, but it actually does seem to work pretty well as a draft model with those skips.

@FNsi
Copy link
Contributor

FNsi commented Oct 23, 2023

Might adding layers produce better results?

(In some cases we know models are overfitting, and adding noise during training can increase performance, so I wildly guess that might work here too.)

@KerfuffleV2
Copy link
Collaborator Author

I don't know if there's really anything much worthwhile in these changes to speculation in terms of actual behavior. This version collects/logs a lot more information though. It can be used without the skip stuff just by commenting out the line that sets run_layers in the batch.

When using self-speculation, it seems to help to copy some of the latest KV entries from the full model into the sequences the cut-down one is using (this is possible since only one context is used in both cases). Note: it detects that it's running in self-speculation mode just by comparing the --model and --model-draft options, so you can temporarily force using a separate context with something like --model blah.gguf --model-draft ./blah.gguf
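Roughly what I mean by copying KV entries across, as a sketch only: the helper name and n_copy are made up, and it assumes the llama_kv_cache_seq_rm/llama_kv_cache_seq_cp calls behave the way I think they do, with sequence 0 holding the full model's history in the shared context:

#include <algorithm>
#include "llama.h"

// refresh the last n_copy positions of the draft's sequence from the full model's entries,
// so the skipped-layer draft starts its next pass from "real" state for the most recent tokens
static void refresh_draft_kv(llama_context * ctx, llama_seq_id seq_draft, llama_pos n_past, llama_pos n_copy) {
    const llama_pos p0 = std::max<llama_pos>(0, n_past - n_copy);
    llama_kv_cache_seq_rm(ctx, seq_draft, p0, n_past);    // drop what the draft wrote there
    llama_kv_cache_seq_cp(ctx, 0, seq_draft, p0, n_past); // make the full model's entries visible to the draft
}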

I added an approach where I normalize and run top-K on the logits for both the draft and main model, which allows things like enabling greedy sampling; it also lets the internal logic look at the values the models return in a more objective way.

I also use a different approach to sampling from the draft: I save the logits, run the normal sampling function, and then set the probability of any token ids that were already picked to -inf. Basically "what's your first choice, and if you couldn't pick that, what would you pick next", etc. Pretty sure this isn't currently functional with grammar enabled.
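The "what would you pick if you couldn't pick that" idea is basically sampling without replacement. Here's a stripped-down sketch of the greedy version, working on a plain probability array rather than the real llama_token_data machinery (and ignoring the grammar problem mentioned above):

#include <limits>
#include <vector>

// pick up to n_pick distinct candidate tokens in preference order: take the current best,
// then mask it out and ask again; the real code re-runs the configured sampler each time
static std::vector<int> pick_draft_candidates(std::vector<float> probs, int n_pick) {
    std::vector<int> picked;
    for (int n = 0; n < n_pick; n++) {
        int best = -1;
        for (int i = 0; i < (int) probs.size(); i++) {
            if (best < 0 || probs[i] > probs[best]) {
                best = i;
            }
        }
        if (best < 0 || probs[best] <= 0.0f) {
            break; // nothing sensible left to draft
        }
        picked.push_back(best);
        probs[best] = -std::numeric_limits<float>::infinity(); // "already taken"
    }
    return picked;
}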

As for results, I've found some strategies that work a little better than master, at least for the model I'm testing with and the settings I've used (in other words, it could just be random chance). So far, with master or my version, I haven't reached the point where it breaks even with just not using speculation at all. For example, generating 128 tokens with the full model takes ~1:57, while my best self-speculation result is around 2:28. We also don't even have to evaluate the prompt on the draft model, so it should be easier to make self-speculation perform well compared to normal speculation, assuming equivalent model sizes. If you can get a 7B predicting a 70B with high accuracy, that's obviously going to be a big speed advantage; self-speculation on a 70B is more like using a 35B as the draft model.
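For a very rough sense of the break-even point (back-of-envelope only, and optimistically assuming that verifying a short draft costs about one full-model forward pass), speculation only wins when

$$\frac{n \, c_{\text{draft}} + c_{\text{full}}}{\bar{a} + 1} < c_{\text{full}}$$

where $n$ is the draft length per round, $\bar{a}$ is the average number of drafted tokens accepted per round, and $c_{\text{draft}}$, $c_{\text{full}}$ are the per-token eval costs. With half the layers skipped, $c_{\text{draft}} \approx 0.5\,c_{\text{full}}$, so on average more than about $n/2$ of the drafted tokens have to be accepted each round just to match plain decoding.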


times: target predict: 73.287, draft gen/accept/sample: 26.865 / 33.773 / 0.419

n_draft   = 12
n_predict = 129
drafted   = 48.837%
n_drafted = 63
n_accept  = 50
accept    = 79.365%
n_draft   = 12
n_split   = 0
n_effsplit= 0
n_badsplit= 0
n_dupsplit= 0
max streak= 4

Those stats look pretty good but still:

78.54user 2.34system 2:23.98elapsed 681%CPU (0avgtext+0avgdata 43147172maxresident)k

vs not using speculation:

753.29user 2.41system 1:51.24elapsed 679%CPU (0avgtext+0avgdata 43129260maxresident)k

I feel like there's enough info in the log file for someone smarter than me to figure out a better strategy.

STATS: Avg tacc/dacc/drej: 0.68472 / 0.51907 / 0.27794
     | Min dacc/min tacc/max drej: 0.07405 / 0.00491 / 0.79109 
     | delta 0.16431 | max streak 4 | n_dft/pred/acc: 63 / 128 / 50
  • Avg tacc: average of the target's own normalized (top-K 100 + softmax) candidate.p for accepted tokens
  • Avg dacc: average of the draft's normalized candidate.p when the drafted token is accepted
  • Avg drej: average of the draft's normalized candidate.p when the drafted token is rejected
  • delta: the difference between the draft's normalized candidate and the target's (when tokens are accepted)
  • max streak: max successful predictions in a row for a draft pass. Not currently logged, but getting a high average streak is probably the most important thing for good performance.
  • n_dft/pred/acc: just the number of drafted, target-predicted and target-accepted tokens so far.

When sampling from the target:

sampling target: s_keep =   0, i_dft =   0, i_batch_tgt =   0
sampled token: 29991: '!'
target sampled (29991, '!') orig_p=0.9218, norm_p=0.4483
Shoulda picked seq   0, pos    0, candidate  0 @ p 0.4022:  29991 '!'

orig_p here is candidate.p when picking with whatever sampling settings the user set. norm_p is with only top-K 100 + softmax.

"Shoulda picked" - we also check and log for candidates the draft suggested but that we didn't actually pick. When the target picks one, you'll see that log message (p here is the normalized one from the draft model).

I had to rewrite the KV cache shuffling stuff to make it work with a shared context. I think there might be something wrong there, even though the model produces coherent results.

These are the sampling settings I used:

--ignore-eos
--repeat-last-n 64
--draft 12 
--temp 0.8 
--top-k 40
-l 0-inf 
-l 1-inf 
-n 128

Model used was stellarbright.Q4_K_M.gguf

(The fact that I saw it pick token id 1 once is what makes me think my KV cache shuffling could have an issue.)

@KerfuffleV2
Copy link
Collaborator Author

Might adding layers produce better results?

I'm not completely sure what you mean. Do you mean the "add layers back" mode in perplexity? Maybe; it's hard to tell which approach is going to be better. One problem with that compared to removing layers from a full model: does it really mean anything when you take an 80-layer model, run it with just one MLP or attention layer enabled to measure perplexity, and see that adding layer 1's attention gives 123456.0 perplexity while adding layer 30's MLP gives 111456.0? With the "add layers back" strategy, we'd add layer 30's MLP and call it "better".

It might seem unintuitive, but adding layers can sometimes make perplexity worse (much worse!), and removing layers can sometimes make it better (even for the full model), though usually not by much.

@cebtenzzre
Copy link
Collaborator

Since this is an interesting demonstration, I'll reopen this for visibility.

Labels
demo Demonstrate some concept or idea, not intended to be merged research 🔬