flickr_WSDQH_nbits=16_adaMargin_gamma=1_lambda=0.0001_0002.log
861 lines (861 loc) · 101 KB
2022-10-20 19:54:39,335 prepare dataset.
2022-10-20 19:54:53,632 prepare data loader.
2022-10-20 19:54:53,632 Initializing DataLoader.
2022-10-20 19:54:53,642 DataLoader already.
2022-10-20 19:54:53,642 prepare model.
2022-10-20 19:54:53,861 Number of semantic embeddings: 1178.
2022-10-20 19:55:01,838 From /data/wangjinpeng/anaconda3/envs/py37torch/lib/python3.7/site-packages/tensorflow_core/python/ops/math_grad.py:1424: where (from tensorflow.python.ops.array_ops) is deprecated and will be removed in a future version.
Instructions for updating:
Use tf.where in 2.0, which has the same broadcast rule as np.where.
2022-10-20 19:55:14,517 begin training.
2022-10-20 19:55:34,785 step [ 1], lr [0.0003000], embedding loss [ 0.8958], quantization loss [ 0.0000], 17.73 sec/batch.
2022-10-20 19:55:37,955 step [ 2], lr [0.0003000], embedding loss [ 0.8764], quantization loss [ 0.0000], 0.58 sec/batch.
2022-10-20 19:55:41,212 step [ 3], lr [0.0003000], embedding loss [ 0.8547], quantization loss [ 0.0000], 0.61 sec/batch.
2022-10-20 19:55:44,602 step [ 4], lr [0.0003000], embedding loss [ 0.8475], quantization loss [ 0.0000], 0.63 sec/batch.
2022-10-20 19:55:47,801 step [ 5], lr [0.0003000], embedding loss [ 0.8413], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:55:51,173 step [ 6], lr [0.0003000], embedding loss [ 0.8438], quantization loss [ 0.0000], 0.64 sec/batch.
2022-10-20 19:55:54,544 step [ 7], lr [0.0003000], embedding loss [ 0.8434], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:55:57,783 step [ 8], lr [0.0003000], embedding loss [ 0.8439], quantization loss [ 0.0000], 0.58 sec/batch.
2022-10-20 19:56:01,107 step [ 9], lr [0.0003000], embedding loss [ 0.8390], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:56:04,364 step [ 10], lr [0.0003000], embedding loss [ 0.8447], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:56:07,630 step [ 11], lr [0.0003000], embedding loss [ 0.8405], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:56:10,941 step [ 12], lr [0.0003000], embedding loss [ 0.8352], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:56:14,340 step [ 13], lr [0.0003000], embedding loss [ 0.8496], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:56:17,711 step [ 14], lr [0.0003000], embedding loss [ 0.8453], quantization loss [ 0.0000], 0.58 sec/batch.
2022-10-20 19:56:20,967 step [ 15], lr [0.0003000], embedding loss [ 0.8386], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:56:24,235 step [ 16], lr [0.0003000], embedding loss [ 0.8410], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:56:27,636 step [ 17], lr [0.0003000], embedding loss [ 0.8459], quantization loss [ 0.0000], 0.59 sec/batch.
2022-10-20 19:56:30,882 step [ 18], lr [0.0003000], embedding loss [ 0.8412], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:56:34,192 step [ 19], lr [0.0003000], embedding loss [ 0.8386], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:56:37,503 step [ 20], lr [0.0003000], embedding loss [ 0.8429], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:56:40,816 step [ 21], lr [0.0003000], embedding loss [ 0.8419], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:56:44,037 step [ 22], lr [0.0003000], embedding loss [ 0.8438], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:56:47,259 step [ 23], lr [0.0003000], embedding loss [ 0.8385], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:56:50,581 step [ 24], lr [0.0003000], embedding loss [ 0.8318], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:56:53,952 step [ 25], lr [0.0003000], embedding loss [ 0.8464], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:56:57,176 step [ 26], lr [0.0003000], embedding loss [ 0.8299], quantization loss [ 0.0000], 0.55 sec/batch.
2022-10-20 19:57:00,408 step [ 27], lr [0.0003000], embedding loss [ 0.8335], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:57:03,613 step [ 28], lr [0.0003000], embedding loss [ 0.8340], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:57:06,917 step [ 29], lr [0.0003000], embedding loss [ 0.8371], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:57:10,251 step [ 30], lr [0.0003000], embedding loss [ 0.8336], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:57:13,461 step [ 31], lr [0.0003000], embedding loss [ 0.8416], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:57:16,656 step [ 32], lr [0.0003000], embedding loss [ 0.8440], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:57:20,047 step [ 33], lr [0.0003000], embedding loss [ 0.8289], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:57:23,256 step [ 34], lr [0.0003000], embedding loss [ 0.8348], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:57:26,488 step [ 35], lr [0.0003000], embedding loss [ 0.8416], quantization loss [ 0.0000], 0.55 sec/batch.
2022-10-20 19:57:29,691 step [ 36], lr [0.0003000], embedding loss [ 0.8307], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:57:33,008 step [ 37], lr [0.0003000], embedding loss [ 0.8358], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:57:36,222 step [ 38], lr [0.0003000], embedding loss [ 0.8295], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:57:39,494 step [ 39], lr [0.0003000], embedding loss [ 0.8372], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:57:42,856 step [ 40], lr [0.0003000], embedding loss [ 0.8293], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:57:46,236 step [ 41], lr [0.0003000], embedding loss [ 0.8348], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:57:49,529 step [ 42], lr [0.0003000], embedding loss [ 0.8336], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:57:52,808 step [ 43], lr [0.0003000], embedding loss [ 0.8326], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:57:56,038 step [ 44], lr [0.0003000], embedding loss [ 0.8301], quantization loss [ 0.0000], 0.55 sec/batch.
2022-10-20 19:57:59,241 step [ 45], lr [0.0003000], embedding loss [ 0.8306], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:58:02,564 step [ 46], lr [0.0003000], embedding loss [ 0.8377], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:58:05,896 step [ 47], lr [0.0003000], embedding loss [ 0.8391], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:58:09,208 step [ 48], lr [0.0003000], embedding loss [ 0.8288], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:58:12,457 step [ 49], lr [0.0003000], embedding loss [ 0.8354], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:58:15,588 step [ 50], lr [0.0003000], embedding loss [ 0.8309], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:58:18,827 step [ 51], lr [0.0003000], embedding loss [ 0.8338], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:58:22,181 step [ 52], lr [0.0003000], embedding loss [ 0.8269], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:58:25,556 step [ 53], lr [0.0003000], embedding loss [ 0.8333], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:58:28,648 step [ 54], lr [0.0003000], embedding loss [ 0.8279], quantization loss [ 0.0000], 0.55 sec/batch.
2022-10-20 19:58:31,969 step [ 55], lr [0.0003000], embedding loss [ 0.8340], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:58:35,251 step [ 56], lr [0.0003000], embedding loss [ 0.8278], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:58:38,596 step [ 57], lr [0.0003000], embedding loss [ 0.8324], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:58:41,960 step [ 58], lr [0.0003000], embedding loss [ 0.8321], quantization loss [ 0.0000], 0.55 sec/batch.
2022-10-20 19:58:45,262 step [ 59], lr [0.0003000], embedding loss [ 0.8378], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:58:48,486 step [ 60], lr [0.0003000], embedding loss [ 0.8258], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:58:51,781 step [ 61], lr [0.0003000], embedding loss [ 0.8350], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:58:55,090 step [ 62], lr [0.0003000], embedding loss [ 0.8317], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:58:58,361 step [ 63], lr [0.0003000], embedding loss [ 0.8281], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:59:01,538 step [ 64], lr [0.0003000], embedding loss [ 0.8254], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:59:04,597 step [ 65], lr [0.0003000], embedding loss [ 0.8282], quantization loss [ 0.0000], 0.55 sec/batch.
2022-10-20 19:59:07,927 step [ 66], lr [0.0003000], embedding loss [ 0.8272], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:59:11,194 step [ 67], lr [0.0003000], embedding loss [ 0.8286], quantization loss [ 0.0000], 0.55 sec/batch.
2022-10-20 19:59:14,257 step [ 68], lr [0.0003000], embedding loss [ 0.8268], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:59:17,557 step [ 69], lr [0.0003000], embedding loss [ 0.8281], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:59:20,681 step [ 70], lr [0.0003000], embedding loss [ 0.8338], quantization loss [ 0.0000], 0.55 sec/batch.
2022-10-20 19:59:23,801 step [ 71], lr [0.0003000], embedding loss [ 0.8304], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:59:27,013 step [ 72], lr [0.0003000], embedding loss [ 0.8246], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:59:28,866 step [ 73], lr [0.0003000], embedding loss [ 0.8358], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:59:30,705 step [ 74], lr [0.0003000], embedding loss [ 0.8351], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:59:32,539 step [ 75], lr [0.0003000], embedding loss [ 0.8216], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:59:34,440 step [ 76], lr [0.0003000], embedding loss [ 0.8234], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:59:36,323 step [ 77], lr [0.0003000], embedding loss [ 0.8308], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:59:38,143 step [ 78], lr [0.0003000], embedding loss [ 0.8213], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:59:40,021 step [ 79], lr [0.0003000], embedding loss [ 0.8205], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:59:41,905 step [ 80], lr [0.0003000], embedding loss [ 0.8292], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:59:43,760 step [ 81], lr [0.0003000], embedding loss [ 0.8332], quantization loss [ 0.0000], 0.55 sec/batch.
2022-10-20 19:59:45,648 step [ 82], lr [0.0003000], embedding loss [ 0.8277], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:59:47,541 step [ 83], lr [0.0003000], embedding loss [ 0.8302], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:59:49,376 step [ 84], lr [0.0003000], embedding loss [ 0.8265], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:59:51,268 step [ 85], lr [0.0003000], embedding loss [ 0.8310], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:59:53,131 step [ 86], lr [0.0003000], embedding loss [ 0.8229], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:59:55,002 step [ 87], lr [0.0003000], embedding loss [ 0.8315], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 19:59:56,896 step [ 88], lr [0.0003000], embedding loss [ 0.8190], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 19:59:58,785 step [ 89], lr [0.0003000], embedding loss [ 0.8198], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:00:00,687 step [ 90], lr [0.0003000], embedding loss [ 0.8263], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:00:02,540 step [ 91], lr [0.0003000], embedding loss [ 0.8319], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:00:04,424 step [ 92], lr [0.0003000], embedding loss [ 0.8297], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:00:06,325 step [ 93], lr [0.0003000], embedding loss [ 0.8188], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:00:08,184 step [ 94], lr [0.0003000], embedding loss [ 0.8311], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:00:10,071 step [ 95], lr [0.0003000], embedding loss [ 0.8281], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:00:11,960 step [ 96], lr [0.0003000], embedding loss [ 0.8251], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:00:13,847 step [ 97], lr [0.0003000], embedding loss [ 0.8219], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:00:15,744 step [ 98], lr [0.0003000], embedding loss [ 0.8268], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:00:17,630 step [ 99], lr [0.0003000], embedding loss [ 0.8227], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:00:19,448 step [ 100], lr [0.0003000], embedding loss [ 0.8247], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:00:21,267 step [ 101], lr [0.0003000], embedding loss [ 0.8187], quantization loss [ 0.0000], 0.54 sec/batch.
2022-10-20 20:00:23,084 step [ 102], lr [0.0003000], embedding loss [ 0.8260], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:00:25,038 step [ 103], lr [0.0003000], embedding loss [ 0.8224], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:00:26,982 step [ 104], lr [0.0003000], embedding loss [ 0.8213], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:00:28,883 step [ 105], lr [0.0003000], embedding loss [ 0.8278], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:00:30,760 step [ 106], lr [0.0003000], embedding loss [ 0.8201], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:00:32,647 step [ 107], lr [0.0003000], embedding loss [ 0.8168], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:00:34,541 step [ 108], lr [0.0003000], embedding loss [ 0.8251], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:00:36,441 step [ 109], lr [0.0003000], embedding loss [ 0.8213], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:00:38,301 step [ 110], lr [0.0003000], embedding loss [ 0.8184], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:00:40,136 step [ 111], lr [0.0003000], embedding loss [ 0.8128], quantization loss [ 0.0000], 0.55 sec/batch.
2022-10-20 20:00:42,073 step [ 112], lr [0.0003000], embedding loss [ 0.8288], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:00:43,949 step [ 113], lr [0.0003000], embedding loss [ 0.8251], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:00:45,848 step [ 114], lr [0.0003000], embedding loss [ 0.8164], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:00:47,800 step [ 115], lr [0.0003000], embedding loss [ 0.8287], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:00:49,715 step [ 116], lr [0.0003000], embedding loss [ 0.8266], quantization loss [ 0.0000], 0.58 sec/batch.
2022-10-20 20:00:51,621 step [ 117], lr [0.0003000], embedding loss [ 0.8217], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:00:53,515 step [ 118], lr [0.0003000], embedding loss [ 0.8232], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:00:55,408 step [ 119], lr [0.0003000], embedding loss [ 0.8299], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:00:57,333 step [ 120], lr [0.0003000], embedding loss [ 0.8258], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:00:59,228 step [ 121], lr [0.0003000], embedding loss [ 0.8244], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:01:01,091 step [ 122], lr [0.0003000], embedding loss [ 0.8301], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:01:02,942 step [ 123], lr [0.0003000], embedding loss [ 0.8227], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:01:04,907 step [ 124], lr [0.0003000], embedding loss [ 0.8249], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:01:06,818 step [ 125], lr [0.0003000], embedding loss [ 0.8208], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:01:08,663 step [ 126], lr [0.0003000], embedding loss [ 0.8232], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:01:10,544 step [ 127], lr [0.0003000], embedding loss [ 0.8263], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:01:12,493 step [ 128], lr [0.0003000], embedding loss [ 0.8234], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:01:14,312 step [ 129], lr [0.0003000], embedding loss [ 0.8270], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:01:16,201 step [ 130], lr [0.0003000], embedding loss [ 0.8228], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:01:18,081 step [ 131], lr [0.0003000], embedding loss [ 0.8219], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:01:19,969 step [ 132], lr [0.0003000], embedding loss [ 0.8343], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:01:21,901 step [ 133], lr [0.0003000], embedding loss [ 0.8212], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:01:23,768 step [ 134], lr [0.0003000], embedding loss [ 0.8215], quantization loss [ 0.0000], 0.55 sec/batch.
2022-10-20 20:01:25,632 step [ 135], lr [0.0003000], embedding loss [ 0.8292], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:01:27,530 step [ 136], lr [0.0003000], embedding loss [ 0.8199], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:01:29,449 step [ 137], lr [0.0003000], embedding loss [ 0.8268], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:01:31,389 step [ 138], lr [0.0003000], embedding loss [ 0.8307], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:01:33,284 step [ 139], lr [0.0003000], embedding loss [ 0.8198], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:01:35,121 step [ 140], lr [0.0003000], embedding loss [ 0.8196], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:01:37,053 step [ 141], lr [0.0003000], embedding loss [ 0.8231], quantization loss [ 0.0000], 0.57 sec/batch.
2022-10-20 20:01:38,964 step [ 142], lr [0.0003000], embedding loss [ 0.8208], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:01:40,901 step [ 143], lr [0.0003000], embedding loss [ 0.8228], quantization loss [ 0.0000], 0.56 sec/batch.
2022-10-20 20:01:51,186 step [ 144], lr [0.0003000], embedding loss [ 0.8182], quantization loss [ 0.0000], 1.25 sec/batch.
2022-10-20 20:01:53,540 step [ 145], lr [0.0003000], embedding loss [ 0.8175], quantization loss [ 0.0000], 1.08 sec/batch.
2022-10-20 20:01:55,923 step [ 146], lr [0.0003000], embedding loss [ 0.8222], quantization loss [ 0.0000], 1.08 sec/batch.
2022-10-20 20:01:58,278 step [ 147], lr [0.0003000], embedding loss [ 0.8359], quantization loss [ 0.0000], 1.07 sec/batch.
2022-10-20 20:01:58,279 initialize centers iter(1/1).
2022-10-20 20:02:06,870 finish center initialization, duration: 8.59 sec.
2022-10-20 20:02:06,871 update codes and centers iter(1/1).
2022-10-20 20:02:11,582 number of update_code wrong: 71.
2022-10-20 20:02:15,154 non zero codewords: 512.
2022-10-20 20:02:15,154 finish center update, duration: 8.28 sec.
2022-10-20 20:02:17,515 step [ 148], lr [0.0003000], embedding loss [ 0.8117], quantization loss [ 0.1904], 1.09 sec/batch.
2022-10-20 20:02:19,891 step [ 149], lr [0.0003000], embedding loss [ 0.8364], quantization loss [ 0.3486], 1.09 sec/batch.
2022-10-20 20:02:22,300 step [ 150], lr [0.0003000], embedding loss [ 0.8176], quantization loss [ 0.1788], 1.07 sec/batch.
2022-10-20 20:02:24,819 step [ 151], lr [0.0003000], embedding loss [ 0.8237], quantization loss [ 0.2097], 1.20 sec/batch.
2022-10-20 20:02:27,221 step [ 152], lr [0.0003000], embedding loss [ 0.8179], quantization loss [ 0.2149], 1.07 sec/batch.
2022-10-20 20:02:29,572 step [ 153], lr [0.0003000], embedding loss [ 0.8329], quantization loss [ 0.1531], 1.04 sec/batch.
2022-10-20 20:02:31,935 step [ 154], lr [0.0003000], embedding loss [ 0.8342], quantization loss [ 0.1596], 1.06 sec/batch.
2022-10-20 20:02:34,386 step [ 155], lr [0.0003000], embedding loss [ 0.8330], quantization loss [ 0.1735], 1.19 sec/batch.
2022-10-20 20:02:36,788 step [ 156], lr [0.0003000], embedding loss [ 0.8175], quantization loss [ 0.1507], 1.10 sec/batch.
2022-10-20 20:02:39,113 step [ 157], lr [0.0003000], embedding loss [ 0.8235], quantization loss [ 0.1389], 1.02 sec/batch.
2022-10-20 20:02:41,476 step [ 158], lr [0.0003000], embedding loss [ 0.8285], quantization loss [ 0.1340], 1.08 sec/batch.
2022-10-20 20:02:43,915 step [ 159], lr [0.0003000], embedding loss [ 0.8213], quantization loss [ 0.1308], 1.18 sec/batch.
2022-10-20 20:02:46,297 step [ 160], lr [0.0003000], embedding loss [ 0.8301], quantization loss [ 0.1224], 1.10 sec/batch.
2022-10-20 20:02:48,647 step [ 161], lr [0.0003000], embedding loss [ 0.8232], quantization loss [ 0.1323], 1.01 sec/batch.
2022-10-20 20:02:50,966 step [ 162], lr [0.0003000], embedding loss [ 0.8284], quantization loss [ 0.1242], 1.06 sec/batch.
2022-10-20 20:02:53,481 step [ 163], lr [0.0003000], embedding loss [ 0.8224], quantization loss [ 0.1231], 1.21 sec/batch.
2022-10-20 20:02:55,773 step [ 164], lr [0.0003000], embedding loss [ 0.8216], quantization loss [ 0.1238], 1.07 sec/batch.
2022-10-20 20:02:58,183 step [ 165], lr [0.0003000], embedding loss [ 0.8296], quantization loss [ 0.1134], 1.05 sec/batch.
2022-10-20 20:03:00,695 step [ 166], lr [0.0003000], embedding loss [ 0.8296], quantization loss [ 0.1057], 1.24 sec/batch.
2022-10-20 20:03:03,049 step [ 167], lr [0.0003000], embedding loss [ 0.8290], quantization loss [ 0.1100], 1.07 sec/batch.
2022-10-20 20:03:05,453 step [ 168], lr [0.0003000], embedding loss [ 0.8186], quantization loss [ 0.1147], 1.11 sec/batch.
2022-10-20 20:03:07,763 step [ 169], lr [0.0003000], embedding loss [ 0.8246], quantization loss [ 0.1152], 1.04 sec/batch.
2022-10-20 20:03:10,309 step [ 170], lr [0.0003000], embedding loss [ 0.8217], quantization loss [ 0.1036], 1.24 sec/batch.
2022-10-20 20:03:12,688 step [ 171], lr [0.0003000], embedding loss [ 0.8336], quantization loss [ 0.1129], 1.07 sec/batch.
2022-10-20 20:03:15,031 step [ 172], lr [0.0003000], embedding loss [ 0.8245], quantization loss [ 0.1092], 1.04 sec/batch.
2022-10-20 20:03:17,380 step [ 173], lr [0.0003000], embedding loss [ 0.8201], quantization loss [ 0.1142], 1.04 sec/batch.
2022-10-20 20:03:19,922 step [ 174], lr [0.0003000], embedding loss [ 0.8197], quantization loss [ 0.1032], 1.26 sec/batch.
2022-10-20 20:03:22,303 step [ 175], lr [0.0003000], embedding loss [ 0.8292], quantization loss [ 0.1031], 1.07 sec/batch.
2022-10-20 20:03:24,705 step [ 176], lr [0.0003000], embedding loss [ 0.8298], quantization loss [ 0.0978], 1.10 sec/batch.
2022-10-20 20:03:27,031 step [ 177], lr [0.0003000], embedding loss [ 0.8180], quantization loss [ 0.1181], 1.03 sec/batch.
2022-10-20 20:03:29,578 step [ 178], lr [0.0003000], embedding loss [ 0.8225], quantization loss [ 0.0944], 1.27 sec/batch.
2022-10-20 20:03:32,015 step [ 179], lr [0.0003000], embedding loss [ 0.8202], quantization loss [ 0.1065], 1.08 sec/batch.
2022-10-20 20:03:34,410 step [ 180], lr [0.0003000], embedding loss [ 0.8252], quantization loss [ 0.1064], 1.07 sec/batch.
2022-10-20 20:03:36,767 step [ 181], lr [0.0003000], embedding loss [ 0.8273], quantization loss [ 0.1043], 1.05 sec/batch.
2022-10-20 20:03:39,280 step [ 182], lr [0.0003000], embedding loss [ 0.8206], quantization loss [ 0.1179], 1.25 sec/batch.
2022-10-20 20:03:41,632 step [ 183], lr [0.0003000], embedding loss [ 0.8212], quantization loss [ 0.1016], 1.05 sec/batch.
2022-10-20 20:03:43,962 step [ 184], lr [0.0003000], embedding loss [ 0.8243], quantization loss [ 0.1072], 1.05 sec/batch.
2022-10-20 20:03:46,470 step [ 185], lr [0.0003000], embedding loss [ 0.8204], quantization loss [ 0.1136], 1.20 sec/batch.
2022-10-20 20:03:48,867 step [ 186], lr [0.0003000], embedding loss [ 0.8274], quantization loss [ 0.0997], 1.10 sec/batch.
2022-10-20 20:03:51,172 step [ 187], lr [0.0003000], embedding loss [ 0.8279], quantization loss [ 0.1124], 1.03 sec/batch.
2022-10-20 20:03:53,514 step [ 188], lr [0.0003000], embedding loss [ 0.8218], quantization loss [ 0.1033], 1.08 sec/batch.
2022-10-20 20:03:56,043 step [ 189], lr [0.0003000], embedding loss [ 0.8261], quantization loss [ 0.0959], 1.22 sec/batch.
2022-10-20 20:03:58,384 step [ 190], lr [0.0003000], embedding loss [ 0.8244], quantization loss [ 0.1092], 1.07 sec/batch.
2022-10-20 20:04:00,724 step [ 191], lr [0.0003000], embedding loss [ 0.8155], quantization loss [ 0.0950], 1.05 sec/batch.
2022-10-20 20:04:03,054 step [ 192], lr [0.0003000], embedding loss [ 0.8140], quantization loss [ 0.0977], 1.06 sec/batch.
2022-10-20 20:04:05,588 step [ 193], lr [0.0003000], embedding loss [ 0.8170], quantization loss [ 0.1044], 1.21 sec/batch.
2022-10-20 20:04:07,974 step [ 194], lr [0.0003000], embedding loss [ 0.8304], quantization loss [ 0.1015], 1.09 sec/batch.
2022-10-20 20:04:10,349 step [ 195], lr [0.0003000], embedding loss [ 0.8219], quantization loss [ 0.1091], 1.06 sec/batch.
2022-10-20 20:04:12,745 step [ 196], lr [0.0003000], embedding loss [ 0.8226], quantization loss [ 0.1003], 1.09 sec/batch.
2022-10-20 20:04:15,223 step [ 197], lr [0.0003000], embedding loss [ 0.8235], quantization loss [ 0.0851], 1.18 sec/batch.
2022-10-20 20:04:17,649 step [ 198], lr [0.0003000], embedding loss [ 0.8300], quantization loss [ 0.0807], 1.12 sec/batch.
2022-10-20 20:04:20,024 step [ 199], lr [0.0003000], embedding loss [ 0.8233], quantization loss [ 0.0913], 1.06 sec/batch.
2022-10-20 20:04:22,550 step [ 200], lr [0.0003000], embedding loss [ 0.8227], quantization loss [ 0.0963], 1.24 sec/batch.
2022-10-20 20:04:24,921 step [ 201], lr [0.0003000], embedding loss [ 0.8222], quantization loss [ 0.0879], 1.08 sec/batch.
2022-10-20 20:04:27,320 step [ 202], lr [0.0003000], embedding loss [ 0.8158], quantization loss [ 0.0881], 1.09 sec/batch.
2022-10-20 20:04:29,747 step [ 203], lr [0.0003000], embedding loss [ 0.8297], quantization loss [ 0.1037], 1.07 sec/batch.
2022-10-20 20:04:32,268 step [ 204], lr [0.0003000], embedding loss [ 0.8291], quantization loss [ 0.0886], 1.22 sec/batch.
2022-10-20 20:04:34,650 step [ 205], lr [0.0003000], embedding loss [ 0.8250], quantization loss [ 0.0872], 1.07 sec/batch.
2022-10-20 20:04:37,031 step [ 206], lr [0.0003000], embedding loss [ 0.8164], quantization loss [ 0.0975], 1.08 sec/batch.
2022-10-20 20:04:39,378 step [ 207], lr [0.0003000], embedding loss [ 0.8241], quantization loss [ 0.0843], 1.05 sec/batch.
2022-10-20 20:04:41,937 step [ 208], lr [0.0003000], embedding loss [ 0.8228], quantization loss [ 0.0849], 1.26 sec/batch.
2022-10-20 20:04:44,338 step [ 209], lr [0.0003000], embedding loss [ 0.8105], quantization loss [ 0.0855], 1.08 sec/batch.
2022-10-20 20:04:46,709 step [ 210], lr [0.0003000], embedding loss [ 0.8155], quantization loss [ 0.0846], 1.07 sec/batch.
2022-10-20 20:04:49,073 step [ 211], lr [0.0003000], embedding loss [ 0.8204], quantization loss [ 0.1080], 1.05 sec/batch.
2022-10-20 20:04:51,587 step [ 212], lr [0.0003000], embedding loss [ 0.8262], quantization loss [ 0.0869], 1.21 sec/batch.
2022-10-20 20:04:53,972 step [ 213], lr [0.0003000], embedding loss [ 0.8143], quantization loss [ 0.0944], 1.06 sec/batch.
2022-10-20 20:04:56,338 step [ 214], lr [0.0003000], embedding loss [ 0.8238], quantization loss [ 0.0887], 1.08 sec/batch.
2022-10-20 20:04:58,809 step [ 215], lr [0.0003000], embedding loss [ 0.8135], quantization loss [ 0.0909], 1.19 sec/batch.
2022-10-20 20:05:01,211 step [ 216], lr [0.0003000], embedding loss [ 0.8301], quantization loss [ 0.0852], 1.11 sec/batch.
2022-10-20 20:05:03,584 step [ 217], lr [0.0003000], embedding loss [ 0.8212], quantization loss [ 0.0889], 1.03 sec/batch.
2022-10-20 20:05:05,959 step [ 218], lr [0.0003000], embedding loss [ 0.8200], quantization loss [ 0.1001], 1.09 sec/batch.
2022-10-20 20:05:08,463 step [ 219], lr [0.0003000], embedding loss [ 0.8173], quantization loss [ 0.0987], 1.21 sec/batch.
2022-10-20 20:05:10,869 step [ 220], lr [0.0003000], embedding loss [ 0.8171], quantization loss [ 0.0874], 1.10 sec/batch.
2022-10-20 20:05:13,268 step [ 221], lr [0.0003000], embedding loss [ 0.8198], quantization loss [ 0.0853], 1.08 sec/batch.
2022-10-20 20:05:15,630 step [ 222], lr [0.0003000], embedding loss [ 0.8250], quantization loss [ 0.1028], 1.05 sec/batch.
2022-10-20 20:05:18,163 step [ 223], lr [0.0003000], embedding loss [ 0.8228], quantization loss [ 0.0967], 1.22 sec/batch.
2022-10-20 20:05:20,637 step [ 224], lr [0.0003000], embedding loss [ 0.8297], quantization loss [ 0.0872], 1.16 sec/batch.
2022-10-20 20:05:23,003 step [ 225], lr [0.0003000], embedding loss [ 0.8239], quantization loss [ 0.0863], 1.03 sec/batch.
2022-10-20 20:05:25,813 step [ 226], lr [0.0003000], embedding loss [ 0.8266], quantization loss [ 0.0856], 1.21 sec/batch.
2022-10-20 20:05:28,359 step [ 227], lr [0.0003000], embedding loss [ 0.8204], quantization loss [ 0.0893], 1.23 sec/batch.
2022-10-20 20:05:30,760 step [ 228], lr [0.0003000], embedding loss [ 0.8215], quantization loss [ 0.0864], 1.07 sec/batch.
2022-10-20 20:05:33,133 step [ 229], lr [0.0003000], embedding loss [ 0.8194], quantization loss [ 0.0824], 1.05 sec/batch.
2022-10-20 20:05:35,651 step [ 230], lr [0.0003000], embedding loss [ 0.8169], quantization loss [ 0.0894], 1.23 sec/batch.
2022-10-20 20:05:38,039 step [ 231], lr [0.0003000], embedding loss [ 0.8266], quantization loss [ 0.0921], 1.07 sec/batch.
2022-10-20 20:05:40,498 step [ 232], lr [0.0003000], embedding loss [ 0.8198], quantization loss [ 0.0804], 1.10 sec/batch.
2022-10-20 20:05:42,866 step [ 233], lr [0.0003000], embedding loss [ 0.8300], quantization loss [ 0.0865], 1.03 sec/batch.
2022-10-20 20:05:45,417 step [ 234], lr [0.0003000], embedding loss [ 0.8171], quantization loss [ 0.0905], 1.24 sec/batch.
2022-10-20 20:05:47,832 step [ 235], lr [0.0003000], embedding loss [ 0.8200], quantization loss [ 0.0815], 1.07 sec/batch.
2022-10-20 20:05:50,278 step [ 236], lr [0.0003000], embedding loss [ 0.8209], quantization loss [ 0.0841], 1.09 sec/batch.
2022-10-20 20:05:52,651 step [ 237], lr [0.0003000], embedding loss [ 0.8245], quantization loss [ 0.0867], 1.06 sec/batch.
2022-10-20 20:05:55,156 step [ 238], lr [0.0003000], embedding loss [ 0.8214], quantization loss [ 0.0856], 1.21 sec/batch.
2022-10-20 20:05:57,602 step [ 239], lr [0.0003000], embedding loss [ 0.8192], quantization loss [ 0.0800], 1.11 sec/batch.
2022-10-20 20:06:00,048 step [ 240], lr [0.0003000], embedding loss [ 0.8104], quantization loss [ 0.1063], 1.08 sec/batch.
2022-10-20 20:06:02,408 step [ 241], lr [0.0003000], embedding loss [ 0.8227], quantization loss [ 0.0821], 1.05 sec/batch.
2022-10-20 20:06:04,951 step [ 242], lr [0.0003000], embedding loss [ 0.8269], quantization loss [ 0.0905], 1.25 sec/batch.
2022-10-20 20:06:07,267 step [ 243], lr [0.0003000], embedding loss [ 0.8194], quantization loss [ 0.0849], 1.03 sec/batch.
2022-10-20 20:06:09,645 step [ 244], lr [0.0003000], embedding loss [ 0.8254], quantization loss [ 0.0959], 1.08 sec/batch.
2022-10-20 20:06:11,990 step [ 245], lr [0.0003000], embedding loss [ 0.8189], quantization loss [ 0.0990], 1.04 sec/batch.
2022-10-20 20:06:14,515 step [ 246], lr [0.0003000], embedding loss [ 0.8192], quantization loss [ 0.0850], 1.23 sec/batch.
2022-10-20 20:06:16,948 step [ 247], lr [0.0003000], embedding loss [ 0.8175], quantization loss [ 0.0835], 1.08 sec/batch.
2022-10-20 20:06:19,346 step [ 248], lr [0.0003000], embedding loss [ 0.8185], quantization loss [ 0.0861], 1.07 sec/batch.
2022-10-20 20:06:21,857 step [ 249], lr [0.0003000], embedding loss [ 0.8271], quantization loss [ 0.0826], 1.21 sec/batch.
2022-10-20 20:06:24,195 step [ 250], lr [0.0003000], embedding loss [ 0.8182], quantization loss [ 0.1004], 1.06 sec/batch.
2022-10-20 20:06:26,453 step [ 251], lr [0.0003000], embedding loss [ 0.8163], quantization loss [ 0.0766], 1.00 sec/batch.
2022-10-20 20:06:28,828 step [ 252], lr [0.0003000], embedding loss [ 0.8161], quantization loss [ 0.0869], 1.09 sec/batch.
2022-10-20 20:06:31,260 step [ 253], lr [0.0003000], embedding loss [ 0.8197], quantization loss [ 0.0970], 1.18 sec/batch.
2022-10-20 20:06:33,668 step [ 254], lr [0.0003000], embedding loss [ 0.8355], quantization loss [ 0.0891], 1.11 sec/batch.
2022-10-20 20:06:35,991 step [ 255], lr [0.0003000], embedding loss [ 0.8254], quantization loss [ 0.0869], 1.02 sec/batch.
2022-10-20 20:06:38,391 step [ 256], lr [0.0003000], embedding loss [ 0.8249], quantization loss [ 0.0875], 1.10 sec/batch.
2022-10-20 20:06:40,945 step [ 257], lr [0.0003000], embedding loss [ 0.8170], quantization loss [ 0.0961], 1.23 sec/batch.
2022-10-20 20:06:43,367 step [ 258], lr [0.0003000], embedding loss [ 0.8321], quantization loss [ 0.0811], 1.11 sec/batch.
2022-10-20 20:06:45,753 step [ 259], lr [0.0003000], embedding loss [ 0.8259], quantization loss [ 0.1044], 1.06 sec/batch.
2022-10-20 20:06:48,139 step [ 260], lr [0.0003000], embedding loss [ 0.8145], quantization loss [ 0.0888], 1.09 sec/batch.
2022-10-20 20:06:50,703 step [ 261], lr [0.0003000], embedding loss [ 0.8171], quantization loss [ 0.0810], 1.22 sec/batch.
2022-10-20 20:06:53,156 step [ 262], lr [0.0003000], embedding loss [ 0.8173], quantization loss [ 0.0831], 1.12 sec/batch.
2022-10-20 20:06:55,541 step [ 263], lr [0.0003000], embedding loss [ 0.8305], quantization loss [ 0.0885], 1.06 sec/batch.
2022-10-20 20:06:58,133 step [ 264], lr [0.0003000], embedding loss [ 0.8124], quantization loss [ 0.0836], 1.28 sec/batch.
2022-10-20 20:07:00,548 step [ 265], lr [0.0003000], embedding loss [ 0.8214], quantization loss [ 0.0846], 1.09 sec/batch.
2022-10-20 20:07:02,951 step [ 266], lr [0.0003000], embedding loss [ 0.8132], quantization loss [ 0.0934], 1.07 sec/batch.
2022-10-20 20:07:05,391 step [ 267], lr [0.0003000], embedding loss [ 0.8145], quantization loss [ 0.0782], 1.07 sec/batch.
2022-10-20 20:07:07,979 step [ 268], lr [0.0003000], embedding loss [ 0.8152], quantization loss [ 0.0906], 1.26 sec/batch.
2022-10-20 20:07:10,440 step [ 269], lr [0.0003000], embedding loss [ 0.8212], quantization loss [ 0.0868], 1.08 sec/batch.
2022-10-20 20:07:12,864 step [ 270], lr [0.0003000], embedding loss [ 0.8157], quantization loss [ 0.0818], 1.09 sec/batch.
2022-10-20 20:07:15,194 step [ 271], lr [0.0003000], embedding loss [ 0.8219], quantization loss [ 0.0878], 1.02 sec/batch.
2022-10-20 20:07:17,765 step [ 272], lr [0.0003000], embedding loss [ 0.8250], quantization loss [ 0.0942], 1.26 sec/batch.
2022-10-20 20:07:20,174 step [ 273], lr [0.0003000], embedding loss [ 0.8246], quantization loss [ 0.0914], 1.08 sec/batch.
2022-10-20 20:07:22,556 step [ 274], lr [0.0003000], embedding loss [ 0.8171], quantization loss [ 0.0868], 1.08 sec/batch.
2022-10-20 20:07:24,922 step [ 275], lr [0.0003000], embedding loss [ 0.8198], quantization loss [ 0.0818], 1.06 sec/batch.
2022-10-20 20:07:27,431 step [ 276], lr [0.0003000], embedding loss [ 0.8151], quantization loss [ 0.0875], 1.24 sec/batch.
2022-10-20 20:07:29,826 step [ 277], lr [0.0003000], embedding loss [ 0.8177], quantization loss [ 0.0878], 1.08 sec/batch.
2022-10-20 20:07:32,225 step [ 278], lr [0.0003000], embedding loss [ 0.8225], quantization loss [ 0.0767], 1.09 sec/batch.
2022-10-20 20:07:34,783 step [ 279], lr [0.0003000], embedding loss [ 0.8104], quantization loss [ 0.0904], 1.21 sec/batch.
2022-10-20 20:07:37,211 step [ 280], lr [0.0003000], embedding loss [ 0.8245], quantization loss [ 0.0865], 1.11 sec/batch.
2022-10-20 20:07:39,570 step [ 281], lr [0.0003000], embedding loss [ 0.8266], quantization loss [ 0.0809], 1.03 sec/batch.
2022-10-20 20:07:41,998 step [ 282], lr [0.0003000], embedding loss [ 0.8245], quantization loss [ 0.0791], 1.12 sec/batch.
2022-10-20 20:07:44,557 step [ 283], lr [0.0003000], embedding loss [ 0.8298], quantization loss [ 0.0893], 1.22 sec/batch.
2022-10-20 20:07:47,007 step [ 284], lr [0.0003000], embedding loss [ 0.8263], quantization loss [ 0.0966], 1.11 sec/batch.
2022-10-20 20:07:49,405 step [ 285], lr [0.0003000], embedding loss [ 0.8194], quantization loss [ 0.0924], 1.05 sec/batch.
2022-10-20 20:07:51,813 step [ 286], lr [0.0003000], embedding loss [ 0.8205], quantization loss [ 0.0885], 1.09 sec/batch.
2022-10-20 20:07:54,349 step [ 287], lr [0.0003000], embedding loss [ 0.8191], quantization loss [ 0.1073], 1.21 sec/batch.
2022-10-20 20:07:56,786 step [ 288], lr [0.0003000], embedding loss [ 0.8251], quantization loss [ 0.0929], 1.12 sec/batch.
2022-10-20 20:07:59,133 step [ 289], lr [0.0003000], embedding loss [ 0.8190], quantization loss [ 0.0979], 1.02 sec/batch.
2022-10-20 20:08:01,815 step [ 290], lr [0.0003000], embedding loss [ 0.8223], quantization loss [ 0.0800], 1.21 sec/batch.
2022-10-20 20:08:04,381 step [ 291], lr [0.0003000], embedding loss [ 0.8167], quantization loss [ 0.0862], 1.25 sec/batch.
2022-10-20 20:08:06,880 step [ 292], lr [0.0003000], embedding loss [ 0.8201], quantization loss [ 0.0851], 1.10 sec/batch.
2022-10-20 20:08:09,290 step [ 293], lr [0.0003000], embedding loss [ 0.8229], quantization loss [ 0.1007], 1.08 sec/batch.
2022-10-20 20:08:09,290 update codes and centers iter(1/1).
2022-10-20 20:08:13,240 number of update_code wrong: 0.
2022-10-20 20:08:15,723 non zero codewords: 512.
2022-10-20 20:08:15,723 finish center update, duration: 6.43 sec.
2022-10-20 20:08:18,189 step [ 294], lr [0.0003000], embedding loss [ 0.8128], quantization loss [ 0.0292], 1.23 sec/batch.
2022-10-20 20:08:20,601 step [ 295], lr [0.0003000], embedding loss [ 0.8200], quantization loss [ 0.0310], 1.08 sec/batch.
2022-10-20 20:08:22,997 step [ 296], lr [0.0003000], embedding loss [ 0.8234], quantization loss [ 0.0316], 1.07 sec/batch.
2022-10-20 20:08:25,450 step [ 297], lr [0.0003000], embedding loss [ 0.8122], quantization loss [ 0.0265], 1.06 sec/batch.
2022-10-20 20:08:28,555 step [ 298], lr [0.0003000], embedding loss [ 0.8230], quantization loss [ 0.0254], 1.71 sec/batch.
2022-10-20 20:08:31,665 step [ 299], lr [0.0003000], embedding loss [ 0.8227], quantization loss [ 0.0264], 1.76 sec/batch.
2022-10-20 20:08:34,487 step [ 300], lr [0.0003000], embedding loss [ 0.8200], quantization loss [ 0.0290], 1.54 sec/batch.
2022-10-20 20:08:37,228 step [ 301], lr [0.0001500], embedding loss [ 0.8161], quantization loss [ 0.0272], 1.44 sec/batch.
2022-10-20 20:08:40,046 step [ 302], lr [0.0001500], embedding loss [ 0.8201], quantization loss [ 0.0269], 1.52 sec/batch.
2022-10-20 20:08:42,835 step [ 303], lr [0.0001500], embedding loss [ 0.8222], quantization loss [ 0.0264], 1.46 sec/batch.
2022-10-20 20:08:45,657 step [ 304], lr [0.0001500], embedding loss [ 0.8277], quantization loss [ 0.0257], 1.52 sec/batch.
2022-10-20 20:08:48,442 step [ 305], lr [0.0001500], embedding loss [ 0.8231], quantization loss [ 0.0258], 1.46 sec/batch.
2022-10-20 20:08:51,243 step [ 306], lr [0.0001500], embedding loss [ 0.8233], quantization loss [ 0.0283], 1.49 sec/batch.
2022-10-20 20:08:54,197 step [ 307], lr [0.0001500], embedding loss [ 0.8214], quantization loss [ 0.0278], 1.45 sec/batch.
2022-10-20 20:08:56,943 step [ 308], lr [0.0001500], embedding loss [ 0.8296], quantization loss [ 0.0253], 1.45 sec/batch.
2022-10-20 20:08:59,709 step [ 309], lr [0.0001500], embedding loss [ 0.8228], quantization loss [ 0.0223], 1.45 sec/batch.
2022-10-20 20:09:02,479 step [ 310], lr [0.0001500], embedding loss [ 0.8218], quantization loss [ 0.0262], 1.47 sec/batch.
2022-10-20 20:09:05,261 step [ 311], lr [0.0001500], embedding loss [ 0.8234], quantization loss [ 0.0238], 1.44 sec/batch.
2022-10-20 20:09:08,087 step [ 312], lr [0.0001500], embedding loss [ 0.8200], quantization loss [ 0.0241], 1.49 sec/batch.
2022-10-20 20:09:10,802 step [ 313], lr [0.0001500], embedding loss [ 0.8203], quantization loss [ 0.0237], 1.41 sec/batch.
2022-10-20 20:09:13,583 step [ 314], lr [0.0001500], embedding loss [ 0.8213], quantization loss [ 0.0247], 1.49 sec/batch.
2022-10-20 20:09:16,337 step [ 315], lr [0.0001500], embedding loss [ 0.8162], quantization loss [ 0.0236], 1.45 sec/batch.
2022-10-20 20:09:19,074 step [ 316], lr [0.0001500], embedding loss [ 0.8217], quantization loss [ 0.0276], 1.44 sec/batch.
2022-10-20 20:09:21,835 step [ 317], lr [0.0001500], embedding loss [ 0.8169], quantization loss [ 0.0235], 1.45 sec/batch.
2022-10-20 20:09:24,629 step [ 318], lr [0.0001500], embedding loss [ 0.8248], quantization loss [ 0.0220], 1.50 sec/batch.
2022-10-20 20:09:27,436 step [ 319], lr [0.0001500], embedding loss [ 0.8193], quantization loss [ 0.0239], 1.48 sec/batch.
2022-10-20 20:09:30,272 step [ 320], lr [0.0001500], embedding loss [ 0.8215], quantization loss [ 0.0234], 1.48 sec/batch.
2022-10-20 20:09:33,075 step [ 321], lr [0.0001500], embedding loss [ 0.8272], quantization loss [ 0.0251], 1.46 sec/batch.
2022-10-20 20:09:35,861 step [ 322], lr [0.0001500], embedding loss [ 0.8173], quantization loss [ 0.0227], 1.46 sec/batch.
2022-10-20 20:09:38,637 step [ 323], lr [0.0001500], embedding loss [ 0.8248], quantization loss [ 0.0232], 1.45 sec/batch.
2022-10-20 20:09:41,477 step [ 324], lr [0.0001500], embedding loss [ 0.8167], quantization loss [ 0.0245], 1.50 sec/batch.
2022-10-20 20:09:44,264 step [ 325], lr [0.0001500], embedding loss [ 0.8188], quantization loss [ 0.0228], 1.44 sec/batch.
2022-10-20 20:09:47,081 step [ 326], lr [0.0001500], embedding loss [ 0.8116], quantization loss [ 0.0232], 1.48 sec/batch.
2022-10-20 20:09:49,864 step [ 327], lr [0.0001500], embedding loss [ 0.8249], quantization loss [ 0.0245], 1.45 sec/batch.
2022-10-20 20:09:52,655 step [ 328], lr [0.0001500], embedding loss [ 0.8274], quantization loss [ 0.0226], 1.48 sec/batch.
2022-10-20 20:09:55,418 step [ 329], lr [0.0001500], embedding loss [ 0.8176], quantization loss [ 0.0216], 1.44 sec/batch.
2022-10-20 20:09:58,201 step [ 330], lr [0.0001500], embedding loss [ 0.8226], quantization loss [ 0.0235], 1.48 sec/batch.
2022-10-20 20:10:00,995 step [ 331], lr [0.0001500], embedding loss [ 0.8269], quantization loss [ 0.0243], 1.46 sec/batch.
2022-10-20 20:10:03,812 step [ 332], lr [0.0001500], embedding loss [ 0.8320], quantization loss [ 0.0230], 1.47 sec/batch.
2022-10-20 20:10:06,639 step [ 333], lr [0.0001500], embedding loss [ 0.8223], quantization loss [ 0.0216], 1.43 sec/batch.
2022-10-20 20:10:09,372 step [ 334], lr [0.0001500], embedding loss [ 0.8215], quantization loss [ 0.0207], 1.45 sec/batch.
2022-10-20 20:10:12,109 step [ 335], lr [0.0001500], embedding loss [ 0.8179], quantization loss [ 0.0236], 1.42 sec/batch.
2022-10-20 20:10:14,893 step [ 336], lr [0.0001500], embedding loss [ 0.8204], quantization loss [ 0.0237], 1.49 sec/batch.
2022-10-20 20:10:17,619 step [ 337], lr [0.0001500], embedding loss [ 0.8236], quantization loss [ 0.0227], 1.41 sec/batch.
2022-10-20 20:10:20,398 step [ 338], lr [0.0001500], embedding loss [ 0.8120], quantization loss [ 0.0232], 1.48 sec/batch.
2022-10-20 20:10:23,166 step [ 339], lr [0.0001500], embedding loss [ 0.8203], quantization loss [ 0.0213], 1.44 sec/batch.
2022-10-20 20:10:25,958 step [ 340], lr [0.0001500], embedding loss [ 0.8216], quantization loss [ 0.0250], 1.48 sec/batch.
2022-10-20 20:10:28,830 step [ 341], lr [0.0001500], embedding loss [ 0.8166], quantization loss [ 0.0229], 1.54 sec/batch.
2022-10-20 20:10:31,594 step [ 342], lr [0.0001500], embedding loss [ 0.8211], quantization loss [ 0.0217], 1.44 sec/batch.
2022-10-20 20:10:34,359 step [ 343], lr [0.0001500], embedding loss [ 0.8187], quantization loss [ 0.0235], 1.44 sec/batch.
2022-10-20 20:10:37,187 step [ 344], lr [0.0001500], embedding loss [ 0.8156], quantization loss [ 0.0217], 1.49 sec/batch.
2022-10-20 20:10:39,971 step [ 345], lr [0.0001500], embedding loss [ 0.8195], quantization loss [ 0.0230], 1.44 sec/batch.
2022-10-20 20:10:42,735 step [ 346], lr [0.0001500], embedding loss [ 0.8243], quantization loss [ 0.0214], 1.44 sec/batch.
2022-10-20 20:10:45,497 step [ 347], lr [0.0001500], embedding loss [ 0.8213], quantization loss [ 0.0205], 1.45 sec/batch.
2022-10-20 20:10:48,273 step [ 348], lr [0.0001500], embedding loss [ 0.8276], quantization loss [ 0.0204], 1.46 sec/batch.
2022-10-20 20:10:50,783 step [ 349], lr [0.0001500], embedding loss [ 0.8242], quantization loss [ 0.0225], 1.19 sec/batch.
2022-10-20 20:10:53,588 step [ 350], lr [0.0001500], embedding loss [ 0.8164], quantization loss [ 0.0232], 1.49 sec/batch.
2022-10-20 20:10:56,417 step [ 351], lr [0.0001500], embedding loss [ 0.8187], quantization loss [ 0.0220], 1.47 sec/batch.
2022-10-20 20:10:59,232 step [ 352], lr [0.0001500], embedding loss [ 0.8135], quantization loss [ 0.0236], 1.51 sec/batch.
2022-10-20 20:11:02,027 step [ 353], lr [0.0001500], embedding loss [ 0.8219], quantization loss [ 0.0209], 1.47 sec/batch.
2022-10-20 20:11:04,847 step [ 354], lr [0.0001500], embedding loss [ 0.8203], quantization loss [ 0.0217], 1.50 sec/batch.
2022-10-20 20:11:07,550 step [ 355], lr [0.0001500], embedding loss [ 0.8203], quantization loss [ 0.0240], 1.40 sec/batch.
2022-10-20 20:11:10,369 step [ 356], lr [0.0001500], embedding loss [ 0.8164], quantization loss [ 0.0224], 1.51 sec/batch.
2022-10-20 20:11:13,167 step [ 357], lr [0.0001500], embedding loss [ 0.8250], quantization loss [ 0.0227], 1.46 sec/batch.
2022-10-20 20:11:15,995 step [ 358], lr [0.0001500], embedding loss [ 0.8202], quantization loss [ 0.0219], 1.52 sec/batch.
2022-10-20 20:11:19,255 step [ 359], lr [0.0001500], embedding loss [ 0.8164], quantization loss [ 0.0209], 1.87 sec/batch.
2022-10-20 20:11:22,103 step [ 360], lr [0.0001500], embedding loss [ 0.8219], quantization loss [ 0.0217], 1.53 sec/batch.
2022-10-20 20:11:24,923 step [ 361], lr [0.0001500], embedding loss [ 0.8241], quantization loss [ 0.0203], 1.47 sec/batch.
2022-10-20 20:11:27,791 step [ 362], lr [0.0001500], embedding loss [ 0.8222], quantization loss [ 0.0219], 1.53 sec/batch.
2022-10-20 20:11:30,651 step [ 363], lr [0.0001500], embedding loss [ 0.8132], quantization loss [ 0.0214], 1.48 sec/batch.
2022-10-20 20:11:33,522 step [ 364], lr [0.0001500], embedding loss [ 0.8311], quantization loss [ 0.0206], 1.49 sec/batch.
2022-10-20 20:11:36,289 step [ 365], lr [0.0001500], embedding loss [ 0.8178], quantization loss [ 0.0236], 1.46 sec/batch.
2022-10-20 20:11:39,104 step [ 366], lr [0.0001500], embedding loss [ 0.8165], quantization loss [ 0.0229], 1.50 sec/batch.
2022-10-20 20:11:41,872 step [ 367], lr [0.0001500], embedding loss [ 0.8201], quantization loss [ 0.0227], 1.45 sec/batch.
2022-10-20 20:11:44,695 step [ 368], lr [0.0001500], embedding loss [ 0.8236], quantization loss [ 0.0213], 1.51 sec/batch.
2022-10-20 20:11:47,515 step [ 369], lr [0.0001500], embedding loss [ 0.8182], quantization loss [ 0.0204], 1.47 sec/batch.
2022-10-20 20:11:50,312 step [ 370], lr [0.0001500], embedding loss [ 0.8262], quantization loss [ 0.0222], 1.49 sec/batch.
2022-10-20 20:11:53,146 step [ 371], lr [0.0001500], embedding loss [ 0.8228], quantization loss [ 0.0235], 1.49 sec/batch.
2022-10-20 20:11:55,998 step [ 372], lr [0.0001500], embedding loss [ 0.8191], quantization loss [ 0.0220], 1.51 sec/batch.
2022-10-20 20:11:58,804 step [ 373], lr [0.0001500], embedding loss [ 0.8166], quantization loss [ 0.0200], 1.47 sec/batch.
2022-10-20 20:12:01,645 step [ 374], lr [0.0001500], embedding loss [ 0.8171], quantization loss [ 0.0205], 1.52 sec/batch.
2022-10-20 20:12:04,458 step [ 375], lr [0.0001500], embedding loss [ 0.8187], quantization loss [ 0.0223], 1.46 sec/batch.
2022-10-20 20:12:07,289 step [ 376], lr [0.0001500], embedding loss [ 0.8259], quantization loss [ 0.0230], 1.51 sec/batch.
2022-10-20 20:12:10,150 step [ 377], lr [0.0001500], embedding loss [ 0.8168], quantization loss [ 0.0217], 1.48 sec/batch.
2022-10-20 20:12:13,055 step [ 378], lr [0.0001500], embedding loss [ 0.8194], quantization loss [ 0.0207], 1.52 sec/batch.
2022-10-20 20:12:15,910 step [ 379], lr [0.0001500], embedding loss [ 0.8292], quantization loss [ 0.0233], 1.48 sec/batch.
2022-10-20 20:12:18,781 step [ 380], lr [0.0001500], embedding loss [ 0.8274], quantization loss [ 0.0223], 1.50 sec/batch.
2022-10-20 20:12:21,606 step [ 381], lr [0.0001500], embedding loss [ 0.8182], quantization loss [ 0.0216], 1.48 sec/batch.
2022-10-20 20:12:24,431 step [ 382], lr [0.0001500], embedding loss [ 0.8247], quantization loss [ 0.0204], 1.50 sec/batch.
2022-10-20 20:12:27,244 step [ 383], lr [0.0001500], embedding loss [ 0.8212], quantization loss [ 0.0213], 1.47 sec/batch.
2022-10-20 20:12:30,067 step [ 384], lr [0.0001500], embedding loss [ 0.8154], quantization loss [ 0.0215], 1.49 sec/batch.
2022-10-20 20:12:32,874 step [ 385], lr [0.0001500], embedding loss [ 0.8173], quantization loss [ 0.0223], 1.47 sec/batch.
2022-10-20 20:12:35,753 step [ 386], lr [0.0001500], embedding loss [ 0.8169], quantization loss [ 0.0214], 1.47 sec/batch.
2022-10-20 20:12:38,590 step [ 387], lr [0.0001500], embedding loss [ 0.8187], quantization loss [ 0.0221], 1.49 sec/batch.
2022-10-20 20:12:41,401 step [ 388], lr [0.0001500], embedding loss [ 0.8223], quantization loss [ 0.0212], 1.46 sec/batch.
2022-10-20 20:12:44,156 step [ 389], lr [0.0001500], embedding loss [ 0.8210], quantization loss [ 0.0202], 1.43 sec/batch.
2022-10-20 20:12:46,969 step [ 390], lr [0.0001500], embedding loss [ 0.8221], quantization loss [ 0.0215], 1.50 sec/batch.
2022-10-20 20:12:49,742 step [ 391], lr [0.0001500], embedding loss [ 0.8299], quantization loss [ 0.0221], 1.43 sec/batch.
2022-10-20 20:12:52,527 step [ 392], lr [0.0001500], embedding loss [ 0.8271], quantization loss [ 0.0231], 1.47 sec/batch.
2022-10-20 20:12:55,309 step [ 393], lr [0.0001500], embedding loss [ 0.8185], quantization loss [ 0.0217], 1.44 sec/batch.
2022-10-20 20:12:58,362 step [ 394], lr [0.0001500], embedding loss [ 0.8234], quantization loss [ 0.0197], 1.47 sec/batch.
2022-10-20 20:13:01,120 step [ 395], lr [0.0001500], embedding loss [ 0.8180], quantization loss [ 0.0205], 1.42 sec/batch.
2022-10-20 20:13:03,922 step [ 396], lr [0.0001500], embedding loss [ 0.8169], quantization loss [ 0.0206], 1.48 sec/batch.
2022-10-20 20:13:06,726 step [ 397], lr [0.0001500], embedding loss [ 0.8176], quantization loss [ 0.0231], 1.43 sec/batch.
2022-10-20 20:13:09,486 step [ 398], lr [0.0001500], embedding loss [ 0.8186], quantization loss [ 0.0215], 1.44 sec/batch.
2022-10-20 20:13:12,231 step [ 399], lr [0.0001500], embedding loss [ 0.8190], quantization loss [ 0.0202], 1.40 sec/batch.
2022-10-20 20:13:15,032 step [ 400], lr [0.0001500], embedding loss [ 0.8172], quantization loss [ 0.0220], 1.48 sec/batch.
2022-10-20 20:13:17,837 step [ 401], lr [0.0001500], embedding loss [ 0.8121], quantization loss [ 0.0215], 1.47 sec/batch.
2022-10-20 20:13:20,756 step [ 402], lr [0.0001500], embedding loss [ 0.8211], quantization loss [ 0.0209], 1.57 sec/batch.
2022-10-20 20:13:23,454 step [ 403], lr [0.0001500], embedding loss [ 0.8219], quantization loss [ 0.0223], 1.41 sec/batch.
2022-10-20 20:13:26,331 step [ 404], lr [0.0001500], embedding loss [ 0.8198], quantization loss [ 0.0212], 1.52 sec/batch.
2022-10-20 20:13:29,099 step [ 405], lr [0.0001500], embedding loss [ 0.8242], quantization loss [ 0.0206], 1.41 sec/batch.
2022-10-20 20:13:31,946 step [ 406], lr [0.0001500], embedding loss [ 0.8253], quantization loss [ 0.0209], 1.48 sec/batch.
2022-10-20 20:13:34,702 step [ 407], lr [0.0001500], embedding loss [ 0.8132], quantization loss [ 0.0208], 1.44 sec/batch.
2022-10-20 20:13:37,468 step [ 408], lr [0.0001500], embedding loss [ 0.8227], quantization loss [ 0.0197], 1.46 sec/batch.
2022-10-20 20:13:40,194 step [ 409], lr [0.0001500], embedding loss [ 0.8232], quantization loss [ 0.0211], 1.41 sec/batch.
2022-10-20 20:13:42,982 step [ 410], lr [0.0001500], embedding loss [ 0.8216], quantization loss [ 0.0226], 1.46 sec/batch.
2022-10-20 20:13:46,053 step [ 411], lr [0.0001500], embedding loss [ 0.8137], quantization loss [ 0.0199], 1.61 sec/batch.
2022-10-20 20:13:48,851 step [ 412], lr [0.0001500], embedding loss [ 0.8206], quantization loss [ 0.0211], 1.47 sec/batch.
2022-10-20 20:13:51,411 step [ 413], lr [0.0001500], embedding loss [ 0.8180], quantization loss [ 0.0216], 1.22 sec/batch.
2022-10-20 20:13:54,240 step [ 414], lr [0.0001500], embedding loss [ 0.8203], quantization loss [ 0.0202], 1.50 sec/batch.
2022-10-20 20:13:57,086 step [ 415], lr [0.0001500], embedding loss [ 0.8196], quantization loss [ 0.0197], 1.50 sec/batch.
2022-10-20 20:13:59,962 step [ 416], lr [0.0001500], embedding loss [ 0.8193], quantization loss [ 0.0215], 1.53 sec/batch.
2022-10-20 20:14:02,808 step [ 417], lr [0.0001500], embedding loss [ 0.8236], quantization loss [ 0.0192], 1.47 sec/batch.
2022-10-20 20:14:05,696 step [ 418], lr [0.0001500], embedding loss [ 0.8222], quantization loss [ 0.0199], 1.52 sec/batch.
2022-10-20 20:14:08,698 step [ 419], lr [0.0001500], embedding loss [ 0.8180], quantization loss [ 0.0204], 1.64 sec/batch.
2022-10-20 20:14:11,537 step [ 420], lr [0.0001500], embedding loss [ 0.8219], quantization loss [ 0.0214], 1.52 sec/batch.
2022-10-20 20:14:14,345 step [ 421], lr [0.0001500], embedding loss [ 0.8185], quantization loss [ 0.0207], 1.47 sec/batch.
2022-10-20 20:14:17,209 step [ 422], lr [0.0001500], embedding loss [ 0.8192], quantization loss [ 0.0234], 1.53 sec/batch.
2022-10-20 20:14:20,041 step [ 423], lr [0.0001500], embedding loss [ 0.8180], quantization loss [ 0.0196], 1.45 sec/batch.
2022-10-20 20:14:22,866 step [ 424], lr [0.0001500], embedding loss [ 0.8270], quantization loss [ 0.0202], 1.46 sec/batch.
2022-10-20 20:14:25,654 step [ 425], lr [0.0001500], embedding loss [ 0.8203], quantization loss [ 0.0204], 1.46 sec/batch.
2022-10-20 20:14:28,475 step [ 426], lr [0.0001500], embedding loss [ 0.8141], quantization loss [ 0.0219], 1.49 sec/batch.
2022-10-20 20:14:31,301 step [ 427], lr [0.0001500], embedding loss [ 0.8228], quantization loss [ 0.0216], 1.47 sec/batch.
2022-10-20 20:14:34,097 step [ 428], lr [0.0001500], embedding loss [ 0.8274], quantization loss [ 0.0215], 1.46 sec/batch.
2022-10-20 20:14:36,898 step [ 429], lr [0.0001500], embedding loss [ 0.8174], quantization loss [ 0.0209], 1.47 sec/batch.
2022-10-20 20:14:39,698 step [ 430], lr [0.0001500], embedding loss [ 0.8273], quantization loss [ 0.0206], 1.47 sec/batch.
2022-10-20 20:14:42,502 step [ 431], lr [0.0001500], embedding loss [ 0.8187], quantization loss [ 0.0207], 1.48 sec/batch.
2022-10-20 20:14:45,357 step [ 432], lr [0.0001500], embedding loss [ 0.8093], quantization loss [ 0.0209], 1.49 sec/batch.
2022-10-20 20:14:48,173 step [ 433], lr [0.0001500], embedding loss [ 0.8192], quantization loss [ 0.0200], 1.46 sec/batch.
2022-10-20 20:14:51,039 step [ 434], lr [0.0001500], embedding loss [ 0.8168], quantization loss [ 0.0219], 1.53 sec/batch.
2022-10-20 20:14:53,886 step [ 435], lr [0.0001500], embedding loss [ 0.8267], quantization loss [ 0.0212], 1.48 sec/batch.
2022-10-20 20:14:56,819 step [ 436], lr [0.0001500], embedding loss [ 0.8177], quantization loss [ 0.0206], 1.57 sec/batch.
2022-10-20 20:14:59,604 step [ 437], lr [0.0001500], embedding loss [ 0.8107], quantization loss [ 0.0199], 1.44 sec/batch.
2022-10-20 20:15:02,413 step [ 438], lr [0.0001500], embedding loss [ 0.8266], quantization loss [ 0.0192], 1.49 sec/batch.
2022-10-20 20:15:05,220 step [ 439], lr [0.0001500], embedding loss [ 0.8132], quantization loss [ 0.0198], 1.46 sec/batch.
2022-10-20 20:15:05,220 update codes and centers iter(1/1).
2022-10-20 20:15:09,137 number of update_code wrong: 0.
2022-10-20 20:15:11,898 non zero codewords: 512.
2022-10-20 20:15:11,898 finish center update, duration: 6.68 sec.
2022-10-20 20:15:14,854 step [ 440], lr [0.0001500], embedding loss [ 0.8258], quantization loss [ 0.0143], 1.52 sec/batch.
2022-10-20 20:15:17,704 step [ 441], lr [0.0001500], embedding loss [ 0.8247], quantization loss [ 0.0147], 1.47 sec/batch.
2022-10-20 20:15:20,531 step [ 442], lr [0.0001500], embedding loss [ 0.8157], quantization loss [ 0.0146], 1.49 sec/batch.
2022-10-20 20:15:23,339 step [ 443], lr [0.0001500], embedding loss [ 0.8137], quantization loss [ 0.0134], 1.46 sec/batch.
2022-10-20 20:15:26,268 step [ 444], lr [0.0001500], embedding loss [ 0.8233], quantization loss [ 0.0145], 1.52 sec/batch.
2022-10-20 20:15:29,081 step [ 445], lr [0.0001500], embedding loss [ 0.8289], quantization loss [ 0.0141], 1.45 sec/batch.
2022-10-20 20:15:32,006 step [ 446], lr [0.0001500], embedding loss [ 0.8182], quantization loss [ 0.0153], 1.58 sec/batch.
2022-10-20 20:15:34,843 step [ 447], lr [0.0001500], embedding loss [ 0.8214], quantization loss [ 0.0153], 1.49 sec/batch.
2022-10-20 20:15:37,699 step [ 448], lr [0.0001500], embedding loss [ 0.8096], quantization loss [ 0.0132], 1.49 sec/batch.
2022-10-20 20:15:40,598 step [ 449], lr [0.0001500], embedding loss [ 0.8137], quantization loss [ 0.0143], 1.50 sec/batch.
2022-10-20 20:15:43,452 step [ 450], lr [0.0001500], embedding loss [ 0.8198], quantization loss [ 0.0134], 1.48 sec/batch.
2022-10-20 20:15:46,251 step [ 451], lr [0.0001500], embedding loss [ 0.8317], quantization loss [ 0.0145], 1.45 sec/batch.
2022-10-20 20:15:49,101 step [ 452], lr [0.0001500], embedding loss [ 0.8152], quantization loss [ 0.0148], 1.48 sec/batch.
2022-10-20 20:15:51,890 step [ 453], lr [0.0001500], embedding loss [ 0.8242], quantization loss [ 0.0141], 1.41 sec/batch.
2022-10-20 20:15:54,799 step [ 454], lr [0.0001500], embedding loss [ 0.8215], quantization loss [ 0.0148], 1.50 sec/batch.
2022-10-20 20:15:57,612 step [ 455], lr [0.0001500], embedding loss [ 0.8075], quantization loss [ 0.0133], 1.46 sec/batch.
2022-10-20 20:16:00,413 step [ 456], lr [0.0001500], embedding loss [ 0.8174], quantization loss [ 0.0145], 1.46 sec/batch.
2022-10-20 20:16:03,222 step [ 457], lr [0.0001500], embedding loss [ 0.8199], quantization loss [ 0.0151], 1.47 sec/batch.
2022-10-20 20:16:06,080 step [ 458], lr [0.0001500], embedding loss [ 0.8157], quantization loss [ 0.0139], 1.47 sec/batch.
2022-10-20 20:16:08,854 step [ 459], lr [0.0001500], embedding loss [ 0.8179], quantization loss [ 0.0142], 1.43 sec/batch.
2022-10-20 20:16:11,723 step [ 460], lr [0.0001500], embedding loss [ 0.8174], quantization loss [ 0.0138], 1.49 sec/batch.
2022-10-20 20:16:14,518 step [ 461], lr [0.0001500], embedding loss [ 0.8205], quantization loss [ 0.0129], 1.45 sec/batch.
2022-10-20 20:16:17,327 step [ 462], lr [0.0001500], embedding loss [ 0.8174], quantization loss [ 0.0139], 1.47 sec/batch.
2022-10-20 20:16:20,140 step [ 463], lr [0.0001500], embedding loss [ 0.8323], quantization loss [ 0.0135], 1.46 sec/batch.
2022-10-20 20:16:23,063 step [ 464], lr [0.0001500], embedding loss [ 0.8255], quantization loss [ 0.0135], 1.53 sec/batch.
2022-10-20 20:16:26,043 step [ 465], lr [0.0001500], embedding loss [ 0.8212], quantization loss [ 0.0137], 1.58 sec/batch.
2022-10-20 20:16:28,976 step [ 466], lr [0.0001500], embedding loss [ 0.8138], quantization loss [ 0.0136], 1.53 sec/batch.
2022-10-20 20:16:31,842 step [ 467], lr [0.0001500], embedding loss [ 0.8207], quantization loss [ 0.0144], 1.49 sec/batch.
2022-10-20 20:16:34,729 step [ 468], lr [0.0001500], embedding loss [ 0.8165], quantization loss [ 0.0137], 1.50 sec/batch.
2022-10-20 20:16:37,522 step [ 469], lr [0.0001500], embedding loss [ 0.8113], quantization loss [ 0.0144], 1.43 sec/batch.
2022-10-20 20:16:40,345 step [ 470], lr [0.0001500], embedding loss [ 0.8331], quantization loss [ 0.0129], 1.48 sec/batch.
2022-10-20 20:16:43,209 step [ 471], lr [0.0001500], embedding loss [ 0.8264], quantization loss [ 0.0140], 1.48 sec/batch.
2022-10-20 20:16:46,060 step [ 472], lr [0.0001500], embedding loss [ 0.8238], quantization loss [ 0.0144], 1.49 sec/batch.
2022-10-20 20:16:48,934 step [ 473], lr [0.0001500], embedding loss [ 0.8229], quantization loss [ 0.0140], 1.46 sec/batch.
2022-10-20 20:16:51,757 step [ 474], lr [0.0001500], embedding loss [ 0.8229], quantization loss [ 0.0134], 1.47 sec/batch.
2022-10-20 20:16:54,603 step [ 475], lr [0.0001500], embedding loss [ 0.8229], quantization loss [ 0.0146], 1.44 sec/batch.
2022-10-20 20:16:57,437 step [ 476], lr [0.0001500], embedding loss [ 0.8214], quantization loss [ 0.0126], 1.48 sec/batch.
2022-10-20 20:17:00,023 step [ 477], lr [0.0001500], embedding loss [ 0.8217], quantization loss [ 0.0132], 1.20 sec/batch.
2022-10-20 20:17:02,881 step [ 478], lr [0.0001500], embedding loss [ 0.8295], quantization loss [ 0.0130], 1.49 sec/batch.
2022-10-20 20:17:05,445 step [ 479], lr [0.0001500], embedding loss [ 0.8272], quantization loss [ 0.0133], 1.21 sec/batch.
2022-10-20 20:17:07,894 step [ 480], lr [0.0001500], embedding loss [ 0.8278], quantization loss [ 0.0141], 1.11 sec/batch.
2022-10-20 20:17:10,296 step [ 481], lr [0.0001500], embedding loss [ 0.8122], quantization loss [ 0.0143], 1.04 sec/batch.
2022-10-20 20:17:12,904 step [ 482], lr [0.0001500], embedding loss [ 0.8226], quantization loss [ 0.0129], 1.26 sec/batch.
2022-10-20 20:17:15,418 step [ 483], lr [0.0001500], embedding loss [ 0.8213], quantization loss [ 0.0152], 1.04 sec/batch.
2022-10-20 20:17:17,834 step [ 484], lr [0.0001500], embedding loss [ 0.8199], quantization loss [ 0.0142], 1.09 sec/batch.
2022-10-20 20:17:20,225 step [ 485], lr [0.0001500], embedding loss [ 0.8157], quantization loss [ 0.0138], 1.00 sec/batch.
2022-10-20 20:17:22,776 step [ 486], lr [0.0001500], embedding loss [ 0.8168], quantization loss [ 0.0131], 1.22 sec/batch.
2022-10-20 20:17:25,243 step [ 487], lr [0.0001500], embedding loss [ 0.8240], quantization loss [ 0.0140], 1.08 sec/batch.
2022-10-20 20:17:27,673 step [ 488], lr [0.0001500], embedding loss [ 0.8173], quantization loss [ 0.0129], 1.08 sec/batch.
2022-10-20 20:17:30,087 step [ 489], lr [0.0001500], embedding loss [ 0.8228], quantization loss [ 0.0132], 1.06 sec/batch.
2022-10-20 20:17:32,682 step [ 490], lr [0.0001500], embedding loss [ 0.8097], quantization loss [ 0.0129], 1.25 sec/batch.
2022-10-20 20:17:35,144 step [ 491], lr [0.0001500], embedding loss [ 0.8336], quantization loss [ 0.0146], 1.08 sec/batch.
2022-10-20 20:17:37,989 step [ 492], lr [0.0001500], embedding loss [ 0.8225], quantization loss [ 0.0139], 1.39 sec/batch.
2022-10-20 20:17:41,398 step [ 493], lr [0.0001500], embedding loss [ 0.8208], quantization loss [ 0.0135], 2.07 sec/batch.
2022-10-20 20:17:44,308 step [ 494], lr [0.0001500], embedding loss [ 0.8241], quantization loss [ 0.0145], 1.55 sec/batch.
2022-10-20 20:17:47,237 step [ 495], lr [0.0001500], embedding loss [ 0.8292], quantization loss [ 0.0136], 1.53 sec/batch.
2022-10-20 20:17:50,107 step [ 496], lr [0.0001500], embedding loss [ 0.8144], quantization loss [ 0.0140], 1.53 sec/batch.
2022-10-20 20:17:52,954 step [ 497], lr [0.0001500], embedding loss [ 0.8209], quantization loss [ 0.0134], 1.50 sec/batch.
2022-10-20 20:17:55,818 step [ 498], lr [0.0001500], embedding loss [ 0.8184], quantization loss [ 0.0158], 1.52 sec/batch.
2022-10-20 20:17:58,674 step [ 499], lr [0.0001500], embedding loss [ 0.8320], quantization loss [ 0.0128], 1.44 sec/batch.
2022-10-20 20:18:01,539 step [ 500], lr [0.0001500], embedding loss [ 0.8149], quantization loss [ 0.0141], 1.53 sec/batch.
2022-10-20 20:18:04,319 step [ 501], lr [0.0001500], embedding loss [ 0.8129], quantization loss [ 0.0135], 1.45 sec/batch.
2022-10-20 20:18:07,183 step [ 502], lr [0.0001500], embedding loss [ 0.8206], quantization loss [ 0.0139], 1.53 sec/batch.
2022-10-20 20:18:10,031 step [ 503], lr [0.0001500], embedding loss [ 0.8161], quantization loss [ 0.0142], 1.50 sec/batch.
2022-10-20 20:18:12,900 step [ 504], lr [0.0001500], embedding loss [ 0.8193], quantization loss [ 0.0128], 1.53 sec/batch.
2022-10-20 20:18:15,786 step [ 505], lr [0.0001500], embedding loss [ 0.8186], quantization loss [ 0.0137], 1.54 sec/batch.
2022-10-20 20:18:18,683 step [ 506], lr [0.0001500], embedding loss [ 0.8118], quantization loss [ 0.0138], 1.54 sec/batch.
2022-10-20 20:18:21,584 step [ 507], lr [0.0001500], embedding loss [ 0.8187], quantization loss [ 0.0145], 1.51 sec/batch.
2022-10-20 20:18:24,454 step [ 508], lr [0.0001500], embedding loss [ 0.8144], quantization loss [ 0.0131], 1.52 sec/batch.
2022-10-20 20:18:27,330 step [ 509], lr [0.0001500], embedding loss [ 0.8117], quantization loss [ 0.0138], 1.53 sec/batch.
2022-10-20 20:18:30,195 step [ 510], lr [0.0001500], embedding loss [ 0.8242], quantization loss [ 0.0147], 1.52 sec/batch.
2022-10-20 20:18:33,059 step [ 511], lr [0.0001500], embedding loss [ 0.8258], quantization loss [ 0.0148], 1.50 sec/batch.
2022-10-20 20:18:35,944 step [ 512], lr [0.0001500], embedding loss [ 0.8190], quantization loss [ 0.0139], 1.52 sec/batch.
2022-10-20 20:18:38,768 step [ 513], lr [0.0001500], embedding loss [ 0.8305], quantization loss [ 0.0132], 1.47 sec/batch.
2022-10-20 20:18:41,591 step [ 514], lr [0.0001500], embedding loss [ 0.8168], quantization loss [ 0.0139], 1.50 sec/batch.
2022-10-20 20:18:44,421 step [ 515], lr [0.0001500], embedding loss [ 0.8211], quantization loss [ 0.0129], 1.48 sec/batch.
2022-10-20 20:18:47,287 step [ 516], lr [0.0001500], embedding loss [ 0.8203], quantization loss [ 0.0141], 1.51 sec/batch.
2022-10-20 20:18:50,163 step [ 517], lr [0.0001500], embedding loss [ 0.8235], quantization loss [ 0.0128], 1.51 sec/batch.
2022-10-20 20:18:53,162 step [ 518], lr [0.0001500], embedding loss [ 0.8174], quantization loss [ 0.0141], 1.56 sec/batch.
2022-10-20 20:18:55,801 step [ 519], lr [0.0001500], embedding loss [ 0.8242], quantization loss [ 0.0127], 1.26 sec/batch.
2022-10-20 20:18:58,692 step [ 520], lr [0.0001500], embedding loss [ 0.8197], quantization loss [ 0.0132], 1.54 sec/batch.
2022-10-20 20:19:01,630 step [ 521], lr [0.0001500], embedding loss [ 0.8126], quantization loss [ 0.0137], 1.57 sec/batch.
2022-10-20 20:19:04,591 step [ 522], lr [0.0001500], embedding loss [ 0.8219], quantization loss [ 0.0136], 1.60 sec/batch.
2022-10-20 20:19:07,532 step [ 523], lr [0.0001500], embedding loss [ 0.8133], quantization loss [ 0.0136], 1.56 sec/batch.
2022-10-20 20:19:10,454 step [ 524], lr [0.0001500], embedding loss [ 0.8207], quantization loss [ 0.0132], 1.57 sec/batch.
2022-10-20 20:19:13,326 step [ 525], lr [0.0001500], embedding loss [ 0.8301], quantization loss [ 0.0130], 1.52 sec/batch.
2022-10-20 20:19:16,268 step [ 526], lr [0.0001500], embedding loss [ 0.8246], quantization loss [ 0.0151], 1.58 sec/batch.
2022-10-20 20:19:19,196 step [ 527], lr [0.0001500], embedding loss [ 0.8187], quantization loss [ 0.0143], 1.53 sec/batch.
2022-10-20 20:19:22,130 step [ 528], lr [0.0001500], embedding loss [ 0.8180], quantization loss [ 0.0135], 1.57 sec/batch.
2022-10-20 20:19:25,023 step [ 529], lr [0.0001500], embedding loss [ 0.8218], quantization loss [ 0.0139], 1.52 sec/batch.
2022-10-20 20:19:27,955 step [ 530], lr [0.0001500], embedding loss [ 0.8178], quantization loss [ 0.0139], 1.55 sec/batch.
2022-10-20 20:19:30,871 step [ 531], lr [0.0001500], embedding loss [ 0.8267], quantization loss [ 0.0135], 1.51 sec/batch.
2022-10-20 20:19:33,819 step [ 532], lr [0.0001500], embedding loss [ 0.8166], quantization loss [ 0.0132], 1.53 sec/batch.
2022-10-20 20:19:36,692 step [ 533], lr [0.0001500], embedding loss [ 0.8173], quantization loss [ 0.0135], 1.47 sec/batch.
2022-10-20 20:19:39,510 step [ 534], lr [0.0001500], embedding loss [ 0.8295], quantization loss [ 0.0130], 1.48 sec/batch.
2022-10-20 20:19:42,441 step [ 535], lr [0.0001500], embedding loss [ 0.8181], quantization loss [ 0.0132], 1.51 sec/batch.
2022-10-20 20:19:45,374 step [ 536], lr [0.0001500], embedding loss [ 0.8173], quantization loss [ 0.0134], 1.56 sec/batch.
2022-10-20 20:19:48,289 step [ 537], lr [0.0001500], embedding loss [ 0.8121], quantization loss [ 0.0136], 1.51 sec/batch.
2022-10-20 20:19:51,160 step [ 538], lr [0.0001500], embedding loss [ 0.8230], quantization loss [ 0.0134], 1.51 sec/batch.
2022-10-20 20:19:54,048 step [ 539], lr [0.0001500], embedding loss [ 0.8202], quantization loss [ 0.0127], 1.50 sec/batch.
2022-10-20 20:19:56,973 step [ 540], lr [0.0001500], embedding loss [ 0.8146], quantization loss [ 0.0129], 1.54 sec/batch.
2022-10-20 20:19:59,858 step [ 541], lr [0.0001500], embedding loss [ 0.8209], quantization loss [ 0.0132], 1.50 sec/batch.
2022-10-20 20:20:02,730 step [ 542], lr [0.0001500], embedding loss [ 0.8359], quantization loss [ 0.0122], 1.50 sec/batch.
2022-10-20 20:20:05,592 step [ 543], lr [0.0001500], embedding loss [ 0.8172], quantization loss [ 0.0133], 1.45 sec/batch.
2022-10-20 20:20:08,428 step [ 544], lr [0.0001500], embedding loss [ 0.8227], quantization loss [ 0.0127], 1.49 sec/batch.
2022-10-20 20:20:11,269 step [ 545], lr [0.0001500], embedding loss [ 0.8150], quantization loss [ 0.0138], 1.46 sec/batch.
2022-10-20 20:20:14,130 step [ 546], lr [0.0001500], embedding loss [ 0.8209], quantization loss [ 0.0133], 1.52 sec/batch.
2022-10-20 20:20:17,040 step [ 547], lr [0.0001500], embedding loss [ 0.8256], quantization loss [ 0.0133], 1.51 sec/batch.
2022-10-20 20:20:19,891 step [ 548], lr [0.0001500], embedding loss [ 0.8274], quantization loss [ 0.0131], 1.51 sec/batch.
2022-10-20 20:20:23,035 step [ 549], lr [0.0001500], embedding loss [ 0.8186], quantization loss [ 0.0128], 1.73 sec/batch.
2022-10-20 20:20:25,944 step [ 550], lr [0.0001500], embedding loss [ 0.8172], quantization loss [ 0.0131], 1.54 sec/batch.
2022-10-20 20:20:28,537 step [ 551], lr [0.0001500], embedding loss [ 0.8149], quantization loss [ 0.0135], 1.23 sec/batch.
2022-10-20 20:20:31,437 step [ 552], lr [0.0001500], embedding loss [ 0.8188], quantization loss [ 0.0128], 1.51 sec/batch.
2022-10-20 20:20:34,341 step [ 553], lr [0.0001500], embedding loss [ 0.8235], quantization loss [ 0.0133], 1.55 sec/batch.
2022-10-20 20:20:37,359 step [ 554], lr [0.0001500], embedding loss [ 0.8174], quantization loss [ 0.0128], 1.54 sec/batch.
2022-10-20 20:20:40,258 step [ 555], lr [0.0001500], embedding loss [ 0.8219], quantization loss [ 0.0132], 1.52 sec/batch.
2022-10-20 20:20:43,178 step [ 556], lr [0.0001500], embedding loss [ 0.8241], quantization loss [ 0.0127], 1.56 sec/batch.
2022-10-20 20:20:46,095 step [ 557], lr [0.0001500], embedding loss [ 0.8150], quantization loss [ 0.0129], 1.54 sec/batch.
2022-10-20 20:20:49,032 step [ 558], lr [0.0001500], embedding loss [ 0.8286], quantization loss [ 0.0131], 1.56 sec/batch.
2022-10-20 20:20:51,932 step [ 559], lr [0.0001500], embedding loss [ 0.8221], quantization loss [ 0.0137], 1.52 sec/batch.
2022-10-20 20:20:54,857 step [ 560], lr [0.0001500], embedding loss [ 0.8184], quantization loss [ 0.0143], 1.55 sec/batch.
2022-10-20 20:20:57,781 step [ 561], lr [0.0001500], embedding loss [ 0.8194], quantization loss [ 0.0120], 1.53 sec/batch.
2022-10-20 20:21:00,742 step [ 562], lr [0.0001500], embedding loss [ 0.8149], quantization loss [ 0.0133], 1.57 sec/batch.
2022-10-20 20:21:03,698 step [ 563], lr [0.0001500], embedding loss [ 0.8254], quantization loss [ 0.0135], 1.55 sec/batch.
2022-10-20 20:21:06,670 step [ 564], lr [0.0001500], embedding loss [ 0.8190], quantization loss [ 0.0128], 1.58 sec/batch.
2022-10-20 20:21:09,577 step [ 565], lr [0.0001500], embedding loss [ 0.8129], quantization loss [ 0.0137], 1.50 sec/batch.
2022-10-20 20:21:12,459 step [ 566], lr [0.0001500], embedding loss [ 0.8138], quantization loss [ 0.0130], 1.52 sec/batch.
2022-10-20 20:21:15,378 step [ 567], lr [0.0001500], embedding loss [ 0.8290], quantization loss [ 0.0127], 1.53 sec/batch.
2022-10-20 20:21:18,435 step [ 568], lr [0.0001500], embedding loss [ 0.8309], quantization loss [ 0.0126], 1.66 sec/batch.
2022-10-20 20:21:21,334 step [ 569], lr [0.0001500], embedding loss [ 0.8285], quantization loss [ 0.0126], 1.51 sec/batch.
2022-10-20 20:21:24,303 step [ 570], lr [0.0001500], embedding loss [ 0.8079], quantization loss [ 0.0139], 1.54 sec/batch.
2022-10-20 20:21:27,177 step [ 571], lr [0.0001500], embedding loss [ 0.8167], quantization loss [ 0.0121], 1.48 sec/batch.
2022-10-20 20:21:30,105 step [ 572], lr [0.0001500], embedding loss [ 0.8184], quantization loss [ 0.0131], 1.54 sec/batch.
2022-10-20 20:21:33,088 step [ 573], lr [0.0001500], embedding loss [ 0.8187], quantization loss [ 0.0129], 1.56 sec/batch.
2022-10-20 20:21:35,981 step [ 574], lr [0.0001500], embedding loss [ 0.8118], quantization loss [ 0.0130], 1.53 sec/batch.
2022-10-20 20:21:38,787 step [ 575], lr [0.0001500], embedding loss [ 0.8256], quantization loss [ 0.0122], 1.45 sec/batch.
2022-10-20 20:21:41,763 step [ 576], lr [0.0001500], embedding loss [ 0.8215], quantization loss [ 0.0132], 1.63 sec/batch.
2022-10-20 20:21:44,650 step [ 577], lr [0.0001500], embedding loss [ 0.8152], quantization loss [ 0.0124], 1.49 sec/batch.
2022-10-20 20:21:47,589 step [ 578], lr [0.0001500], embedding loss [ 0.8147], quantization loss [ 0.0129], 1.54 sec/batch.
2022-10-20 20:21:50,465 step [ 579], lr [0.0001500], embedding loss [ 0.8174], quantization loss [ 0.0138], 1.48 sec/batch.
2022-10-20 20:21:53,355 step [ 580], lr [0.0001500], embedding loss [ 0.8126], quantization loss [ 0.0120], 1.52 sec/batch.
2022-10-20 20:21:56,258 step [ 581], lr [0.0001500], embedding loss [ 0.8162], quantization loss [ 0.0133], 1.48 sec/batch.
2022-10-20 20:21:58,896 step [ 582], lr [0.0001500], embedding loss [ 0.8114], quantization loss [ 0.0135], 1.26 sec/batch.
2022-10-20 20:22:01,543 step [ 583], lr [0.0001500], embedding loss [ 0.8134], quantization loss [ 0.0127], 1.27 sec/batch.
2022-10-20 20:22:04,249 step [ 584], lr [0.0001500], embedding loss [ 0.8183], quantization loss [ 0.0140], 1.32 sec/batch.
2022-10-20 20:22:06,947 step [ 585], lr [0.0001500], embedding loss [ 0.8206], quantization loss [ 0.0129], 1.29 sec/batch.
2022-10-20 20:22:06,947 update codes and centers iter(1/1).
2022-10-20 20:22:10,858 number of update_code wrong: 0.
2022-10-20 20:22:13,580 non zero codewords: 512.
2022-10-20 20:22:13,580 finish center update, duration: 6.63 sec.
2022-10-20 20:22:15,964 step [ 586], lr [0.0001500], embedding loss [ 0.8167], quantization loss [ 0.0120], 1.09 sec/batch.
2022-10-20 20:22:18,462 step [ 587], lr [0.0001500], embedding loss [ 0.8185], quantization loss [ 0.0118], 1.08 sec/batch.
2022-10-20 20:22:21,154 step [ 588], lr [0.0001500], embedding loss [ 0.8187], quantization loss [ 0.0113], 1.26 sec/batch.
2022-10-20 20:22:23,603 step [ 589], lr [0.0001500], embedding loss [ 0.8188], quantization loss [ 0.0119], 1.06 sec/batch.
2022-10-20 20:22:26,010 step [ 590], lr [0.0001500], embedding loss [ 0.8209], quantization loss [ 0.0108], 1.05 sec/batch.
2022-10-20 20:22:28,584 step [ 591], lr [0.0001500], embedding loss [ 0.8137], quantization loss [ 0.0117], 1.19 sec/batch.
2022-10-20 20:22:31,052 step [ 592], lr [0.0001500], embedding loss [ 0.8245], quantization loss [ 0.0115], 1.09 sec/batch.
2022-10-20 20:22:33,462 step [ 593], lr [0.0001500], embedding loss [ 0.8280], quantization loss [ 0.0113], 1.04 sec/batch.
2022-10-20 20:22:35,911 step [ 594], lr [0.0001500], embedding loss [ 0.8218], quantization loss [ 0.0118], 1.09 sec/batch.
2022-10-20 20:22:38,543 step [ 595], lr [0.0001500], embedding loss [ 0.8227], quantization loss [ 0.0131], 1.21 sec/batch.
2022-10-20 20:22:40,997 step [ 596], lr [0.0001500], embedding loss [ 0.8153], quantization loss [ 0.0124], 1.10 sec/batch.
2022-10-20 20:22:43,454 step [ 597], lr [0.0001500], embedding loss [ 0.8178], quantization loss [ 0.0119], 1.06 sec/batch.
2022-10-20 20:22:45,943 step [ 598], lr [0.0001500], embedding loss [ 0.8281], quantization loss [ 0.0115], 1.11 sec/batch.
2022-10-20 20:22:48,547 step [ 599], lr [0.0001500], embedding loss [ 0.8283], quantization loss [ 0.0106], 1.22 sec/batch.
2022-10-20 20:22:50,967 step [ 600], lr [0.0001500], embedding loss [ 0.8145], quantization loss [ 0.0113], 1.10 sec/batch.
2022-10-20 20:22:53,395 step [ 601], lr [0.0000750], embedding loss [ 0.8247], quantization loss [ 0.0121], 1.03 sec/batch.
2022-10-20 20:22:55,895 step [ 602], lr [0.0000750], embedding loss [ 0.8173], quantization loss [ 0.0110], 1.11 sec/batch.
2022-10-20 20:22:58,757 step [ 603], lr [0.0000750], embedding loss [ 0.8210], quantization loss [ 0.0119], 1.20 sec/batch.
2022-10-20 20:23:02,575 step [ 604], lr [0.0000750], embedding loss [ 0.8139], quantization loss [ 0.0116], 2.40 sec/batch.
2022-10-20 20:23:05,701 step [ 605], lr [0.0000750], embedding loss [ 0.8245], quantization loss [ 0.0112], 1.74 sec/batch.
2022-10-20 20:23:08,611 step [ 606], lr [0.0000750], embedding loss [ 0.8172], quantization loss [ 0.0115], 1.54 sec/batch.
2022-10-20 20:23:11,536 step [ 607], lr [0.0000750], embedding loss [ 0.8277], quantization loss [ 0.0110], 1.54 sec/batch.
2022-10-20 20:23:14,463 step [ 608], lr [0.0000750], embedding loss [ 0.8258], quantization loss [ 0.0114], 1.56 sec/batch.
2022-10-20 20:23:17,480 step [ 609], lr [0.0000750], embedding loss [ 0.8224], quantization loss [ 0.0113], 1.58 sec/batch.
2022-10-20 20:23:20,476 step [ 610], lr [0.0000750], embedding loss [ 0.8193], quantization loss [ 0.0118], 1.61 sec/batch.
2022-10-20 20:23:23,400 step [ 611], lr [0.0000750], embedding loss [ 0.8242], quantization loss [ 0.0120], 1.53 sec/batch.
2022-10-20 20:23:26,337 step [ 612], lr [0.0000750], embedding loss [ 0.8142], quantization loss [ 0.0120], 1.56 sec/batch.
2022-10-20 20:23:29,281 step [ 613], lr [0.0000750], embedding loss [ 0.8183], quantization loss [ 0.0110], 1.55 sec/batch.
2022-10-20 20:23:32,200 step [ 614], lr [0.0000750], embedding loss [ 0.8175], quantization loss [ 0.0113], 1.56 sec/batch.
2022-10-20 20:23:35,118 step [ 615], lr [0.0000750], embedding loss [ 0.8284], quantization loss [ 0.0114], 1.52 sec/batch.
2022-10-20 20:23:38,045 step [ 616], lr [0.0000750], embedding loss [ 0.8180], quantization loss [ 0.0114], 1.56 sec/batch.
2022-10-20 20:23:40,981 step [ 617], lr [0.0000750], embedding loss [ 0.8209], quantization loss [ 0.0113], 1.53 sec/batch.
2022-10-20 20:23:43,926 step [ 618], lr [0.0000750], embedding loss [ 0.8210], quantization loss [ 0.0114], 1.58 sec/batch.
2022-10-20 20:23:46,822 step [ 619], lr [0.0000750], embedding loss [ 0.8176], quantization loss [ 0.0106], 1.50 sec/batch.
2022-10-20 20:23:49,921 step [ 620], lr [0.0000750], embedding loss [ 0.8176], quantization loss [ 0.0110], 1.58 sec/batch.
2022-10-20 20:23:52,869 step [ 621], lr [0.0000750], embedding loss [ 0.8154], quantization loss [ 0.0110], 1.55 sec/batch.
2022-10-20 20:23:55,831 step [ 622], lr [0.0000750], embedding loss [ 0.8154], quantization loss [ 0.0118], 1.56 sec/batch.
2022-10-20 20:23:58,735 step [ 623], lr [0.0000750], embedding loss [ 0.8159], quantization loss [ 0.0115], 1.52 sec/batch.
2022-10-20 20:24:01,682 step [ 624], lr [0.0000750], embedding loss [ 0.8208], quantization loss [ 0.0116], 1.58 sec/batch.
2022-10-20 20:24:04,633 step [ 625], lr [0.0000750], embedding loss [ 0.8266], quantization loss [ 0.0124], 1.56 sec/batch.
2022-10-20 20:24:07,602 step [ 626], lr [0.0000750], embedding loss [ 0.8209], quantization loss [ 0.0113], 1.58 sec/batch.
2022-10-20 20:24:10,562 step [ 627], lr [0.0000750], embedding loss [ 0.8130], quantization loss [ 0.0118], 1.53 sec/batch.
2022-10-20 20:24:13,531 step [ 628], lr [0.0000750], embedding loss [ 0.8262], quantization loss [ 0.0120], 1.56 sec/batch.
2022-10-20 20:24:16,421 step [ 629], lr [0.0000750], embedding loss [ 0.8155], quantization loss [ 0.0111], 1.51 sec/batch.
2022-10-20 20:24:19,374 step [ 630], lr [0.0000750], embedding loss [ 0.8192], quantization loss [ 0.0125], 1.57 sec/batch.
2022-10-20 20:24:22,313 step [ 631], lr [0.0000750], embedding loss [ 0.8088], quantization loss [ 0.0110], 1.54 sec/batch.
2022-10-20 20:24:25,346 step [ 632], lr [0.0000750], embedding loss [ 0.8200], quantization loss [ 0.0113], 1.59 sec/batch.
2022-10-20 20:24:28,245 step [ 633], lr [0.0000750], embedding loss [ 0.8190], quantization loss [ 0.0105], 1.50 sec/batch.
2022-10-20 20:24:31,151 step [ 634], lr [0.0000750], embedding loss [ 0.8252], quantization loss [ 0.0122], 1.52 sec/batch.
2022-10-20 20:24:34,067 step [ 635], lr [0.0000750], embedding loss [ 0.8146], quantization loss [ 0.0114], 1.52 sec/batch.
2022-10-20 20:24:37,011 step [ 636], lr [0.0000750], embedding loss [ 0.8170], quantization loss [ 0.0117], 1.58 sec/batch.
2022-10-20 20:24:39,951 step [ 637], lr [0.0000750], embedding loss [ 0.8181], quantization loss [ 0.0105], 1.56 sec/batch.
2022-10-20 20:24:42,891 step [ 638], lr [0.0000750], embedding loss [ 0.8174], quantization loss [ 0.0102], 1.55 sec/batch.
2022-10-20 20:24:45,836 step [ 639], lr [0.0000750], embedding loss [ 0.8175], quantization loss [ 0.0115], 1.54 sec/batch.
2022-10-20 20:24:48,847 step [ 640], lr [0.0000750], embedding loss [ 0.8159], quantization loss [ 0.0112], 1.61 sec/batch.
2022-10-20 20:24:51,776 step [ 641], lr [0.0000750], embedding loss [ 0.8258], quantization loss [ 0.0113], 1.52 sec/batch.
2022-10-20 20:24:54,717 step [ 642], lr [0.0000750], embedding loss [ 0.8192], quantization loss [ 0.0122], 1.57 sec/batch.
2022-10-20 20:24:57,626 step [ 643], lr [0.0000750], embedding loss [ 0.8148], quantization loss [ 0.0104], 1.53 sec/batch.
2022-10-20 20:25:00,545 step [ 644], lr [0.0000750], embedding loss [ 0.8183], quantization loss [ 0.0113], 1.55 sec/batch.
2022-10-20 20:25:03,463 step [ 645], lr [0.0000750], embedding loss [ 0.8239], quantization loss [ 0.0117], 1.53 sec/batch.
2022-10-20 20:25:06,446 step [ 646], lr [0.0000750], embedding loss [ 0.8219], quantization loss [ 0.0119], 1.58 sec/batch.
2022-10-20 20:25:09,380 step [ 647], lr [0.0000750], embedding loss [ 0.8229], quantization loss [ 0.0111], 1.50 sec/batch.
2022-10-20 20:25:12,267 step [ 648], lr [0.0000750], embedding loss [ 0.8228], quantization loss [ 0.0108], 1.52 sec/batch.
2022-10-20 20:25:15,121 step [ 649], lr [0.0000750], embedding loss [ 0.8129], quantization loss [ 0.0123], 1.48 sec/batch.
2022-10-20 20:25:18,006 step [ 650], lr [0.0000750], embedding loss [ 0.8134], quantization loss [ 0.0109], 1.52 sec/batch.
2022-10-20 20:25:20,903 step [ 651], lr [0.0000750], embedding loss [ 0.8240], quantization loss [ 0.0110], 1.52 sec/batch.
2022-10-20 20:25:23,826 step [ 652], lr [0.0000750], embedding loss [ 0.8240], quantization loss [ 0.0118], 1.55 sec/batch.
2022-10-20 20:25:26,709 step [ 653], lr [0.0000750], embedding loss [ 0.8269], quantization loss [ 0.0101], 1.50 sec/batch.
2022-10-20 20:25:29,726 step [ 654], lr [0.0000750], embedding loss [ 0.8207], quantization loss [ 0.0114], 1.58 sec/batch.
2022-10-20 20:25:32,623 step [ 655], lr [0.0000750], embedding loss [ 0.8236], quantization loss [ 0.0110], 1.50 sec/batch.
2022-10-20 20:25:35,540 step [ 656], lr [0.0000750], embedding loss [ 0.8162], quantization loss [ 0.0106], 1.54 sec/batch.
2022-10-20 20:25:38,475 step [ 657], lr [0.0000750], embedding loss [ 0.8221], quantization loss [ 0.0114], 1.53 sec/batch.
2022-10-20 20:25:41,445 step [ 658], lr [0.0000750], embedding loss [ 0.8159], quantization loss [ 0.0112], 1.55 sec/batch.
2022-10-20 20:25:44,352 step [ 659], lr [0.0000750], embedding loss [ 0.8251], quantization loss [ 0.0116], 1.50 sec/batch.
2022-10-20 20:25:47,262 step [ 660], lr [0.0000750], embedding loss [ 0.8166], quantization loss [ 0.0106], 1.50 sec/batch.
2022-10-20 20:25:49,918 step [ 661], lr [0.0000750], embedding loss [ 0.8211], quantization loss [ 0.0114], 1.27 sec/batch.
2022-10-20 20:25:52,854 step [ 662], lr [0.0000750], embedding loss [ 0.8244], quantization loss [ 0.0111], 1.55 sec/batch.
2022-10-20 20:25:55,753 step [ 663], lr [0.0000750], embedding loss [ 0.8236], quantization loss [ 0.0106], 1.51 sec/batch.
2022-10-20 20:25:58,783 step [ 664], lr [0.0000750], embedding loss [ 0.8167], quantization loss [ 0.0114], 1.60 sec/batch.
2022-10-20 20:26:01,747 step [ 665], lr [0.0000750], embedding loss [ 0.8204], quantization loss [ 0.0123], 1.55 sec/batch.
2022-10-20 20:26:04,696 step [ 666], lr [0.0000750], embedding loss [ 0.8316], quantization loss [ 0.0114], 1.57 sec/batch.
2022-10-20 20:26:07,633 step [ 667], lr [0.0000750], embedding loss [ 0.8174], quantization loss [ 0.0115], 1.55 sec/batch.
2022-10-20 20:26:10,573 step [ 668], lr [0.0000750], embedding loss [ 0.8200], quantization loss [ 0.0112], 1.56 sec/batch.
2022-10-20 20:26:13,526 step [ 669], lr [0.0000750], embedding loss [ 0.8190], quantization loss [ 0.0109], 1.55 sec/batch.
2022-10-20 20:26:16,497 step [ 670], lr [0.0000750], embedding loss [ 0.8200], quantization loss [ 0.0106], 1.59 sec/batch.
2022-10-20 20:26:19,458 step [ 671], lr [0.0000750], embedding loss [ 0.8165], quantization loss [ 0.0122], 1.55 sec/batch.
2022-10-20 20:26:22,393 step [ 672], lr [0.0000750], embedding loss [ 0.8131], quantization loss [ 0.0108], 1.54 sec/batch.
2022-10-20 20:26:25,334 step [ 673], lr [0.0000750], embedding loss [ 0.8161], quantization loss [ 0.0118], 1.54 sec/batch.
2022-10-20 20:26:28,306 step [ 674], lr [0.0000750], embedding loss [ 0.8129], quantization loss [ 0.0118], 1.59 sec/batch.
2022-10-20 20:26:31,242 step [ 675], lr [0.0000750], embedding loss [ 0.8079], quantization loss [ 0.0110], 1.53 sec/batch.
2022-10-20 20:26:34,226 step [ 676], lr [0.0000750], embedding loss [ 0.8171], quantization loss [ 0.0105], 1.58 sec/batch.
2022-10-20 20:26:37,161 step [ 677], lr [0.0000750], embedding loss [ 0.8170], quantization loss [ 0.0111], 1.53 sec/batch.
2022-10-20 20:26:40,127 step [ 678], lr [0.0000750], embedding loss [ 0.8150], quantization loss [ 0.0112], 1.55 sec/batch.
2022-10-20 20:26:42,830 step [ 679], lr [0.0000750], embedding loss [ 0.8232], quantization loss [ 0.0113], 1.31 sec/batch.
2022-10-20 20:26:46,048 step [ 680], lr [0.0000750], embedding loss [ 0.8201], quantization loss [ 0.0110], 1.76 sec/batch.
2022-10-20 20:26:49,243 step [ 681], lr [0.0000750], embedding loss [ 0.8174], quantization loss [ 0.0112], 1.70 sec/batch.
2022-10-20 20:26:52,422 step [ 682], lr [0.0000750], embedding loss [ 0.8183], quantization loss [ 0.0113], 1.81 sec/batch.
2022-10-20 20:26:55,424 step [ 683], lr [0.0000750], embedding loss [ 0.8137], quantization loss [ 0.0113], 1.54 sec/batch.
2022-10-20 20:26:58,421 step [ 684], lr [0.0000750], embedding loss [ 0.8186], quantization loss [ 0.0114], 1.57 sec/batch.
2022-10-20 20:27:01,360 step [ 685], lr [0.0000750], embedding loss [ 0.8145], quantization loss [ 0.0107], 1.53 sec/batch.
2022-10-20 20:27:04,398 step [ 686], lr [0.0000750], embedding loss [ 0.8245], quantization loss [ 0.0115], 1.58 sec/batch.
2022-10-20 20:27:07,352 step [ 687], lr [0.0000750], embedding loss [ 0.8283], quantization loss [ 0.0107], 1.53 sec/batch.
2022-10-20 20:27:10,333 step [ 688], lr [0.0000750], embedding loss [ 0.8173], quantization loss [ 0.0113], 1.57 sec/batch.
2022-10-20 20:27:13,272 step [ 689], lr [0.0000750], embedding loss [ 0.8199], quantization loss [ 0.0106], 1.52 sec/batch.
2022-10-20 20:27:16,243 step [ 690], lr [0.0000750], embedding loss [ 0.8163], quantization loss [ 0.0112], 1.58 sec/batch.
2022-10-20 20:27:19,138 step [ 691], lr [0.0000750], embedding loss [ 0.8219], quantization loss [ 0.0109], 1.47 sec/batch.
2022-10-20 20:27:22,146 step [ 692], lr [0.0000750], embedding loss [ 0.8179], quantization loss [ 0.0110], 1.56 sec/batch.
2022-10-20 20:27:25,052 step [ 693], lr [0.0000750], embedding loss [ 0.8138], quantization loss [ 0.0106], 1.52 sec/batch.
2022-10-20 20:27:27,990 step [ 694], lr [0.0000750], embedding loss [ 0.8194], quantization loss [ 0.0113], 1.55 sec/batch.
2022-10-20 20:27:30,930 step [ 695], lr [0.0000750], embedding loss [ 0.8171], quantization loss [ 0.0114], 1.54 sec/batch.
2022-10-20 20:27:33,340 step [ 696], lr [0.0000750], embedding loss [ 0.8153], quantization loss [ 0.0117], 1.03 sec/batch.
2022-10-20 20:27:36,337 step [ 697], lr [0.0000750], embedding loss [ 0.8122], quantization loss [ 0.0102], 1.51 sec/batch.
2022-10-20 20:27:39,405 step [ 698], lr [0.0000750], embedding loss [ 0.8230], quantization loss [ 0.0109], 1.60 sec/batch.
2022-10-20 20:27:42,397 step [ 699], lr [0.0000750], embedding loss [ 0.8129], quantization loss [ 0.0109], 1.55 sec/batch.
2022-10-20 20:27:45,378 step [ 700], lr [0.0000750], embedding loss [ 0.8280], quantization loss [ 0.0126], 1.58 sec/batch.
2022-10-20 20:27:48,400 step [ 701], lr [0.0000750], embedding loss [ 0.8262], quantization loss [ 0.0109], 1.56 sec/batch.
2022-10-20 20:27:51,461 step [ 702], lr [0.0000750], embedding loss [ 0.8136], quantization loss [ 0.0117], 1.59 sec/batch.
2022-10-20 20:27:54,434 step [ 703], lr [0.0000750], embedding loss [ 0.8192], quantization loss [ 0.0118], 1.55 sec/batch.
2022-10-20 20:27:57,404 step [ 704], lr [0.0000750], embedding loss [ 0.8084], quantization loss [ 0.0116], 1.56 sec/batch.
2022-10-20 20:28:00,330 step [ 705], lr [0.0000750], embedding loss [ 0.8232], quantization loss [ 0.0113], 1.53 sec/batch.
2022-10-20 20:28:03,348 step [ 706], lr [0.0000750], embedding loss [ 0.8254], quantization loss [ 0.0110], 1.58 sec/batch.
2022-10-20 20:28:06,324 step [ 707], lr [0.0000750], embedding loss [ 0.8247], quantization loss [ 0.0102], 1.56 sec/batch.
2022-10-20 20:28:09,355 step [ 708], lr [0.0000750], embedding loss [ 0.8225], quantization loss [ 0.0114], 1.59 sec/batch.
2022-10-20 20:28:12,302 step [ 709], lr [0.0000750], embedding loss [ 0.8256], quantization loss [ 0.0104], 1.53 sec/batch.
2022-10-20 20:28:15,274 step [ 710], lr [0.0000750], embedding loss [ 0.8272], quantization loss [ 0.0108], 1.58 sec/batch.
2022-10-20 20:28:18,740 step [ 711], lr [0.0000750], embedding loss [ 0.8198], quantization loss [ 0.0114], 2.06 sec/batch.
2022-10-20 20:28:21,700 step [ 712], lr [0.0000750], embedding loss [ 0.8186], quantization loss [ 0.0108], 1.57 sec/batch.
2022-10-20 20:28:24,643 step [ 713], lr [0.0000750], embedding loss [ 0.8199], quantization loss [ 0.0108], 1.54 sec/batch.
2022-10-20 20:28:27,613 step [ 714], lr [0.0000750], embedding loss [ 0.8212], quantization loss [ 0.0102], 1.58 sec/batch.
2022-10-20 20:28:30,553 step [ 715], lr [0.0000750], embedding loss [ 0.8285], quantization loss [ 0.0107], 1.52 sec/batch.
2022-10-20 20:28:33,535 step [ 716], lr [0.0000750], embedding loss [ 0.8264], quantization loss [ 0.0113], 1.57 sec/batch.
2022-10-20 20:28:36,471 step [ 717], lr [0.0000750], embedding loss [ 0.8224], quantization loss [ 0.0119], 1.53 sec/batch.
2022-10-20 20:28:39,416 step [ 718], lr [0.0000750], embedding loss [ 0.8166], quantization loss [ 0.0116], 1.55 sec/batch.
2022-10-20 20:28:42,368 step [ 719], lr [0.0000750], embedding loss [ 0.8096], quantization loss [ 0.0120], 1.54 sec/batch.
2022-10-20 20:28:45,382 step [ 720], lr [0.0000750], embedding loss [ 0.8178], quantization loss [ 0.0109], 1.62 sec/batch.
2022-10-20 20:28:48,314 step [ 721], lr [0.0000750], embedding loss [ 0.8296], quantization loss [ 0.0111], 1.52 sec/batch.
2022-10-20 20:28:51,264 step [ 722], lr [0.0000750], embedding loss [ 0.8297], quantization loss [ 0.0119], 1.56 sec/batch.
2022-10-20 20:28:54,191 step [ 723], lr [0.0000750], embedding loss [ 0.8224], quantization loss [ 0.0111], 1.52 sec/batch.
2022-10-20 20:28:57,105 step [ 724], lr [0.0000750], embedding loss [ 0.8183], quantization loss [ 0.0110], 1.54 sec/batch.
2022-10-20 20:28:59,984 step [ 725], lr [0.0000750], embedding loss [ 0.8187], quantization loss [ 0.0109], 1.49 sec/batch.
2022-10-20 20:29:02,898 step [ 726], lr [0.0000750], embedding loss [ 0.8289], quantization loss [ 0.0111], 1.52 sec/batch.
2022-10-20 20:29:05,855 step [ 727], lr [0.0000750], embedding loss [ 0.8261], quantization loss [ 0.0115], 1.55 sec/batch.
2022-10-20 20:29:08,802 step [ 728], lr [0.0000750], embedding loss [ 0.8180], quantization loss [ 0.0115], 1.56 sec/batch.
2022-10-20 20:29:11,718 step [ 729], lr [0.0000750], embedding loss [ 0.8235], quantization loss [ 0.0112], 1.51 sec/batch.
2022-10-20 20:29:14,675 step [ 730], lr [0.0000750], embedding loss [ 0.8259], quantization loss [ 0.0104], 1.56 sec/batch.
2022-10-20 20:29:17,610 step [ 731], lr [0.0000750], embedding loss [ 0.8183], quantization loss [ 0.0118], 1.53 sec/batch.
2022-10-20 20:29:17,611 update codes and centers iter(1/1).
2022-10-20 20:29:21,573 number of update_code wrong: 0.
2022-10-20 20:29:24,413 non zero codewords: 512.
2022-10-20 20:29:24,414 finish center update, duration: 6.80 sec.
2022-10-20 20:29:27,255 step [ 732], lr [0.0000750], embedding loss [ 0.8166], quantization loss [ 0.0104], 1.51 sec/batch.
2022-10-20 20:29:30,139 step [ 733], lr [0.0000750], embedding loss [ 0.8199], quantization loss [ 0.0103], 1.50 sec/batch.
2022-10-20 20:29:33,109 step [ 734], lr [0.0000750], embedding loss [ 0.8243], quantization loss [ 0.0102], 1.58 sec/batch.
2022-10-20 20:29:36,048 step [ 735], lr [0.0000750], embedding loss [ 0.8132], quantization loss [ 0.0104], 1.51 sec/batch.
2022-10-20 20:29:39,169 step [ 736], lr [0.0000750], embedding loss [ 0.8275], quantization loss [ 0.0104], 1.74 sec/batch.
2022-10-20 20:29:42,100 step [ 737], lr [0.0000750], embedding loss [ 0.8175], quantization loss [ 0.0108], 1.53 sec/batch.
2022-10-20 20:29:45,046 step [ 738], lr [0.0000750], embedding loss [ 0.8265], quantization loss [ 0.0104], 1.56 sec/batch.
2022-10-20 20:29:47,752 step [ 739], lr [0.0000750], embedding loss [ 0.8233], quantization loss [ 0.0098], 1.29 sec/batch.
2022-10-20 20:29:50,636 step [ 740], lr [0.0000750], embedding loss [ 0.8108], quantization loss [ 0.0097], 1.51 sec/batch.
2022-10-20 20:29:53,589 step [ 741], lr [0.0000750], embedding loss [ 0.8162], quantization loss [ 0.0097], 1.55 sec/batch.
2022-10-20 20:29:56,571 step [ 742], lr [0.0000750], embedding loss [ 0.8207], quantization loss [ 0.0107], 1.58 sec/batch.
2022-10-20 20:29:59,510 step [ 743], lr [0.0000750], embedding loss [ 0.8278], quantization loss [ 0.0100], 1.52 sec/batch.
2022-10-20 20:30:02,551 step [ 744], lr [0.0000750], embedding loss [ 0.8185], quantization loss [ 0.0100], 1.61 sec/batch.
2022-10-20 20:30:05,610 step [ 745], lr [0.0000750], embedding loss [ 0.8092], quantization loss [ 0.0101], 1.59 sec/batch.
2022-10-20 20:30:08,622 step [ 746], lr [0.0000750], embedding loss [ 0.8193], quantization loss [ 0.0103], 1.60 sec/batch.
2022-10-20 20:30:11,593 step [ 747], lr [0.0000750], embedding loss [ 0.8139], quantization loss [ 0.0111], 1.55 sec/batch.
2022-10-20 20:30:14,611 step [ 748], lr [0.0000750], embedding loss [ 0.8206], quantization loss [ 0.0099], 1.60 sec/batch.
2022-10-20 20:30:17,684 step [ 749], lr [0.0000750], embedding loss [ 0.8148], quantization loss [ 0.0094], 1.60 sec/batch.
2022-10-20 20:30:20,696 step [ 750], lr [0.0000750], embedding loss [ 0.8157], quantization loss [ 0.0096], 1.61 sec/batch.
2022-10-20 20:30:23,669 step [ 751], lr [0.0000750], embedding loss [ 0.8256], quantization loss [ 0.0097], 1.56 sec/batch.
2022-10-20 20:30:26,663 step [ 752], lr [0.0000750], embedding loss [ 0.8193], quantization loss [ 0.0099], 1.59 sec/batch.
2022-10-20 20:30:29,728 step [ 753], lr [0.0000750], embedding loss [ 0.8262], quantization loss [ 0.0095], 1.57 sec/batch.
2022-10-20 20:30:32,729 step [ 754], lr [0.0000750], embedding loss [ 0.8203], quantization loss [ 0.0102], 1.57 sec/batch.
2022-10-20 20:30:35,749 step [ 755], lr [0.0000750], embedding loss [ 0.8208], quantization loss [ 0.0103], 1.58 sec/batch.
2022-10-20 20:30:38,681 step [ 756], lr [0.0000750], embedding loss [ 0.8206], quantization loss [ 0.0103], 1.53 sec/batch.
2022-10-20 20:30:41,620 step [ 757], lr [0.0000750], embedding loss [ 0.8177], quantization loss [ 0.0105], 1.53 sec/batch.
2022-10-20 20:30:44,611 step [ 758], lr [0.0000750], embedding loss [ 0.8242], quantization loss [ 0.0098], 1.59 sec/batch.
2022-10-20 20:30:47,561 step [ 759], lr [0.0000750], embedding loss [ 0.8239], quantization loss [ 0.0101], 1.55 sec/batch.
2022-10-20 20:30:50,652 step [ 760], lr [0.0000750], embedding loss [ 0.8165], quantization loss [ 0.0107], 1.62 sec/batch.
2022-10-20 20:30:53,603 step [ 761], lr [0.0000750], embedding loss [ 0.8232], quantization loss [ 0.0104], 1.54 sec/batch.
2022-10-20 20:30:56,576 step [ 762], lr [0.0000750], embedding loss [ 0.8123], quantization loss [ 0.0095], 1.56 sec/batch.
2022-10-20 20:30:59,495 step [ 763], lr [0.0000750], embedding loss [ 0.8229], quantization loss [ 0.0095], 1.50 sec/batch.
2022-10-20 20:31:02,407 step [ 764], lr [0.0000750], embedding loss [ 0.8204], quantization loss [ 0.0102], 1.53 sec/batch.
2022-10-20 20:31:05,330 step [ 765], lr [0.0000750], embedding loss [ 0.8113], quantization loss [ 0.0096], 1.52 sec/batch.
2022-10-20 20:31:08,366 step [ 766], lr [0.0000750], embedding loss [ 0.8140], quantization loss [ 0.0095], 1.56 sec/batch.
2022-10-20 20:31:11,363 step [ 767], lr [0.0000750], embedding loss [ 0.8159], quantization loss [ 0.0101], 1.56 sec/batch.
2022-10-20 20:31:14,279 step [ 768], lr [0.0000750], embedding loss [ 0.8220], quantization loss [ 0.0106], 1.52 sec/batch.
2022-10-20 20:31:17,244 step [ 769], lr [0.0000750], embedding loss [ 0.8151], quantization loss [ 0.0105], 1.55 sec/batch.
2022-10-20 20:31:20,181 step [ 770], lr [0.0000750], embedding loss [ 0.8236], quantization loss [ 0.0101], 1.53 sec/batch.
2022-10-20 20:31:23,108 step [ 771], lr [0.0000750], embedding loss [ 0.8163], quantization loss [ 0.0096], 1.51 sec/batch.
2022-10-20 20:31:26,102 step [ 772], lr [0.0000750], embedding loss [ 0.8168], quantization loss [ 0.0103], 1.58 sec/batch.
2022-10-20 20:31:29,054 step [ 773], lr [0.0000750], embedding loss [ 0.8247], quantization loss [ 0.0092], 1.53 sec/batch.
2022-10-20 20:31:32,028 step [ 774], lr [0.0000750], embedding loss [ 0.8278], quantization loss [ 0.0114], 1.58 sec/batch.
2022-10-20 20:31:34,980 step [ 775], lr [0.0000750], embedding loss [ 0.8143], quantization loss [ 0.0104], 1.54 sec/batch.
2022-10-20 20:31:37,985 step [ 776], lr [0.0000750], embedding loss [ 0.8177], quantization loss [ 0.0106], 1.60 sec/batch.
2022-10-20 20:31:40,934 step [ 777], lr [0.0000750], embedding loss [ 0.8255], quantization loss [ 0.0103], 1.53 sec/batch.
2022-10-20 20:31:43,920 step [ 778], lr [0.0000750], embedding loss [ 0.8189], quantization loss [ 0.0096], 1.58 sec/batch.
2022-10-20 20:31:46,874 step [ 779], lr [0.0000750], embedding loss [ 0.8127], quantization loss [ 0.0099], 1.54 sec/batch.
2022-10-20 20:31:49,870 step [ 780], lr [0.0000750], embedding loss [ 0.8219], quantization loss [ 0.0096], 1.59 sec/batch.
2022-10-20 20:31:52,927 step [ 781], lr [0.0000750], embedding loss [ 0.8190], quantization loss [ 0.0089], 1.57 sec/batch.
2022-10-20 20:31:55,927 step [ 782], lr [0.0000750], embedding loss [ 0.8120], quantization loss [ 0.0100], 1.57 sec/batch.
2022-10-20 20:31:58,962 step [ 783], lr [0.0000750], embedding loss [ 0.8165], quantization loss [ 0.0098], 1.55 sec/batch.
2022-10-20 20:32:01,984 step [ 784], lr [0.0000750], embedding loss [ 0.8309], quantization loss [ 0.0101], 1.59 sec/batch.
2022-10-20 20:32:04,977 step [ 785], lr [0.0000750], embedding loss [ 0.8288], quantization loss [ 0.0100], 1.53 sec/batch.
2022-10-20 20:32:07,973 step [ 786], lr [0.0000750], embedding loss [ 0.8213], quantization loss [ 0.0102], 1.57 sec/batch.
2022-10-20 20:32:10,950 step [ 787], lr [0.0000750], embedding loss [ 0.8189], quantization loss [ 0.0111], 1.54 sec/batch.
2022-10-20 20:32:13,947 step [ 788], lr [0.0000750], embedding loss [ 0.8213], quantization loss [ 0.0101], 1.58 sec/batch.
2022-10-20 20:32:16,916 step [ 789], lr [0.0000750], embedding loss [ 0.8173], quantization loss [ 0.0106], 1.54 sec/batch.
2022-10-20 20:32:19,908 step [ 790], lr [0.0000750], embedding loss [ 0.8196], quantization loss [ 0.0088], 1.58 sec/batch.
2022-10-20 20:32:22,938 step [ 791], lr [0.0000750], embedding loss [ 0.8196], quantization loss [ 0.0099], 1.52 sec/batch.
2022-10-20 20:32:25,919 step [ 792], lr [0.0000750], embedding loss [ 0.8243], quantization loss [ 0.0101], 1.58 sec/batch.
2022-10-20 20:32:29,215 step [ 793], lr [0.0000750], embedding loss [ 0.8184], quantization loss [ 0.0092], 1.88 sec/batch.
2022-10-20 20:32:32,145 step [ 794], lr [0.0000750], embedding loss [ 0.8313], quantization loss [ 0.0105], 1.53 sec/batch.
2022-10-20 20:32:35,098 step [ 795], lr [0.0000750], embedding loss [ 0.8176], quantization loss [ 0.0100], 1.54 sec/batch.
2022-10-20 20:32:38,083 step [ 796], lr [0.0000750], embedding loss [ 0.8126], quantization loss [ 0.0101], 1.57 sec/batch.
2022-10-20 20:32:41,060 step [ 797], lr [0.0000750], embedding loss [ 0.8264], quantization loss [ 0.0103], 1.55 sec/batch.
2022-10-20 20:32:44,044 step [ 798], lr [0.0000750], embedding loss [ 0.8158], quantization loss [ 0.0103], 1.56 sec/batch.
2022-10-20 20:32:47,022 step [ 799], lr [0.0000750], embedding loss [ 0.8250], quantization loss [ 0.0098], 1.52 sec/batch.
2022-10-20 20:32:50,004 step [ 800], lr [0.0000750], embedding loss [ 0.8114], quantization loss [ 0.0100], 1.57 sec/batch.
2022-10-20 20:32:50,005 finish training iterations and begin saving model.
2022-10-20 20:32:56,698 finish model saving.
2022-10-20 20:32:56,698 finish training, model saved under ./checkpoints/flickr_WSDQH_nbits=16_adaMargin_gamma=1_lambda=0.0001_0002.npy.
2022-10-20 20:33:00,230 prepare dataset.
2022-10-20 20:33:00,899 prepare data loader.
2022-10-20 20:33:00,899 Initializing DataLoader.
2022-10-20 20:33:00,899 DataLoader already.
2022-10-20 20:33:00,900 Initializing DataLoader.
2022-10-20 20:33:00,900 DataLoader already.
2022-10-20 20:33:00,900 prepare model.
2022-10-20 20:33:01,127 Number of semantic embeddings: 1178.
2022-10-20 20:33:18,362 begin validation.
2022-10-20 20:33:40,937 finish query feature extraction, duration: 22.58 sec.
2022-10-20 20:37:13,447 finish database feature extraction, duration: 212.51 sec.
2022-10-20 20:37:13,447 compute quantization codes for query.
2022-10-20 20:37:14,708 number of update_code wrong: 0.
2022-10-20 20:37:14,708 finish query encoding, duration: 1.26 sec.
2022-10-20 20:37:14,709 compute quantization codes for database.
2022-10-20 20:37:19,462 number of update_code wrong: 0.
2022-10-20 20:37:19,462 finish database encoding, duration: 4.75 sec.
2022-10-20 20:37:19,462 save retrieval information: codes, features, reconstructions of queries and database.
2022-10-20 20:37:20,506 begin to calculate MAP@5000.
2022-10-20 20:37:20,506 begin to calculate AQD mAP@5000.
2022-10-20 20:37:24,069 AQD mAP@5000 = [0.7548], duration: 3.56 sec.
2022-10-20 20:37:24,069 begin to calculate SQD mAP@5000.
2022-10-20 20:37:27,494 SQD mAP@5000 = [0.7531], duration: 3.42 sec.
2022-10-20 20:37:27,494 begin to calculate feats mAP@5000.
2022-10-20 20:37:30,952 feats mAP@5000 = [0.7564], duration: 3.46 sec.
2022-10-20 20:37:30,953 finish validation.