(vitis-ai-pytorch) Vitis-AI /workspace/storagex-cls-v0.1.0 > sh run_resnet50_224_qat.sh

[VAIQ_NOTE]: Loading NNDCT kernels... (repeated once per worker process)
2022-11-07 02:43:48.457 | INFO | playcls.core.launch:_distributed_worker:116 - Rank 1 initialization finished.
2022-11-07 02:43:48.498 | INFO | playcls.core.launch:_distributed_worker:116 - Rank 0 initialization finished.
/workspace/storagex-cls-v0.1.0/playcls/quantize/QAT.py:74: UserWarning: You have chosen to seed training. This will turn on the CUDNN deterministic setting, which can slow down your training considerably! You may see unexpected behavior when restarting from checkpoints.
2022-11-07 02:43:52.707 | INFO | playcls.utils.setup_env:configure_omp:46 - We set `OMP_NUM_THREADS` for each process to 1 to speed up. Please further tune the variable for optimal performance.
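The UserWarning above is the standard PyTorch-style seeding notice. As a minimal sketch (an assumption about what `playcls/quantize/QAT.py` does around line 74, not its actual code), seeding plus the CUDNN deterministic flag is typically set up like this, which is exactly what triggers that warning and the associated slowdown:

```python
# Hypothetical sketch of a seed_everything() helper (not the playcls source):
# seeding all RNGs and forcing cudnn.deterministic reproduces the logged warning.
import random
import warnings

import torch


def seed_everything(seed: int = 42) -> None:
    random.seed(seed)
    torch.manual_seed(seed)          # seeds CPU (and CUDA, if present) RNGs
    torch.backends.cudnn.deterministic = True   # reproducible, but slower
    torch.backends.cudnn.benchmark = False      # disable autotuner heuristics
    warnings.warn(
        "You have chosen to seed training. This will turn on the CUDNN "
        "deterministic setting, which can slow down your training considerably!"
    )


seed_everything(42)  # SEED: 42 in the config below
```

With `cudnn.deterministic = True`, cuDNN restricts itself to deterministic kernels, which is why the log warns about a possible training slowdown.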
2022-11-07 02:43:52 | INFO | playcls.core.qat_trainer:156 - args: Namespace(batch_size=64, ckpt=None, devices=2, dist_backend='nccl', dist_url=None, machine_rank=0, num_machines=1, opts=[], quant_mode='train', resume=False, start_epoch=None, yml='cfg/resnet/zheta_cls3_resnet50_224_aug_qat.yml')
2022-11-07 02:43:52 | INFO | playcls.core.qat_trainer:157 - exp value:
DATALOADER:
  NUM_WORKERS: 4
DATASETS:
  NAME: ImageNet
  NUM_CLASSES: 3
  PATH: /workspace/data/Silan_FRD_22_09_28/3cls_10v0
EVAL_INTERVAL: 1
INPUT:
  AFFINE:
    ENABLED: False
  AUGMIX:
    ENABLED: False
    PROB: 0.1
  AUTOAUG:
    ENABLED: False
    TYPE: imagenet
  CJ:
    BRIGHTNESS: 0.15
    CONTRAST: 0.15
    ENABLED: False
    HUE: 0.1
    PROB: 0.5
    SATURATION: 0.1
  CROP:
    ENABLED: False
    RATIO: [0.75, 1.3333333333333333]
    SCALE: [0.16, 1]
    SIZE: [224, 224]
  HORIZON_FLIP:
    ENABLED: True
    PROB: 0.5
  IN_CHANS: 3
  PADDING:
    ENABLED: False
    MODE: constant
    SIZE: 10
  REA:
    ENABLED: False
    PROB: 0.5
    VALUE: [0.0, 0.0, 0.0]
  RPT:
    ENABLED: False
    PROB: 0.5
  SIZE_TEST: [224, 224]
  SIZE_TRAIN: [224, 224]
  VERTICAL_FLIP:
    ENABLED: True
    PROB: 0.5
LOSS:
  LABEL_SMOOTH: True
  NAME: ce
  SMOOTH_EPS: 0.1
MODEL:
  AUX_LOGITS: True
  BN_EPS: None
  BN_MOMENTUM: None
  CHANNELS_LAST: False
  DEPTH: 50x
  DEVICE: cuda
  DROP: 0.0
  DROP_BLOCK: None
  DROP_PATH: None
  EMA: True
  FEAT_DIM: 2048
  GRAG_CKPT: False
  INCEPTION_LOSS_WEIGHTS: (1.0, 0.4)
  INITIAL_CKPT:
  LAST_STRIDE: 2
  NAME: resnet
  NORM: BN
  OWN: True
  PIXEL_MEAN: [0.485, 0.465, 0.406]
  PIXEL_STD: [0.229, 0.224, 0.225]
  POOLING: None
  PRETRAIN: True
  PRETRAIN_PATH: /workspace/storagex-cls-v0.1.0/logs/resnet/zheta_cls3_res50_aug_224/best_ckpt.pth
  SYNC_BN: False
  TORCHSCRIPT: False
  TRANSFORM_INPUT: False
OUTPUT_DIR: logs/
PRINT_INTERVAL: 10
SAVE_HISTORY_CKPT: False
SEED: 42
SOLVER:
  ALPHA: 0.99
  AMSGRAD: True
  BASE_LR_PER_IMAGE: 0.001
  BETAS: (0.9, 0.999)
  EPS: 1e-07
  GAMMA: 0.1
  LAYER_DECAY: None
  MAX_EPOCH: 30
  MILESTONES: [30, 60, 90]
  MOMENTUM: 0.9
  NESTEROV: True
  OPT: SGD
  WARMUP_EPOCHS: 5
  WARMUP_LR: 0
  WARMUP_TYPE: warmmultistep
  WEIGHT_DECAY: 0.0001
TRICKS:
  DROP: 0.0
  DROP_BLOCK: None
  DROP_PATH: None
2022-11-07 02:43:53 | INFO | playcls.quant_model.resnet:271 - Loading pretrained model from /workspace/storagex-cls-v0.1.0/logs/resnet/zheta_cls3_res50_aug_224/best_ckpt.pth
2022-11-07 02:43:53 | INFO | playcls.core.qat_trainer:179 - init prefetcher, this might take one minute or less...
[VAIQ_NOTE]: Loading NNDCT kernels... (repeated once per dataloader worker)
[VAIQ_NOTE]: Quant config file is empty, use default quant configuration
[VAIQ_NOTE]: Quantization calibration process start up...
[VAIQ_NOTE]: =>Quant Module is in 'cuda'.
[VAIQ_NOTE]: =>Parsing DistributedDataParallel...
[VAIQ_NOTE]: Start to trace model...
[VAIQ_NOTE]: Finish tracing.
[VAIQ_NOTE]: Processing ops...
[per-op tqdm progress output from both ranks elided: parsing walked every Conv2d/BatchNorm2d/ReLU/Add op of the ResNet Bottleneck blocks in layer1 through layer4 and completed 181/181, ending with OpInfo: name = return_0, type = Return]

[VAIQ_ERROR]: Unsupported Ops: {'_record_function_enter'}
DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[1]/BatchNorm2d[bn1]/input.87, type = batch_nor##############6 | 53/181 [00:00<00:00, 4370.53it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[1]/ReLU[relu1]/input.89, type = relu_] ##############9 | 54/181 [00:00<00:00, 4400.56it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[1]/Conv2d[conv2]/input.91, type = _convolution###############1 | 55/181 [00:00<00:00, 4388.10it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[1]/BatchNorm2d[bn2]/input.93, type = batch_nor###############4 | 56/181 [00:00<00:00, 4388.74it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[1]/ReLU[relu2]/input.95, type = relu_] ###############7 | 57/181 [00:00<00:00, 4419.87it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[1]/Conv2d[conv3]/input.97, type = _convolution################ | 58/181 [00:00<00:00, 4413.53it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[1]/BatchNorm2d[bn3]/8875, type = batch_norm] ################8 | 61/181 [00:00<00:00, 4455.42it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[2]/Conv2d[conv1]/input.103, type = _convolutio#################1 | 62/181 [00:00<00:00, 4446.61it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[2]/BatchNorm2d[bn1]/input.105, type = batch_no#################4 | 63/181 [00:00<00:00, 4447.91it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[2]/ReLU[relu1]/input.107, type = relu_] #################6 | 64/181 [00:00<00:00, 4470.79it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[2]/Conv2d[conv2]/input.109, type = _convolutio#################9 | 65/181 [00:00<00:00, 4452.84it/s, OpInfo: name = 
DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[2]/BatchNorm2d[bn2]/input.111, type = batch_no##################2 | 66/181 [00:00<00:00, 4447.69it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[2]/ReLU[relu2]/input.113, type = relu_] ##################5 | 67/181 [00:00<00:00, 4472.68it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[2]/Conv2d[conv3]/input.115, type = _convolutio##################7 | 68/181 [00:00<00:00, 4463.35it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[2]/BatchNorm2d[bn3]/8955, type = batch_norm] ###################6 | 71/181 [00:00<00:00, 4503.05it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[3]/Conv2d[conv1]/input.121, type = _convolutio###################8 | 72/181 [00:00<00:00, 4493.23it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[3]/BatchNorm2d[bn1]/input.123, type = batch_no####################1 | 73/181 [00:00<00:00, 4481.75it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[3]/ReLU[relu1]/input.125, type = relu_] ####################4 | 74/181 [00:00<00:00, 4502.28it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[3]/Conv2d[conv2]/input.127, type = _convolutio####################7 | 75/181 [00:00<00:00, 4494.99it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[3]/BatchNorm2d[bn2]/input.129, type = batch_no####################9 | 76/181 [00:00<00:00, 4494.24it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[3]/ReLU[relu2]/input.131, type = relu_] #####################2 | 77/181 [00:00<00:00, 4516.63it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[3]/Conv2d[conv3]/input.133, type = _convolutio#####################5 | 
78/181 [00:00<00:00, 4508.08it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[3]/BatchNorm2d[bn3]/9035, type = batch_norm] ######################3 | 81/181 [00:00<00:00, 4544.03it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/Conv2d[conv1]/input.139, type = _convolutio######################6 | 82/181 [00:00<00:00, 4537.19it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/BatchNorm2d[bn1]/input.141, type = batch_no######################9 | 83/181 [00:00<00:00, 4535.56it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/ReLU[relu1]/input.143, type = relu_] #######################2 | 84/181 [00:00<00:00, 4555.19it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/Conv2d[conv2]/input.145, type = _convolutio#######################4 | 85/181 [00:00<00:00, 4542.70it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/BatchNorm2d[bn2]/input.147, type = batch_no#######################7 | 86/181 [00:00<00:00, 4528.92it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/ReLU[relu2]/input.149, type = relu_] ######################## | 87/181 [00:00<00:00, 4546.13it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/Conv2d[conv3]/input.151, type = _convolutio########################3 | 88/181 [00:00<00:00, 4537.56it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/BatchNorm2d[bn3]/9115, type = batch_norm] ########################5 | 89/181 [00:00<00:00, 4534.60it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/Sequential[downsample]/Conv2d[0]/input.153,########################8 | 90/181 [00:00<00:00, 4527.04it/s, OpInfo: name = 
DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/Sequential[downsample]/BatchNorm2d[1]/9140,#########################1 | 91/181 [00:00<00:00, 4525.08it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/Add[add]/input.155, type = add] #########################6 | 93/181 [00:00<00:00, 4557.32it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/Conv2d[conv1]/input.159, type = _convolutio#########################9 | 94/181 [00:00<00:00, 4550.50it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/BatchNorm2d[bn1]/input.161, type = batch_no##########################2 | 95/181 [00:00<00:00, 4545.92it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/ReLU[relu1]/input.163, type = relu_] ##########################5 | 96/181 [00:00<00:00, 4563.06it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/Conv2d[conv2]/input.165, type = _convolutio##########################7 | 97/181 [00:00<00:00, 4555.25it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/BatchNorm2d[bn2]/input.167, type = batch_no########################### | 98/181 [00:00<00:00, 4544.21it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/ReLU[relu2]/input.169, type = relu_] ###########################3 | 99/181 [00:00<00:00, 4559.13it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/Conv2d[conv3]/input.171, type = _convolutio###########################6 | 100/181 [00:00<00:00, 4549.48it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/BatchNorm2d[bn3]/9220, type = batch_norm] ############################4 | 103/181 [00:00<00:00, 4574.76it/s, OpInfo: name = 
DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/Conv2d[conv1]/input.177, type = _convoluti############################7 | 104/181 [00:00<00:00, 4567.67it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/BatchNorm2d[bn1]/input.179, type = batch_n############################# | 105/181 [00:00<00:00, 4565.45it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/ReLU[relu1]/input.181, type = relu_] #############################2 | 106/181 [00:00<00:00, 4580.87it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/Conv2d[conv2]/input.183, type = _convoluti#############################5 | 107/181 [00:00<00:00, 4573.29it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/BatchNorm2d[bn2]/input.185, type = batch_n#############################8 | 108/181 [00:00<00:00, 4569.88it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/ReLU[relu2]/input.187, type = relu_] ##############################1 | 109/181 [00:00<00:00, 4583.98it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/Conv2d[conv3]/input.189, type = _convoluti##############################3 | 110/181 [00:00<00:00, 4577.98it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/BatchNorm2d[bn3]/9300, type = batch_norm] ###############################2 | 113/181 [00:00<00:00, 4593.89it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/Conv2d[conv1]/input.195, type = _convoluti###############################4 | 114/181 [00:00<00:00, 4587.37it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/BatchNorm2d[bn1]/input.197, type = batch_n###############################7 | 115/181 [00:00<00:00, 4585.68it/s, OpInfo: name = 
DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/ReLU[relu1]/input.199, type = relu_] ################################ | 116/181 [00:00<00:00, 4599.02it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/Conv2d[conv2]/input.201, type = _convoluti################################3 | 117/181 [00:00<00:00, 4592.52it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/BatchNorm2d[bn2]/input.203, type = batch_n################################5 | 118/181 [00:00<00:00, 4590.27it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/ReLU[relu2]/input.205, type = relu_] ################################8 | 119/181 [00:00<00:00, 4603.47it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/Conv2d[conv3]/input.207, type = _convoluti#################################1 | 120/181 [00:00<00:00, 4596.50it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/BatchNorm2d[bn3]/9380, type = batch_norm] #################################9 | 123/181 [00:00<00:00, 4614.90it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/Conv2d[conv1]/input.213, type = _convoluti##################################2 | 124/181 [00:00<00:00, 4601.78it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/BatchNorm2d[bn1]/input.215, type = batch_n##################################5 | 125/181 [00:00<00:00, 4597.04it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/ReLU[relu1]/input.217, type = relu_] ##################################8 | 126/181 [00:00<00:00, 4608.92it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/Conv2d[conv2]/input.219, type = _convoluti################################### | 127/181 [00:00<00:00, 4603.47it/s, OpInfo: name = 
DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/BatchNorm2d[bn2]/input.221, type = batch_n###################################3 | 128/181 [00:00<00:00, 4600.87it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/ReLU[relu2]/input.223, type = relu_] ###################################6 | 129/181 [00:00<00:00, 4613.21it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/Conv2d[conv3]/input.225, type = _convoluti###################################9 | 130/181 [00:00<00:00, 4603.99it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/BatchNorm2d[bn3]/9460, type = batch_norm] ####################################7 | 133/181 [00:00<00:00, 4623.34it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/Conv2d[conv1]/input.231, type = _convoluti##################################### | 134/181 [00:00<00:00, 4617.57it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/BatchNorm2d[bn1]/input.233, type = batch_n#####################################2 | 135/181 [00:00<00:00, 4614.76it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/ReLU[relu1]/input.235, type = relu_] #####################################5 | 136/181 [00:00<00:00, 4626.10it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/Conv2d[conv2]/input.237, type = _convoluti#####################################8 | 137/181 [00:00<00:00, 4613.97it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/BatchNorm2d[bn2]/input.239, type = batch_n######################################1 | 138/181 [00:00<00:00, 4610.37it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/ReLU[relu2]/input.241, type = relu_] ######################################3 | 139/181 
[00:00<00:00, 4621.25it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/Conv2d[conv3]/input.243, type = _convoluti######################################6 | 140/181 [00:00<00:00, 4616.05it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/BatchNorm2d[bn3]/9540, type = batch_norm] #######################################5 | 143/181 [00:00<00:00, 4630.90it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/Conv2d[conv1]/input.249, type = _convoluti#######################################7 | 144/181 [00:00<00:00, 4626.21it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/BatchNorm2d[bn1]/input.251, type = batch_n######################################## | 145/181 [00:00<00:00, 4623.53it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/ReLU[relu1]/input.253, type = relu_] ########################################3 | 146/181 [00:00<00:00, 4634.31it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/Conv2d[conv2]/input.255, type = _convoluti########################################6 | 147/181 [00:00<00:00, 4629.09it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/BatchNorm2d[bn2]/input.257, type = batch_n########################################8 | 148/181 [00:00<00:00, 4626.65it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/ReLU[relu2]/input.259, type = relu_] #########################################1 | 149/181 [00:00<00:00, 4635.72it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/Conv2d[conv3]/input.261, type = _convoluti#########################################4 | 150/181 [00:00<00:00, 4624.68it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/BatchNorm2d[bn3]/9620, 
type = batch_norm] #########################################7 | 151/181 [00:00<00:00, 4620.69it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/Sequential[downsample]/Conv2d[0]/input.263#########################################9 | 152/181 [00:00<00:00, 4613.96it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/Sequential[downsample]/BatchNorm2d[1]/9645##########################################2 | 153/181 [00:00<00:00, 4611.71it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/Add[add]/input.265, type = add] ##########################################8 | 155/181 [00:00<00:00, 4630.63it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/Conv2d[conv1]/input.269, type = _convoluti########################################### | 156/181 [00:00<00:00, 4626.30it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/BatchNorm2d[bn1]/input.271, type = batch_n###########################################3 | 157/181 [00:00<00:00, 4623.79it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/ReLU[relu1]/input.273, type = relu_] ###########################################6 | 158/181 [00:00<00:00, 4633.49it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/Conv2d[conv2]/input.275, type = _convoluti###########################################9 | 159/181 [00:00<00:00, 4628.57it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/BatchNorm2d[bn2]/input.277, type = batch_n############################################1 | 160/181 [00:00<00:00, 4624.37it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/ReLU[relu2]/input.279, type = relu_] ############################################4 | 161/181 [00:00<00:00, 4632.33it/s, OpInfo: name = 
DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/Conv2d[conv3]/input.281, type = _convoluti############################################7 | 162/181 [00:00<00:00, 4550.14it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/BatchNorm2d[bn3]/9725, type = batch_norm] #############################################5 | 165/181 [00:00<00:00, 4558.28it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[2]/Conv2d[conv1]/input.287, type = _convoluti#############################################8 | 166/181 [00:00<00:00, 4554.02it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[2]/BatchNorm2d[bn1]/input.289, type = batch_n##############################################1 | 167/181 [00:00<00:00, 4552.33it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[2]/ReLU[relu1]/input.291, type = relu_] ##############################################4 | 168/181 [00:00<00:00, 4561.80it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[2]/Conv2d[conv2]/input.293, type = _convoluti##############################################6 | 169/181 [00:00<00:00, 4558.09it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[2]/BatchNorm2d[bn2]/input.295, type = batch_n##############################################9 | 170/181 [00:00<00:00, 4555.50it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[2]/ReLU[relu2]/input.297, type = relu_] ###############################################2 | 171/181 [00:00<00:00, 4564.34it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[2]/Conv2d[conv3]/input.299, type = _convoluti###############################################5 | 172/181 [00:00<00:00, 4560.12it/s, OpInfo: name = 
DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[2]/BatchNorm2d[bn3]/9805, type = batch_norm] ##################################################| 181/181 [00:00<00:00, 4480.64it/s, OpInfo: name = return_0, type = Return] [VAIQ_ERROR]: Unsupported Ops: {'_record_function_enter'} Traceback (most recent call last): File "playcls/quantize/QAT.py", line 101, in args=(cfg, args), File "./playcls/core/launch.py", line 95, in launch start_method=start_method, File "/opt/vitis_ai/conda/envs/vitis-ai-pytorch/lib/python3.7/site-packages/torch/multiprocessing/spawn.py", line 188, in start_processes while not context.join(): File "/opt/vitis_ai/conda/envs/vitis-ai-pytorch/lib/python3.7/site-packages/torch/multiprocessing/spawn.py", line 144, in join exit_code=exitcode torch.multiprocessing.spawn.ProcessExitedException: process 0 terminated with exit code 1 [VAIQ_NOTE]: Loading NNDCT kernels... [VAIQ_NOTE]: Loading NNDCT kernels... 2022-11-07 02:44:09.428 | INFO | playcls.core.launch:_distributed_worker:116 - Rank 0 initialization finished. [VAIQ_NOTE]: Loading NNDCT kernels... 2022-11-07 02:44:09.448 | INFO | playcls.core.launch:_distributed_worker:116 - Rank 1 initialization finished. /workspace/storagex-cls-v0.1.0/playcls/quantize/QAT.py:74: UserWarning: You have chosen to seed training. This will turn on the CUDNN deterministic setting, which can slow down your training considerably! You may see unexpected behavior when restarting from checkpoints. "You have chosen to seed training. This will turn on the CUDNN deterministic setting, " 2022-11-07 02:44:12.639 | INFO | playcls.utils.setup_env:configure_omp:46 - *************************************************************** We set `OMP_NUM_THREADS` for each process to 1 to speed up. please further tune the variable for optimal performance. 
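A note on the `[VAIQ_ERROR]: Unsupported Ops: {'_record_function_enter'}` failure above: `_record_function_enter` is the TorchScript op emitted by `torch.autograd.profiler.record_function`, which `DistributedDataParallel.forward` uses internally in recent PyTorch releases. Tracing the DDP-wrapped model (the log shows `=>Parsing DistributedDataParallel...`) therefore puts a profiler op into the graph that the NNDCT quantizer cannot handle. A minimal sketch of one possible workaround, assuming QAT.py currently hands the quantizer the DDP wrapper (the `unwrap_ddp` helper below is hypothetical, not part of the codebase):

```python
# Hypothetical workaround sketch: pass the bare nn.Module to the NNDCT
# quantizer instead of the DistributedDataParallel wrapper, so the tracer
# never encounters the profiler's _record_function_enter op that
# DDP.forward emits via torch.autograd.profiler.record_function.

def unwrap_ddp(model):
    """Return model.module when model is a DDP-style wrapper, else model."""
    # DDP (and similar wrappers) expose the wrapped network as .module;
    # plain modules fall through unchanged.
    return getattr(model, "module", model)
```

With this in place, the quantizer would be constructed from `unwrap_ddp(model)` rather than `model` (the exact call site in QAT.py is an assumption here). Alternatively, building the quantized model first and wrapping it in DDP afterwards avoids tracing the wrapper entirely.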
***************************************************************
2022-11-07 02:44:12 | INFO | playcls.core.qat_trainer:156 - args: Namespace(batch_size=64, ckpt=None, devices=2, dist_backend='nccl', dist_url=None, machine_rank=0, num_machines=1, opts=[], quant_mode='deploy', resume=False, start_epoch=None, yml='cfg/resnet/zheta_cls3_resnet50_224_aug_qat.yml')
2022-11-07 02:44:12 | INFO | playcls.core.qat_trainer:157 - exp value:
[... configuration dump identical to the 'train'-mode dump above, elided ...]
2022-11-07 02:44:13 | INFO | playcls.quant_model.resnet:271 - Loading pretrained model from /workspace/storagex-cls-v0.1.0/logs/resnet/zheta_cls3_res50_aug_224/best_ckpt.pth
2022-11-07 02:44:13 | INFO | playcls.core.qat_trainer:179 - init prefetcher, this might take one minute or less...
[VAIQ_NOTE]: Loading NNDCT kernels... (repeated 8x, one per worker)
[VAIQ_NOTE]: Quant config file is empty, use default quant configuration
[VAIQ_NOTE]: Quantization calibration process start up...
[VAIQ_NOTE]: =>Quant Module is in 'cuda'.
[VAIQ_NOTE]: Quant config file is empty, use default quant configuration
[VAIQ_NOTE]: =>Parsing DistributedDataParallel...
[VAIQ_NOTE]: Quantization calibration process start up...
[VAIQ_NOTE]: =>Quant Module is in 'cuda'.
[VAIQ_NOTE]: =>Parsing DistributedDataParallel...
[VAIQ_NOTE]: Start to trace model...
[VAIQ_NOTE]: Start to trace model...
[VAIQ_NOTE]: Finish tracing.
[VAIQ_NOTE]: Finish tracing.
[VAIQ_NOTE]: Processing ops...
[... second NNDCT op-parsing pass (deploy mode): tqdm progress over the same 181 graph ops, again interleaved across ranks; output truncated here ...]
name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[2]/Conv2d[conv1]/input.103, type = _convolutio#################1 | 62/181 [00:00<00:00, 4737.86it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[2]/BatchNorm2d[bn1]/input.105, type = batch_no#################4 | 63/181 [00:00<00:00, 4722.56it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[2]/ReLU[relu1]/input.107, type = relu_] #################6 | 64/181 [00:00<00:00, 4719.74it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[2]/Conv2d[conv2]/input.109, type = _convolutio#################9 | 65/181 [00:00<00:00, 4645.73it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[2]/BatchNorm2d[bn2]/input.111, type = batch_no##################2 | 66/181 [00:00<00:00, 4586.07it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[2]/ReLU[relu2]/input.113, type = relu_] ##################5 | 67/181 [00:00<00:00, 4577.67it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[2]/Conv2d[conv3]/input.115, type = _convolutio##################7 | 68/181 [00:00<00:00, 4511.00it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[2]/BatchNorm2d[bn3]/8955, type = batch_norm] ###################6 | 71/181 [00:00<00:00, 4519.86it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[3]/Conv2d[conv1]/input.121, type = _convolutio###################8 | 72/181 [00:00<00:00, 4508.25it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[3]/BatchNorm2d[bn1]/input.123, type = batch_no####################1 | 73/181 [00:00<00:00, 4481.16it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[3]/ReLU[relu1]/input.125, type = relu_] ####################4 | 74/181 
[00:00<00:00, 4502.15it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[3]/Conv2d[conv2]/input.127, type = _convolutio####################7 | 75/181 [00:00<00:00, 4498.65it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[3]/BatchNorm2d[bn2]/input.129, type = batch_no####################9 | 76/181 [00:00<00:00, 4500.83it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[3]/ReLU[relu2]/input.131, type = relu_] #####################2 | 77/181 [00:00<00:00, 4524.16it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[3]/Conv2d[conv3]/input.133, type = _convolutio#####################5 | 78/181 [00:00<00:00, 4517.73it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[3]/BatchNorm2d[bn3]/9035, type = batch_norm] ######################3 | 81/181 [00:00<00:00, 4563.19it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/Conv2d[conv1]/input.139, type = _convolutio######################6 | 82/181 [00:00<00:00, 4559.33it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/BatchNorm2d[bn1]/input.141, type = batch_no######################9 | 83/181 [00:00<00:00, 4560.04it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/ReLU[relu1]/input.143, type = relu_] #######################2 | 84/181 [00:00<00:00, 4582.15it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/Conv2d[conv2]/input.145, type = _convolutio#######################4 | 85/181 [00:00<00:00, 4575.00it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/BatchNorm2d[bn2]/input.147, type = batch_no#######################7 | 86/181 [00:00<00:00, 4560.76it/s, OpInfo: name = 
DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/ReLU[relu2]/input.149, type = relu_] ######################## | 87/181 [00:00<00:00, 4577.44it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/Conv2d[conv3]/input.151, type = _convolutio########################3 | 88/181 [00:00<00:00, 4569.18it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/BatchNorm2d[bn3]/9115, type = batch_norm] ########################5 | 89/181 [00:00<00:00, 4566.95it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/Sequential[downsample]/Conv2d[0]/input.153,########################8 | 90/181 [00:00<00:00, 4559.36it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/Sequential[downsample]/BatchNorm2d[1]/9140,#########################1 | 91/181 [00:00<00:00, 4556.58it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/Add[add]/input.155, type = add] #########################6 | 93/181 [00:00<00:00, 4588.85it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/Conv2d[conv1]/input.159, type = _convolutio#########################9 | 94/181 [00:00<00:00, 4584.10it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/BatchNorm2d[bn1]/input.161, type = batch_no##########################2 | 95/181 [00:00<00:00, 4580.15it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/ReLU[relu1]/input.163, type = relu_] ##########################5 | 96/181 [00:00<00:00, 4596.71it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/Conv2d[conv2]/input.165, type = _convolutio##########################7 | 97/181 [00:00<00:00, 4586.57it/s, OpInfo: name = 
DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/BatchNorm2d[bn2]/input.167, type = batch_no########################### | 98/181 [00:00<00:00, 4572.06it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/ReLU[relu2]/input.169, type = relu_] ###########################3 | 99/181 [00:00<00:00, 4585.25it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/Conv2d[conv3]/input.171, type = _convolutio###########################6 | 100/181 [00:00<00:00, 4575.29it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/BatchNorm2d[bn3]/9220, type = batch_norm] ############################4 | 103/181 [00:00<00:00, 4597.55it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/Conv2d[conv1]/input.177, type = _convoluti############################7 | 104/181 [00:00<00:00, 4590.74it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/BatchNorm2d[bn1]/input.179, type = batch_n############################# | 105/181 [00:00<00:00, 4586.61it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/ReLU[relu1]/input.181, type = relu_] #############################2 | 106/181 [00:00<00:00, 4601.40it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/Conv2d[conv2]/input.183, type = _convoluti#############################5 | 107/181 [00:00<00:00, 4592.80it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/BatchNorm2d[bn2]/input.185, type = batch_n#############################8 | 108/181 [00:00<00:00, 4587.79it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/ReLU[relu2]/input.187, type = relu_] ##############################1 | 109/181 [00:00<00:00, 4600.68it/s, OpInfo: name = 
DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/Conv2d[conv3]/input.189, type = _convoluti##############################3 | 110/181 [00:00<00:00, 4591.97it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/BatchNorm2d[bn3]/9300, type = batch_norm] ###############################2 | 113/181 [00:00<00:00, 4602.81it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/Conv2d[conv1]/input.195, type = _convoluti###############################4 | 114/181 [00:00<00:00, 4594.38it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/BatchNorm2d[bn1]/input.197, type = batch_n###############################7 | 115/181 [00:00<00:00, 4591.36it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/ReLU[relu1]/input.199, type = relu_] ################################ | 116/181 [00:00<00:00, 4603.11it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/Conv2d[conv2]/input.201, type = _convoluti################################3 | 117/181 [00:00<00:00, 4596.78it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/BatchNorm2d[bn2]/input.203, type = batch_n################################5 | 118/181 [00:00<00:00, 4594.75it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/ReLU[relu2]/input.205, type = relu_] ################################8 | 119/181 [00:00<00:00, 4608.70it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/Conv2d[conv3]/input.207, type = _convoluti#################################1 | 120/181 [00:00<00:00, 4602.55it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/BatchNorm2d[bn3]/9380, type = batch_norm] #################################9 | 123/181 [00:00<00:00, 4623.87it/s, OpInfo: name = 
DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/Conv2d[conv1]/input.213, type = _convoluti##################################2 | 124/181 [00:00<00:00, 4611.58it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/BatchNorm2d[bn1]/input.215, type = batch_n##################################5 | 125/181 [00:00<00:00, 4607.51it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/ReLU[relu1]/input.217, type = relu_] ##################################8 | 126/181 [00:00<00:00, 4620.29it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/Conv2d[conv2]/input.219, type = _convoluti################################### | 127/181 [00:00<00:00, 4615.56it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/BatchNorm2d[bn2]/input.221, type = batch_n###################################3 | 128/181 [00:00<00:00, 4612.73it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/ReLU[relu2]/input.223, type = relu_] ###################################6 | 129/181 [00:00<00:00, 4625.36it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/Conv2d[conv3]/input.225, type = _convoluti###################################9 | 130/181 [00:00<00:00, 4620.26it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/BatchNorm2d[bn3]/9460, type = batch_norm] ####################################7 | 133/181 [00:00<00:00, 4639.37it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/Conv2d[conv1]/input.231, type = _convoluti##################################### | 134/181 [00:00<00:00, 4633.63it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/BatchNorm2d[bn1]/input.233, type = batch_n#####################################2 | 135/181 [00:00<00:00, 
4628.15it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/ReLU[relu1]/input.235, type = relu_] #####################################5 | 136/181 [00:00<00:00, 4638.89it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/Conv2d[conv2]/input.237, type = _convoluti#####################################8 | 137/181 [00:00<00:00, 4627.35it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/BatchNorm2d[bn2]/input.239, type = batch_n######################################1 | 138/181 [00:00<00:00, 4623.74it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/ReLU[relu2]/input.241, type = relu_] ######################################3 | 139/181 [00:00<00:00, 4634.77it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/Conv2d[conv3]/input.243, type = _convoluti######################################6 | 140/181 [00:00<00:00, 4629.88it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/BatchNorm2d[bn3]/9540, type = batch_norm] #######################################5 | 143/181 [00:00<00:00, 4648.49it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/Conv2d[conv1]/input.249, type = _convoluti#######################################7 | 144/181 [00:00<00:00, 4643.93it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/BatchNorm2d[bn1]/input.251, type = batch_n######################################## | 145/181 [00:00<00:00, 4641.49it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/ReLU[relu1]/input.253, type = relu_] ########################################3 | 146/181 [00:00<00:00, 4652.94it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/Conv2d[conv2]/input.255, type = 
_convoluti########################################6 | 147/181 [00:00<00:00, 4648.15it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/BatchNorm2d[bn2]/input.257, type = batch_n########################################8 | 148/181 [00:00<00:00, 4645.62it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/ReLU[relu2]/input.259, type = relu_] #########################################1 | 149/181 [00:00<00:00, 4656.69it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/Conv2d[conv3]/input.261, type = _convoluti#########################################4 | 150/181 [00:00<00:00, 4645.75it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/BatchNorm2d[bn3]/9620, type = batch_norm] #########################################7 | 151/181 [00:00<00:00, 4641.62it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/Sequential[downsample]/Conv2d[0]/input.263#########################################9 | 152/181 [00:00<00:00, 4635.37it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/Sequential[downsample]/BatchNorm2d[1]/9645##########################################2 | 153/181 [00:00<00:00, 4633.15it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/Add[add]/input.265, type = add] ##########################################8 | 155/181 [00:00<00:00, 4650.94it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/Conv2d[conv1]/input.269, type = _convoluti########################################### | 156/181 [00:00<00:00, 4646.27it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/BatchNorm2d[bn1]/input.271, type = batch_n###########################################3 | 157/181 [00:00<00:00, 4643.84it/s, OpInfo: name = 
DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/ReLU[relu1]/input.273, type = relu_] ###########################################6 | 158/181 [00:00<00:00, 4654.35it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/Conv2d[conv2]/input.275, type = _convoluti###########################################9 | 159/181 [00:00<00:00, 4649.68it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/BatchNorm2d[bn2]/input.277, type = batch_n############################################1 | 160/181 [00:00<00:00, 4647.11it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/ReLU[relu2]/input.279, type = relu_] ############################################4 | 161/181 [00:00<00:00, 4656.77it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/Conv2d[conv3]/input.281, type = _convoluti############################################7 | 162/181 [00:00<00:00, 4587.13it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/BatchNorm2d[bn3]/9725, type = batch_norm] #############################################5 | 165/181 [00:00<00:00, 4594.68it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[2]/Conv2d[conv1]/input.287, type = _convoluti#############################################8 | 166/181 [00:00<00:00, 4590.41it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[2]/BatchNorm2d[bn1]/input.289, type = batch_n##############################################1 | 167/181 [00:00<00:00, 4587.90it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[2]/ReLU[relu1]/input.291, type = relu_] ##############################################4 | 168/181 [00:00<00:00, 4597.70it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[2]/Conv2d[conv2]/input.293, type 
= _convoluti##############################################6 | 169/181 [00:00<00:00, 4593.83it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[2]/BatchNorm2d[bn2]/input.295, type = batch_n##############################################9 | 170/181 [00:00<00:00, 4591.02it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[2]/ReLU[relu2]/input.297, type = relu_] ###############################################2 | 171/181 [00:00<00:00, 4598.72it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[2]/Conv2d[conv3]/input.299, type = _convoluti###############################################5 | 172/181 [00:00<00:00, 4593.75it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[2]/BatchNorm2d[bn3]/9805, type = batch_norm] ##################################################| 181/181 [00:00<00:00, 4517.52it/s, OpInfo: name = return_0, type = Return] [VAIQ_ERROR]: Unsupported Ops: {'_record_function_enter'} [VAIQ_NOTE]: Processing ops... 
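Editor's note on the error above: NNDCT builds its quantization graph by tracing the model with `torch.jit.trace`, and a `torch.autograd.profiler.record_function(...)` context executed inside the model's forward pass is captured as a `profiler::_record_function_enter*` node, which the quantizer then reports as an unsupported op. A minimal sketch of this failure mode and a guard against it — the `Head` module and its `profiling` flag are hypothetical illustrations, not code from this repository:

```python
import torch
from torch.autograd.profiler import record_function

class Head(torch.nn.Module):
    """Toy module whose forward optionally wraps work in record_function;
    torch.jit.trace captures that context as a profiler graph op."""
    def __init__(self, profiling: bool = True):
        super().__init__()
        self.profiling = profiling
        self.fc = torch.nn.Linear(4, 2)

    def forward(self, x):
        if self.profiling:
            # This scope becomes a profiler::_record_function_enter* node
            # in the traced graph -- the op NNDCT rejects.
            with record_function("head_forward"):
                return self.fc(x)
        return self.fc(x)

x = torch.randn(1, 4)

# Tracing with the profiler scope active may embed the offending op.
dirty = torch.jit.trace(Head(profiling=True), x)
print("_record_function_enter" in str(dirty.graph))

# Disabling profiler scopes before quantization yields a clean graph.
clean = torch.jit.trace(Head(profiling=False), x)
assert "_record_function_enter" not in str(clean.graph)
```

The practical fix this suggests is to bypass any `record_function` (or other profiler) annotations in the model's forward path before handing the model to the quantizer.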
[... tqdm progress elided: the same per-op listing repeated for the "Processing ops..." phase, 8/181 through 97/181, ~3600–4950 it/s; log truncated mid-line ...]
DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/BatchNorm2d[bn2]/input.167, type = batch_no########################### | 98/181 [00:00<00:00, 4933.76it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/ReLU[relu2]/input.169, type = relu_] ###########################3 | 99/181 [00:00<00:00, 4950.00it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/Conv2d[conv3]/input.171, type = _convolutio###########################6 | 100/181 [00:00<00:00, 4938.02it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/BatchNorm2d[bn3]/9220, type = batch_norm] ############################4 | 103/181 [00:00<00:00, 4953.32it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/Conv2d[conv1]/input.177, type = _convoluti############################7 | 104/181 [00:00<00:00, 4941.74it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/BatchNorm2d[bn1]/input.179, type = batch_n############################# | 105/181 [00:00<00:00, 4933.31it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/ReLU[relu1]/input.181, type = relu_] #############################2 | 106/181 [00:00<00:00, 4948.15it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/Conv2d[conv2]/input.183, type = _convoluti#############################5 | 107/181 [00:00<00:00, 4937.90it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/BatchNorm2d[bn2]/input.185, type = batch_n#############################8 | 108/181 [00:00<00:00, 4931.47it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/ReLU[relu2]/input.187, type = relu_] ##############################1 | 109/181 [00:00<00:00, 4946.06it/s, OpInfo: name = 
DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/Conv2d[conv3]/input.189, type = _convoluti##############################3 | 110/181 [00:00<00:00, 4934.95it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/BatchNorm2d[bn3]/9300, type = batch_norm] ###############################2 | 113/181 [00:00<00:00, 4941.42it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/Conv2d[conv1]/input.195, type = _convoluti###############################4 | 114/181 [00:00<00:00, 4930.05it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/BatchNorm2d[bn1]/input.197, type = batch_n###############################7 | 115/181 [00:00<00:00, 4922.79it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/ReLU[relu1]/input.199, type = relu_] ################################ | 116/181 [00:00<00:00, 4935.93it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/Conv2d[conv2]/input.201, type = _convoluti################################3 | 117/181 [00:00<00:00, 4926.25it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/BatchNorm2d[bn2]/input.203, type = batch_n################################5 | 118/181 [00:00<00:00, 4919.08it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/ReLU[relu2]/input.205, type = relu_] ################################8 | 119/181 [00:00<00:00, 4931.65it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/Conv2d[conv3]/input.207, type = _convoluti#################################1 | 120/181 [00:00<00:00, 4922.84it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/BatchNorm2d[bn3]/9380, type = batch_norm] #################################9 | 123/181 [00:00<00:00, 4939.86it/s, OpInfo: name = 
DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/Conv2d[conv1]/input.213, type = _convoluti##################################2 | 124/181 [00:00<00:00, 4918.61it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/BatchNorm2d[bn1]/input.215, type = batch_n##################################5 | 125/181 [00:00<00:00, 4913.07it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/ReLU[relu1]/input.217, type = relu_] ##################################8 | 126/181 [00:00<00:00, 4924.22it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/Conv2d[conv2]/input.219, type = _convoluti################################### | 127/181 [00:00<00:00, 4917.67it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/BatchNorm2d[bn2]/input.221, type = batch_n###################################3 | 128/181 [00:00<00:00, 4914.29it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/ReLU[relu2]/input.223, type = relu_] ###################################6 | 129/181 [00:00<00:00, 4927.64it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/Conv2d[conv3]/input.225, type = _convoluti###################################9 | 130/181 [00:00<00:00, 4919.03it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/BatchNorm2d[bn3]/9460, type = batch_norm] ####################################7 | 133/181 [00:00<00:00, 4936.88it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/Conv2d[conv1]/input.231, type = _convoluti##################################### | 134/181 [00:00<00:00, 4929.28it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/BatchNorm2d[bn1]/input.233, type = batch_n#####################################2 | 135/181 [00:00<00:00, 
4925.59it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/ReLU[relu1]/input.235, type = relu_] #####################################5 | 136/181 [00:00<00:00, 4937.85it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/Conv2d[conv2]/input.237, type = _convoluti#####################################8 | 137/181 [00:00<00:00, 4923.78it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/BatchNorm2d[bn2]/input.239, type = batch_n######################################1 | 138/181 [00:00<00:00, 4918.17it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/ReLU[relu2]/input.241, type = relu_] ######################################3 | 139/181 [00:00<00:00, 4929.64it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/Conv2d[conv3]/input.243, type = _convoluti######################################6 | 140/181 [00:00<00:00, 4919.51it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/BatchNorm2d[bn3]/9540, type = batch_norm] #######################################5 | 143/181 [00:00<00:00, 4937.36it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/Conv2d[conv1]/input.249, type = _convoluti#######################################7 | 144/181 [00:00<00:00, 4931.25it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/BatchNorm2d[bn1]/input.251, type = batch_n######################################## | 145/181 [00:00<00:00, 4926.88it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/ReLU[relu1]/input.253, type = relu_] ########################################3 | 146/181 [00:00<00:00, 4936.78it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/Conv2d[conv2]/input.255, type = 
_convoluti########################################6 | 147/181 [00:00<00:00, 4929.82it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/BatchNorm2d[bn2]/input.257, type = batch_n########################################8 | 148/181 [00:00<00:00, 4926.33it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/ReLU[relu2]/input.259, type = relu_] #########################################1 | 149/181 [00:00<00:00, 4937.52it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/Conv2d[conv3]/input.261, type = _convoluti#########################################4 | 150/181 [00:00<00:00, 4838.76it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/BatchNorm2d[bn3]/9620, type = batch_norm] #########################################7 | 151/181 [00:00<00:00, 4829.79it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/Sequential[downsample]/Conv2d[0]/input.263#########################################9 | 152/181 [00:00<00:00, 4822.79it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/Sequential[downsample]/BatchNorm2d[1]/9645##########################################2 | 153/181 [00:00<00:00, 4819.16it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/Add[add]/input.265, type = add] ##########################################8 | 155/181 [00:00<00:00, 4840.21it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/Conv2d[conv1]/input.269, type = _convoluti########################################### | 156/181 [00:00<00:00, 4835.50it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/BatchNorm2d[bn1]/input.271, type = batch_n###########################################3 | 157/181 [00:00<00:00, 4830.16it/s, OpInfo: name = 
DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/ReLU[relu1]/input.273, type = relu_] ###########################################6 | 158/181 [00:00<00:00, 4839.70it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/Conv2d[conv2]/input.275, type = _convoluti###########################################9 | 159/181 [00:00<00:00, 4834.53it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/BatchNorm2d[bn2]/input.277, type = batch_n############################################1 | 160/181 [00:00<00:00, 4831.49it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/ReLU[relu2]/input.279, type = relu_] ############################################4 | 161/181 [00:00<00:00, 4842.16it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/Conv2d[conv3]/input.281, type = _convoluti############################################7 | 162/181 [00:00<00:00, 4829.64it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/BatchNorm2d[bn3]/9725, type = batch_norm] #############################################5 | 165/181 [00:00<00:00, 4843.85it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[2]/Conv2d[conv1]/input.287, type = _convoluti#############################################8 | 166/181 [00:00<00:00, 4839.30it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[2]/BatchNorm2d[bn1]/input.289, type = batch_n##############################################1 | 167/181 [00:00<00:00, 4836.28it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[2]/ReLU[relu1]/input.291, type = relu_] ##############################################4 | 168/181 [00:00<00:00, 4845.34it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[2]/Conv2d[conv2]/input.293, type 
= _convoluti##############################################6 | 169/181 [00:00<00:00, 4840.17it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[2]/BatchNorm2d[bn2]/input.295, type = batch_n##############################################9 | 170/181 [00:00<00:00, 4836.01it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[2]/ReLU[relu2]/input.297, type = relu_] ###############################################2 | 171/181 [00:00<00:00, 4845.60it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[2]/Conv2d[conv3]/input.299, type = _convoluti###############################################5 | 172/181 [00:00<00:00, 4841.65it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[2]/BatchNorm2d[bn3]/9805, type = batch_norm] ##################################################| 181/181 [00:00<00:00, 4745.64it/s, OpInfo: name = return_0, type = Return] [VAIQ_ERROR]: Unsupported Ops: {'_record_function_enter'} Traceback (most recent call last): File "playcls/quantize/QAT.py", line 101, in args=(cfg, args), File "./playcls/core/launch.py", line 95, in launch start_method=start_method, File "/opt/vitis_ai/conda/envs/vitis-ai-pytorch/lib/python3.7/site-packages/torch/multiprocessing/spawn.py", line 188, in start_processes while not context.join(): File "/opt/vitis_ai/conda/envs/vitis-ai-pytorch/lib/python3.7/site-packages/torch/multiprocessing/spawn.py", line 144, in join exit_code=exitcode torch.multiprocessing.spawn.ProcessExitedException: process 1 terminated with exit code 1 [VAIQ_NOTE]: Loading NNDCT kernels... [VAIQ_NOTE]: Loading NNDCT kernels... 2022-11-07 02:44:29.397 | INFO | playcls.core.launch:_distributed_worker:116 - Rank 0 initialization finished. [VAIQ_NOTE]: Loading NNDCT kernels... 2022-11-07 02:44:29.447 | INFO | playcls.core.launch:_distributed_worker:116 - Rank 1 initialization finished. 
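The `_record_function_enter` op that the parser rejects is what PyTorch's tracer emits for a `torch.autograd.profiler.record_function(...)` scope inside the model's `forward`; the NNDCT quantizer cannot map it to a deployable op. A common workaround is to make that scope a no-op whenever the model is being traced for quantization. The sketch below is a hypothetical illustration of the pattern: the `QUANTIZING` flag, the `profiled` helper, and the stand-in `forward` are all assumptions, not part of the playcls codebase.

```python
import contextlib

# Hypothetical flag: set when running under the NNDCT quantizer
# (e.g. when quant_mode is 'calib', 'train', or 'test').
QUANTIZING = True

def profiled(name):
    """Return a no-op scope under quantization so the tracer never sees
    profiler ops; outside quantization a real scope such as
    torch.autograd.profiler.record_function(name) could be returned."""
    if QUANTIZING:
        return contextlib.nullcontext()
    raise NotImplementedError("return a real profiler scope here")

def forward(x):
    # Stand-in for a model forward that wrapped part of its work in a
    # record_function scope; under QUANTIZING the scope contributes no ops.
    with profiled("head"):
        return [2 * v for v in x]

print(forward([1, 2, 3]))  # -> [2, 4, 6]
```

With the scope neutralized, the traced graph contains only the convolution/batch-norm/ReLU ops the quantizer supports, and the parse above would proceed past the point where it aborted.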
/workspace/storagex-cls-v0.1.0/playcls/quantize/QAT.py:74: UserWarning: You have chosen to seed training. This will turn on the CUDNN deterministic setting, which can slow down your training considerably! You may see unexpected behavior when restarting from checkpoints.
2022-11-07 02:44:32.618 | INFO | playcls.utils.setup_env:configure_omp:46 - We set `OMP_NUM_THREADS` for each process to 1 to speed up. Please further tune the variable for optimal performance.
2022-11-07 02:44:32 | INFO | playcls.core.qat_trainer:156 - args: Namespace(batch_size=64, ckpt=None, devices=2, dist_backend='nccl', dist_url=None, machine_rank=0, num_machines=1, opts=[], quant_mode='test', resume=False, start_epoch=None, yml='cfg/resnet/zheta_cls3_resnet50_224_aug_qat.yml')
2022-11-07 02:44:32 | INFO | playcls.core.qat_trainer:157 - exp value: (config dump identical to the first run above; the tail not shown there reads: WARMUP_TYPE: warmmultistep  WEIGHT_DECAY: 0.0001  TRICKS: DROP: 0.0  DROP_BLOCK: None  DROP_PATH: None)
2022-11-07 02:44:33 | INFO | playcls.quant_model.resnet:271 - Loading pretrained model from /workspace/storagex-cls-v0.1.0/logs/resnet/zheta_cls3_res50_aug_224/best_ckpt.pth
2022-11-07 02:44:33 | INFO | playcls.core.qat_trainer:179 - init prefetcher, this might take one minute or less...
[VAIQ_NOTE]: Loading NNDCT kernels... (repeated 8x, one per worker process)
[VAIQ_NOTE]: Quant config file is empty, use default quant configuration
[VAIQ_NOTE]: Quantization calibration process start up...
[VAIQ_NOTE]: =>Quant Module is in 'cuda'.
[VAIQ_NOTE]: Quant config file is empty, use default quant configuration
[VAIQ_NOTE]: Quantization calibration process start up...
[VAIQ_NOTE]: =>Quant Module is in 'cuda'.
[VAIQ_NOTE]: =>Parsing DistributedDataParallel... (both ranks)
[VAIQ_NOTE]: Start to trace model... / Finish tracing. (both ranks)
[VAIQ_NOTE]: Processing ops...
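The solver section exposes `BASE_LR_PER_IMAGE: 0.001` rather than an absolute learning rate, which usually implies the linear-scaling convention: the effective LR is the per-image base times the total images per optimizer step. The sketch below assumes that convention; how playcls actually derives its LR is not shown in this log, so `effective_lr` is a hypothetical helper.

```python
# Linear LR-scaling convention that a BASE_LR_PER_IMAGE-style setting
# usually implies (assumption: playcls may compute this differently).
def effective_lr(base_lr_per_image, batch_size_per_gpu, num_gpus):
    """lr = base_lr_per_image * total images per optimizer step."""
    return base_lr_per_image * batch_size_per_gpu * num_gpus

# Values from the run above: batch_size=64, devices=2, BASE_LR_PER_IMAGE=0.001.
print(effective_lr(0.001, 64, 2))  # -> 0.128
```

Under this reading, the `WARMUP_EPOCHS: 5` / `warmmultistep` settings would ramp the LR from `WARMUP_LR: 0` up to this value before the milestone decays apply.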
(tqdm per-op progress output trimmed: ops 8-88 of 181, layer1-layer3 Conv2d/BatchNorm2d/ReLU/Add, with the two ranks' progress bars interleaved; log truncated mid-stream)
DistributedDataParallel/ResNet[module]/Sequential[layer1]/Bottleneck[2]/BatchNorm2d[bn1]/input.49, type = batch_nor########################5 | 89/181 [00:00<00:00, 4532.07it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/Sequential[downsample]/Conv2d[0]/input.153,########5 | 31/181 [00:00<00:00, 4297.58it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer1]/Bottleneck[2]/ReLU[relu1]/input.51, type = relu_] ########################8 | 90/181 [00:00<00:00, 4527.31it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/Sequential[downsample]/BatchNorm2d[1]/9140,########8 | 32/181 [00:00<00:00, 4363.53it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer1]/Bottleneck[2]/Conv2d[conv2]/input.53, type = _convolution#########################1 | 91/181 [00:00<00:00, 4526.75it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/Add[add]/input.155, type = add] #########1 | 33/181 [00:00<00:00, 4367.69it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer1]/Bottleneck[2]/BatchNorm2d[bn2]/input.55, type = batch_nor#########3 | 34/181 [00:00<00:00, 4382.09it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer1]/Bottleneck[2]/ReLU[relu2]/input.57, type = relu_] #########################6 | 93/181 [00:00<00:00, 4563.61it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/Conv2d[conv1]/input.159, type = _convolutio#########6 | 35/181 [00:00<00:00, 4419.58it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer1]/Bottleneck[2]/Conv2d[conv3]/input.59, type = _convolution#########################9 | 94/181 [00:00<00:00, 4558.87it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/BatchNorm2d[bn1]/input.161, type = batch_no#########9 | 36/181 [00:00<00:00, 4409.13it/s, OpInfo: name = 
DistributedDataParallel/ResNet[module]/Sequential[layer1]/Bottleneck[2]/BatchNorm2d[bn3]/8690, type = batch_norm] ##########################2 | 95/181 [00:00<00:00, 4557.88it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/ReLU[relu1]/input.163, type = relu_] ##########################5 | 96/181 [00:00<00:00, 4574.15it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/Conv2d[conv2]/input.165, type = _convolutio##########################7 | 97/181 [00:00<00:00, 4569.52it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/BatchNorm2d[bn2]/input.167, type = batch_no##########7 | 39/181 [00:00<00:00, 4500.33it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[0]/Conv2d[conv1]/input.65, type = _convolution########################### | 98/181 [00:00<00:00, 4561.45it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/ReLU[relu2]/input.169, type = relu_] ########### | 40/181 [00:00<00:00, 4495.50it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[0]/BatchNorm2d[bn1]/input.67, type = batch_nor###########################3 | 99/181 [00:00<00:00, 4578.23it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/Conv2d[conv3]/input.171, type = _convolutio###########3 | 41/181 [00:00<00:00, 4498.21it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[0]/ReLU[relu1]/input.69, type = relu_] ###########6 | 42/181 [00:00<00:00, 4539.99it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[0]/Conv2d[conv2]/input.71, type = _convolution###########################6 | 100/181 [00:00<00:00, 4573.59it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/BatchNorm2d[bn3]/9220, type = batch_norm] ###########8 | 43/181 
[00:00<00:00, 4524.37it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[0]/BatchNorm2d[bn2]/input.73, type = batch_nor############1 | 44/181 [00:00<00:00, 4522.38it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[0]/ReLU[relu2]/input.75, type = relu_] ############################4 | 103/181 [00:00<00:00, 4602.84it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/Conv2d[conv1]/input.177, type = _convoluti############4 | 45/181 [00:00<00:00, 4564.21it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[0]/Conv2d[conv3]/input.77, type = _convolution############################7 | 104/181 [00:00<00:00, 4599.16it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/BatchNorm2d[bn1]/input.179, type = batch_n############7 | 46/181 [00:00<00:00, 4553.97it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[0]/BatchNorm2d[bn3]/8770, type = batch_norm] ############################# | 105/181 [00:00<00:00, 4598.83it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/ReLU[relu1]/input.181, type = relu_] #############################2 | 106/181 [00:00<00:00, 4615.82it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/Conv2d[conv2]/input.183, type = _convoluti############9 | 47/181 [00:00<00:00, 4537.41it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[0]/Sequential[downsample]/Conv2d[0]/input.79, #############################5 | 107/181 [00:00<00:00, 4612.20it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/BatchNorm2d[bn2]/input.185, type = batch_n#############2 | 48/181 [00:00<00:00, 4519.93it/s, OpInfo: name = 
DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[0]/Sequential[downsample]/BatchNorm2d[1]/8795,#############################8 | 108/181 [00:00<00:00, 4611.05it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/ReLU[relu2]/input.187, type = relu_] #############5 | 49/181 [00:00<00:00, 4518.53it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[0]/Add[add]/input.81, type = add] ##############################1 | 109/181 [00:00<00:00, 4627.32it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/Conv2d[conv3]/input.189, type = _convoluti############## | 51/181 [00:00<00:00, 4583.05it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[1]/Conv2d[conv1]/input.85, type = _convolution##############################3 | 110/181 [00:00<00:00, 4624.37it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/BatchNorm2d[bn3]/9300, type = batch_norm] ##############3 | 52/181 [00:00<00:00, 4573.94it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[1]/BatchNorm2d[bn1]/input.87, type = batch_nor##############6 | 53/181 [00:00<00:00, 4573.28it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[1]/ReLU[relu1]/input.89, type = relu_] ##############9 | 54/181 [00:00<00:00, 4606.50it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[1]/Conv2d[conv2]/input.91, type = _convolution###############################2 | 113/181 [00:00<00:00, 4644.13it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/Conv2d[conv1]/input.195, type = _convoluti###############1 | 55/181 [00:00<00:00, 4598.38it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[1]/BatchNorm2d[bn2]/input.93, type = 
batch_nor###############################4 | 114/181 [00:00<00:00, 4640.39it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/BatchNorm2d[bn1]/input.197, type = batch_n###############4 | 56/181 [00:00<00:00, 4598.39it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[1]/ReLU[relu2]/input.95, type = relu_] ###############################7 | 115/181 [00:00<00:00, 4639.72it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/ReLU[relu1]/input.199, type = relu_] ###############7 | 57/181 [00:00<00:00, 4632.08it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[1]/Conv2d[conv3]/input.97, type = _convolution################################ | 116/181 [00:00<00:00, 4654.99it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/Conv2d[conv2]/input.201, type = _convoluti################ | 58/181 [00:00<00:00, 4622.96it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[1]/BatchNorm2d[bn3]/8875, type = batch_norm] ################################3 | 117/181 [00:00<00:00, 4651.11it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/BatchNorm2d[bn2]/input.203, type = batch_n################################5 | 118/181 [00:00<00:00, 4649.92it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/ReLU[relu2]/input.205, type = relu_] ################################8 | 119/181 [00:00<00:00, 4664.17it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/Conv2d[conv3]/input.207, type = _convoluti################8 | 61/181 [00:00<00:00, 4667.22it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[2]/Conv2d[conv1]/input.103, type = _convolutio#################################1 | 120/181 [00:00<00:00, 4660.64it/s, 
OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[3]/BatchNorm2d[bn3]/9380, type = batch_norm] #################1 | 62/181 [00:00<00:00, 4658.83it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[2]/BatchNorm2d[bn1]/input.105, type = batch_no#################4 | 63/181 [00:00<00:00, 4657.13it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[2]/ReLU[relu1]/input.107, type = relu_] #################################9 | 123/181 [00:00<00:00, 4684.88it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/Conv2d[conv1]/input.213, type = _convoluti#################6 | 64/181 [00:00<00:00, 4676.25it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[2]/Conv2d[conv2]/input.109, type = _convolutio##################################2 | 124/181 [00:00<00:00, 4674.29it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/BatchNorm2d[bn1]/input.215, type = batch_n#################9 | 65/181 [00:00<00:00, 4665.92it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[2]/BatchNorm2d[bn2]/input.111, type = batch_no##################################5 | 125/181 [00:00<00:00, 4671.09it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/ReLU[relu1]/input.217, type = relu_] ##################2 | 66/181 [00:00<00:00, 4639.17it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[2]/ReLU[relu2]/input.113, type = relu_] ##################################8 | 126/181 [00:00<00:00, 4684.75it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/Conv2d[conv2]/input.219, type = _convoluti##################5 | 67/181 [00:00<00:00, 4644.70it/s, OpInfo: name = 
DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[2]/Conv2d[conv3]/input.115, type = _convolutio################################### | 127/181 [00:00<00:00, 4680.65it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/BatchNorm2d[bn2]/input.221, type = batch_n###################################3 | 128/181 [00:00<00:00, 4679.06it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/ReLU[relu2]/input.223, type = relu_] ##################7 | 68/181 [00:00<00:00, 4593.09it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[2]/BatchNorm2d[bn3]/8955, type = batch_norm] ###################################6 | 129/181 [00:00<00:00, 4692.02it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/Conv2d[conv3]/input.225, type = _convoluti###################################9 | 130/181 [00:00<00:00, 4686.90it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[4]/BatchNorm2d[bn3]/9460, type = batch_norm] ###################6 | 71/181 [00:00<00:00, 4591.08it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[3]/Conv2d[conv1]/input.121, type = _convolutio####################################7 | 133/181 [00:00<00:00, 4709.44it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/Conv2d[conv1]/input.231, type = _convoluti###################8 | 72/181 [00:00<00:00, 4587.35it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[3]/BatchNorm2d[bn1]/input.123, type = batch_no##################################### | 134/181 [00:00<00:00, 4705.36it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/BatchNorm2d[bn1]/input.233, type = batch_n####################1 | 73/181 [00:00<00:00, 4577.91it/s, OpInfo: name = 
DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[3]/ReLU[relu1]/input.125, type = relu_] ####################4 | 74/181 [00:00<00:00, 4600.99it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[3]/Conv2d[conv2]/input.127, type = _convolutio#####################################2 | 135/181 [00:00<00:00, 4703.89it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/ReLU[relu1]/input.235, type = relu_] #####################################5 | 136/181 [00:00<00:00, 4716.64it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/Conv2d[conv2]/input.237, type = _convoluti####################7 | 75/181 [00:00<00:00, 4598.35it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[3]/BatchNorm2d[bn2]/input.129, type = batch_no#####################################8 | 137/181 [00:00<00:00, 4706.45it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/BatchNorm2d[bn2]/input.239, type = batch_n####################9 | 76/181 [00:00<00:00, 4599.42it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[3]/ReLU[relu2]/input.131, type = relu_] #####################2 | 77/181 [00:00<00:00, 4625.70it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[3]/Conv2d[conv3]/input.133, type = _convolutio######################################1 | 138/181 [00:00<00:00, 4703.89it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/ReLU[relu2]/input.241, type = relu_] #####################5 | 78/181 [00:00<00:00, 4622.28it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer2]/Bottleneck[3]/BatchNorm2d[bn3]/9035, type = batch_norm] ######################################3 | 139/181 [00:00<00:00, 4715.94it/s, OpInfo: name = 
DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/Conv2d[conv3]/input.243, type = _convoluti######################################6 | 140/181 [00:00<00:00, 4711.68it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[5]/BatchNorm2d[bn3]/9540, type = batch_norm] ######################3 | 81/181 [00:00<00:00, 4666.48it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/Conv2d[conv1]/input.139, type = _convolutio######################6 | 82/181 [00:00<00:00, 4649.06it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/BatchNorm2d[bn1]/input.141, type = batch_no#######################################5 | 143/181 [00:00<00:00, 4723.84it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/Conv2d[conv1]/input.249, type = _convoluti######################9 | 83/181 [00:00<00:00, 4647.83it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/ReLU[relu1]/input.143, type = relu_] #######################################7 | 144/181 [00:00<00:00, 4719.59it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/BatchNorm2d[bn1]/input.251, type = batch_n#######################2 | 84/181 [00:00<00:00, 4669.97it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/Conv2d[conv2]/input.145, type = _convolutio######################################## | 145/181 [00:00<00:00, 4717.53it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/ReLU[relu1]/input.253, type = relu_] #######################4 | 85/181 [00:00<00:00, 4664.36it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/BatchNorm2d[bn2]/input.147, type = batch_no########################################3 | 146/181 [00:00<00:00, 4729.55it/s, OpInfo: name = 
DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/Conv2d[conv2]/input.255, type = _convoluti#######################7 | 86/181 [00:00<00:00, 4655.29it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/ReLU[relu2]/input.149, type = relu_] ########################################6 | 147/181 [00:00<00:00, 4725.56it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/BatchNorm2d[bn2]/input.257, type = batch_n######################## | 87/181 [00:00<00:00, 4675.33it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/Conv2d[conv3]/input.151, type = _convolutio########################################8 | 148/181 [00:00<00:00, 4723.75it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/ReLU[relu2]/input.259, type = relu_] ########################3 | 88/181 [00:00<00:00, 4670.84it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/BatchNorm2d[bn3]/9115, type = batch_norm] #########################################1 | 149/181 [00:00<00:00, 4735.52it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/Conv2d[conv3]/input.261, type = _convoluti########################5 | 89/181 [00:00<00:00, 4671.48it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/Sequential[downsample]/Conv2d[0]/input.153,#########################################4 | 150/181 [00:00<00:00, 4725.34it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/BatchNorm2d[bn3]/9620, type = batch_norm] ########################8 | 90/181 [00:00<00:00, 4664.71it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/Sequential[downsample]/BatchNorm2d[1]/9140,#########################################7 | 151/181 [00:00<00:00, 4722.29it/s, OpInfo: name = 
DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/Sequential[downsample]/Conv2d[0]/input.263#########################1 | 91/181 [00:00<00:00, 4663.93it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[0]/Add[add]/input.155, type = add] #########################################9 | 152/181 [00:00<00:00, 4717.48it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/Sequential[downsample]/BatchNorm2d[1]/9645#########################6 | 93/181 [00:00<00:00, 4702.93it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/Conv2d[conv1]/input.159, type = _convolutio##########################################2 | 153/181 [00:00<00:00, 4715.64it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[0]/Add[add]/input.265, type = add] #########################9 | 94/181 [00:00<00:00, 4699.89it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/BatchNorm2d[bn1]/input.161, type = batch_no##########################################8 | 155/181 [00:00<00:00, 4736.70it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/Conv2d[conv1]/input.269, type = _convoluti##########################2 | 95/181 [00:00<00:00, 4699.81it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/ReLU[relu1]/input.163, type = relu_] ##########################5 | 96/181 [00:00<00:00, 4719.00it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/Conv2d[conv2]/input.165, type = _convolutio########################################### | 156/181 [00:00<00:00, 4733.22it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/BatchNorm2d[bn1]/input.271, type = batch_n##########################7 | 97/181 [00:00<00:00, 4715.10it/s, OpInfo: name = 
DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/BatchNorm2d[bn2]/input.167, type = batch_no###########################################3 | 157/181 [00:00<00:00, 4731.56it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/ReLU[relu1]/input.273, type = relu_] ###########################################6 | 158/181 [00:00<00:00, 4742.48it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/Conv2d[conv2]/input.275, type = _convoluti########################### | 98/181 [00:00<00:00, 4707.03it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/ReLU[relu2]/input.169, type = relu_] ###########################3 | 99/181 [00:00<00:00, 4724.87it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/Conv2d[conv3]/input.171, type = _convolutio###########################################9 | 159/181 [00:00<00:00, 4738.52it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/BatchNorm2d[bn2]/input.277, type = batch_n###########################6 | 100/181 [00:00<00:00, 4719.54it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[1]/BatchNorm2d[bn3]/9220, type = batch_norm] ############################################1 | 160/181 [00:00<00:00, 4736.42it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/ReLU[relu2]/input.279, type = relu_] ############################################4 | 161/181 [00:00<00:00, 4744.42it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/Conv2d[conv3]/input.281, type = _convoluti############################4 | 103/181 [00:00<00:00, 4750.79it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/Conv2d[conv1]/input.177, type = _convoluti############################7 | 104/181 [00:00<00:00, 
4746.34it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/BatchNorm2d[bn1]/input.179, type = batch_n############################################7 | 162/181 [00:00<00:00, 4678.53it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[1]/BatchNorm2d[bn3]/9725, type = batch_norm] ############################# | 105/181 [00:00<00:00, 4745.40it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/ReLU[relu1]/input.181, type = relu_] #############################2 | 106/181 [00:00<00:00, 4762.83it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/Conv2d[conv2]/input.183, type = _convoluti#############################5 | 107/181 [00:00<00:00, 4759.13it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/BatchNorm2d[bn2]/input.185, type = batch_n#############################################5 | 165/181 [00:00<00:00, 4687.90it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[2]/Conv2d[conv1]/input.287, type = _convoluti#############################8 | 108/181 [00:00<00:00, 4758.54it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/ReLU[relu2]/input.187, type = relu_] ##############################1 | 109/181 [00:00<00:00, 4776.01it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/Conv2d[conv3]/input.189, type = _convoluti#############################################8 | 166/181 [00:00<00:00, 4684.48it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer4]/Bottleneck[2]/BatchNorm2d[bn1]/input.289, type = batch_n##############################3 | 110/181 [00:00<00:00, 4771.53it/s, OpInfo: name = DistributedDataParallel/ResNet[module]/Sequential[layer3]/Bottleneck[2]/BatchNorm2d[bn3]/9300, type = batch_norm] 
[Graph-tracing progress from both ranks, interleaved; per-op progress lines (OpInfo: _convolution / batch_norm / relu_ / add nodes of layer3–layer4) trimmed. Both ranks finished tracing:]
##################################################| 181/181 [00:00<00:00, 4694.23it/s, OpInfo: name = return_0, type = Return]

[VAIQ_ERROR]: Unsupported Ops: {'_record_function_enter'}

Traceback (most recent call last):
  File "playcls/quantize/QAT.py", line 101, in <module>
    args=(cfg, args),
  File "./playcls/core/launch.py", line 95, in launch
    start_method=start_method,
  File "/opt/vitis_ai/conda/envs/vitis-ai-pytorch/lib/python3.7/site-packages/torch/multiprocessing/spawn.py", line 188, in start_processes
    while not context.join():
  File "/opt/vitis_ai/conda/envs/vitis-ai-pytorch/lib/python3.7/site-packages/torch/multiprocessing/spawn.py", line 144, in join
    exit_code=exitcode
torch.multiprocessing.spawn.ProcessExitedException: process 1 terminated with exit code 1
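A likely cause, judging from the op names in the trace: every node is rooted at `DistributedDataParallel/ResNet[module]/...`, i.e. the NNDCT tracer was handed the DDP wrapper rather than the bare `ResNet`. DDP's `forward()` enters a `torch.autograd.profiler.record_function(...)` context, which appears in the traced graph as `_record_function_enter` — an op the quantizer does not support. A common workaround is to unwrap `.module` before quantization and re-wrap in DDP afterwards for training. A minimal stand-alone sketch of the unwrapping step (the helper name `unwrap_ddp` is ours, not a Vitis AI or PyTorch API):

```python
def unwrap_ddp(model):
    """Return the underlying network if `model` is a DDP-style wrapper.

    DistributedDataParallel (and DataParallel) keep the real network in a
    `.module` attribute; a plain model has no such attribute, so it is
    returned unchanged. Handing the unwrapped model to the NNDCT tracer
    keeps DDP's profiler annotations (`_record_function_enter`) out of
    the traced graph.
    """
    return getattr(model, "module", model)
```

In a flow like the one in `playcls/quantize/QAT.py`, one would build the quantized model from `unwrap_ddp(model)` and only then wrap the quantizer's output in `DistributedDataParallel` for the QAT epochs; whether that ordering fits this codebase depends on how `qat_trainer` constructs the model.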