Commit: Merge branch 'main' into feature/txl
Showing 149 changed files, with 9,529 additions and 450 deletions.
inference/benchmarks/yolov5/pytorch/kunlunxin_requirements.txt (2 additions, 0 deletions)

@@ -0,0 +1,2 @@
+pycocotools
+opencv-python-headless
inference/configs/bertLarge/vendor_config/kunlunxin_configurations.yaml (4 additions, 0 deletions)

@@ -0,0 +1,4 @@
+compiler: xtcl
+no_validation: true
+vm_enable: false
+exist_onnx_path: onnxs/bertLarge/bertLarge_bs32_pytorch_fp16False.onnx
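Vendor configs like the one above are flat `key: value` YAML files. As an illustration only (this is not FlagPerf's actual loader, which would typically use PyYAML's `safe_load`), a minimal dependency-free sketch of reading such a flat config:

```python
def load_flat_config(text):
    """Parse a flat "key: value" config such as kunlunxin_configurations.yaml.

    Hypothetical stand-in for a YAML loader; handles only the scalar types
    these vendor configs use (strings, booleans, numbers) and no nesting.
    """
    config = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line or ":" not in line:
            continue
        key, value = (part.strip() for part in line.split(":", 1))
        if value in ("true", "false"):
            config[key] = value == "true"       # YAML-style booleans
        else:
            try:
                config[key] = float(value)      # numbers like 1.55e10
            except ValueError:
                config[key] = value or None     # plain strings; empty -> None

    return config


config = load_flat_config(
    "compiler: xtcl\n"
    "no_validation: true\n"
    "vm_enable: false\n"
    "exist_onnx_path: onnxs/bertLarge/bertLarge_bs32_pytorch_fp16False.onnx\n"
)
```

In a real setup PyYAML (`yaml.safe_load`) would also cover the nested `build_config` blocks seen in other files of this commit, which this flat sketch deliberately does not handle.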
inference/configs/resnet50/vendor_config/zixiao_configurations.yaml (6 additions, 0 deletions)

@@ -0,0 +1,6 @@
+compiler: zxrt
+no_validation: true
+batch_size: 50000
+exist_onnx_path: onnxs/resnet50_pytorch.onnx
+repeat: 1
+zixiao_test_batch_size: 32
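This config pairs a large logical `batch_size` (50000) with a small device batch (`zixiao_test_batch_size: 32`), which implies the harness must split each logical batch into device-sized chunks. A hypothetical helper (not taken from the FlagPerf source) sketching the ceiling division involved:

```python
def num_subbatches(batch_size, device_batch_size):
    # Hypothetical helper: how many device-sized sub-batches are needed to
    # cover one logical batch (ceiling division).
    full, remainder = divmod(batch_size, device_batch_size)
    return full + (1 if remainder else 0)
```

With the values above, one 50000-sample batch would require 1563 passes at device batch 32 (1562 full sub-batches plus a 16-sample remainder).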
inference/configs/sam_h/vendor_config/kunlunxin_configurations.yaml (10 additions, 0 deletions)

@@ -0,0 +1,10 @@
+compiler: xtcl
+no_validation: true
+build_config:
+  FuseWithoutPattern:
+    - FuseConv2dTransposeBiasAdd
+  pattern_match:
+    - fuse_attention_sam
+  disabled_pass:
+    - xgraph_layout_opt
+exist_onnx_path: onnxs/sam_h_bs4_pytorch_fp16True.onnx
inference/configs/stable_diffusion_v1_4/vendor_config/kunlunxin_configurations.yaml (3 additions, 0 deletions)

@@ -0,0 +1,3 @@
+fp16: false
+compiler: xtcl
+no_validation: true
inference/configs/swinTransformer/vendor_config/kunlunxin_configurations.yaml (16 additions, 0 deletions)

@@ -0,0 +1,16 @@
+batch_size: 256
+# flops for 1 item (like 1 sequence or 1 image)
+# Attention! For a transformer decoder like bert, 1 token causes 2*param flops, so we need 2*length*params, like 2*512*0.33B here
+# format: a_1*a_2*...*a_n, like 2*512*0.33e9 (bert) or 4.12e9 (resnet50)
+flops: 1.55e10
+fp16: false
+compiler: xtcl
+num_workers: 8
+log_freq: 30
+repeat: 5
+# skip validation (will also skip create_model and onnx export). Assert exist_onnx_path != null
+no_validation: true
+# set a real onnx_path to use an existing one, or set it to anything but null to avoid exporting onnx manually (like torch-tensorrt)
+exist_onnx_path: /home/liuyu/flagperf/FlagPerf/inference/onnxs/kunlunxin_flagperf_swinTransformer/swinTransformer_bs256_pytorch_fp16False.onnx
+# set an existing path to an engine file like resnet50.trt/resnet50.plan/resnet50.engine
+exist_compiler_path: null
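The comment above describes `flops` as a product expression in the form `a_1*a_2*...*a_n`, e.g. `2*512*0.33e9` for bert or a single factor like `4.12e9` for resnet50. A small sketch of evaluating that format without `eval()` (illustration only, not the FlagPerf parser):

```python
from functools import reduce
from operator import mul


def parse_flops(expr):
    # Evaluate a product expression like "2*512*0.33e9" or "4.12e9":
    # split on "*", convert each factor to float, multiply them together.
    return reduce(mul, (float(factor) for factor in expr.split("*")))
```

For bert's `2*512*0.33e9`, this yields 2 tokens-worth of flops per parameter times sequence length 512 times 0.33e9 parameters, i.e. about 3.38e11 flops per sequence.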
inference/configs/vit_l_16/vendor_config/kunlunxin_configurations.yaml (5 additions, 0 deletions)

@@ -0,0 +1,5 @@
+compiler: xtcl
+# skip validation (will also skip create_model and onnx export). Assert exist_onnx_path != null
+no_validation: true
+# set a real onnx_path to use an existing one, or set it to anything but null to avoid exporting onnx manually (like torch-tensorrt)
+exist_onnx_path: /home/FlagPerf/inference/onnxs/vit_l_16_bs32_pytorch_fp16False.onnx
kunlunxin monitor log analysis script (2 deletions; file path not shown in this view)

@@ -1,23 +1,21 @@
 def analysis_log(logpath):
     logfile = open(logpath)

     max_usage = 0.0  ## usage_mem
     max_mem = 0.0
     for line in logfile.readlines():
         '''
         xpu_smi temp power mem w_mem use_rate
         '''
         if "xpu_smi" in line:
             line = line[:-1]
             usage = line.split(" ")[4]
             usage = float(usage)
             max_usage = max(max_usage, usage)
             max_mem = line.split(" ")[5]
             max_mem = float(max_mem)

     return round(max_usage / 1024.0,
                  2), round(max_mem / 1024.0, 2), eval("32e12"), eval("128e12")

-
-if __name__ == "__main__":
-    max1, max2, max3, max4 = analysis_log("/home/zhoujiamin01/workspace/zjm_flag/FlagPerf/inference/result/run20230809192313/resnet50:pytorch_1.13/127.0.0.1_noderank0/kunlunxin_monitor.log")