
[CM] Enable inference for ResNet-50 with TVM - PyTorch backend #7

Closed
KJlaccHoeUM9l opened this issue Jan 31, 2023 · 5 comments


KJlaccHoeUM9l commented Jan 31, 2023

Based on this tutorial for TVM-ONNX (MLPerf inference - Python - ResNet50 FP32 - ImageNet - TVM - CPU - Offline), you need to add the ability to run inference for TVM with the PyTorch frontend.

To run this inference, you need to change the backend parameter as follows:

--backend=tvm-pytorch

When you select this backend, you will be taken to the corresponding branches here and here.

The result of this work should be a PR prepared for the official CK.
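
For reference, a minimal sketch of the kind of backend dispatch those branches implement; the module and class names below are assumptions for illustration, not the actual reference code:

def get_backend(backend):
    # Hypothetical dispatch; backend_tvm and BackendTVM are assumed names,
    # not taken from the official MLPerf/CK sources.
    if backend == "tvm-onnx":
        from backend_tvm import BackendTVM
        return BackendTVM(frontend="onnx")
    elif backend == "tvm-pytorch":
        from backend_tvm import BackendTVM
        return BackendTVM(frontend="pytorch")
    raise ValueError(f"Unknown backend: {backend}")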

KJlaccHoeUM9l added the enhancement (New feature or request) label on Jan 31, 2023
KJlaccHoeUM9l (Author) commented:

CC @Ailurus1

SennikovAndrey (Collaborator) commented:

Tried to use the following script for --backend=tvm-pytorch:

-quiet          --clean

And received the following error:

Traceback (most recent call last):
  File "/root/CM/repos/mlcommons@ck/cm-mlops/script/get-ml-model-resnet50-tvm/process.py", line 39, in <module>
    pytorch_model = torch.jit.load(model_path)
  File "/usr/local/lib/python3.8/dist-packages/torch/jit/_serialization.py", line 162, in load
    cpp_module = torch._C.import_ir_module(cu, str(f), map_location, _extra_files)
RuntimeError: PytorchStreamReader failed reading zip archive: failed finding central directory

CM error: Portable CM script failed (name = get-ml-model-resnet50-tvm, return code = 256)

I have one small new idea for a solution; I will try it next.

SennikovAndrey (Collaborator) commented:

Correct code:

cm run script --tags=run,mlperf,inference,generate-run-cmds \
         --adr.python.name=mlperf \
         --adr.python.version_min=3.8 \
         --adr.tvm.tags=_pip-install \
         --submitter="Community" \
         --implementation=python \
         --hw_name=default \
         --model=resnet50 \
         --backend=tvm-pytorch \
         --device=cpu \
         --scenario=Offline \
         --mode=accuracy \
         --test_query_count=5 \
         --quiet \
         --clean

SennikovAndrey (Collaborator) commented:

The problem came from the different file types ('.pt' and '.pth'). I fixed it with an if/else construction that handles the two cases separately. But now I have problems with some of the next commands.
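
A minimal sketch of that if/else handling; the example path and the torchvision constructor are assumptions, not the actual get-ml-model-resnet50-tvm code:

import torch
import torchvision

model_path = "resnet50.pth"  # example path (assumption)

if model_path.endswith(".pt"):
    # A '.pt' file is assumed to be a TorchScript archive
    pytorch_model = torch.jit.load(model_path)
else:
    # A '.pth' file is assumed to hold a plain state_dict for torchvision's ResNet-50
    pytorch_model = torchvision.models.resnet50()
    pytorch_model.load_state_dict(torch.load(model_path, map_location="cpu"))

pytorch_model.eval()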

Now I am trying to fix the following error:

Traceback (most recent call last):
  File "/root/CM/repos/Deelvin@ck/cm-mlops/script/get-ml-model-resnet50-tvm/process.py", line 53, in <module>
    mod, params = relay.frontend.from_pytorch(pytorch_model, shape_list)
  File "/usr/local/lib/python3.8/dist-packages/tvm/relay/frontend/pytorch.py", line 4503, in from_pytorch
    graph = script_module.graph.copy()
  File "/usr/local/lib/python3.8/dist-packages/torch/nn/modules/module.py", line 1269, in __getattr__
    raise AttributeError("'{}' object has no attribute '{}'".format(
AttributeError: 'ResNet' object has no attribute 'graph'

CM error: Portable CM script failed (name = get-ml-model-resnet50-tvm, return code = 256)
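
This AttributeError typically means an eager nn.Module was passed to relay.frontend.from_pytorch, which expects a TorchScript (traced or scripted) module. A minimal sketch of tracing before conversion; the input name and shape are assumptions:

import torch
import torchvision
from tvm import relay

# relay.frontend.from_pytorch expects a TorchScript module, so trace the eager model first
model = torchvision.models.resnet50().eval()
input_shape = (1, 3, 224, 224)
traced_model = torch.jit.trace(model, torch.randn(input_shape)).eval()

shape_list = [("input_tensor", input_shape)]  # name and shape are illustrative
mod, params = relay.frontend.from_pytorch(traced_model, shape_list)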

Ailurus1 self-assigned this on Oct 14, 2023
Ailurus1 (Collaborator) commented:

Solved here
