[CM] Enable inference for ResNet-50 with TVM - PyTorch backend #7
Comments
CC @Ailurus1
Tried to use the following script with `--backend=tvm-pytorch` and received the following error.
I have one small new idea for a solution and will try it next.
Correct code:
The problem arose from the two different model file types ('.pt' and '.pth'). I fixed it with an if/else construction that handles the two cases separately. But now I have problems with some of the subsequent commands, and I am trying to fix the following error:
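The '.pt' vs. '.pth' fix described above can be sketched as an extension-based dispatch. This is a minimal illustration, not the actual patch: the function name and the returned labels are hypothetical, and the real code would call the loaders noted in the comments (TorchScript archives are conventionally saved as '.pt', plain `state_dict` checkpoints as '.pth').

```python
from pathlib import Path

def pick_loader(model_path: str) -> str:
    """Decide how to load a PyTorch model file based on its extension.

    Hypothetical sketch: '.pt' is assumed to be a TorchScript archive,
    '.pth' a plain state_dict checkpoint, so each needs a different
    loading path.
    """
    suffix = Path(model_path).suffix.lower()
    if suffix == ".pt":
        # real code would use: torch.jit.load(model_path)
        return "torchscript"
    elif suffix == ".pth":
        # real code would use: model.load_state_dict(torch.load(model_path))
        return "state_dict"
    raise ValueError(f"unsupported model file type: {suffix!r}")

print(pick_loader("resnet50.pt"))   # torchscript
print(pick_loader("resnet50.pth"))  # state_dict
```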
Solved here.
Based on this tutorial for TVM-ONNX (MLPerf inference - Python - ResNet50 FP32 - ImageNet - TVM - CPU - Offline), you need to add the ability to run inference for TVM with the PyTorch frontend. To run this inference, you need to change the `backend` parameter accordingly (e.g. `--backend=tvm-pytorch`). When you select this backend, you will be taken to the corresponding branches here and here.
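The backend selection described above typically works as a name-based dispatch in the harness entry point. A minimal sketch of that pattern, assuming a registry keyed by the `--backend` value (the function name and the backend labels below are illustrative, not the actual repository code):

```python
# Hypothetical name-based backend dispatch, MLPerf-harness style.
# The registry maps the --backend CLI value to a backend identifier;
# a real harness would map to backend classes instead of strings.
_BACKENDS = {
    "tvm-onnx": "BackendTVMOnnx",        # existing ONNX-frontend path
    "tvm-pytorch": "BackendTVMPyTorch",  # new PyTorch-frontend path to add
}

def get_backend(name: str) -> str:
    """Resolve a --backend value to its backend identifier."""
    try:
        return _BACKENDS[name]
    except KeyError:
        raise ValueError(f"unknown backend: {name!r}") from None

print(get_backend("tvm-pytorch"))  # BackendTVMPyTorch
```

Adding the new `tvm-pytorch` entry to such a registry is what routes execution into the corresponding branches mentioned above.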
The result of this work should be a prepared PR for the official CK.