Also, in the test console there are 64 instances of the following warnings. It seems like there are issues exporting from the pytorch frontend to ONNX:
WARNING: projection is not supported for torch version less than 1.8.0! LSTM was constructed without projection!
WARNING: torch.onnx.export does not support conversion LSTM with projection from pytorch! TODO: waiting for the support and correct test after that.
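The version gate behind the first warning can be sketched in plain Python. `torch_version_at_least` is a hypothetical helper, not TVM's actual code; the real test would compare `torch.__version__` in a similar spirit before deciding whether to pass `proj_size`:

```python
def torch_version_at_least(version_str, minimum=(1, 8, 0)):
    """Compare a torch version string such as "1.8.0+cu111" against a minimum."""
    core = version_str.split("+")[0]                       # drop build suffixes like "+cu111"
    parts = tuple(int(p) for p in core.split(".")[:3])     # "1.8.0" -> (1, 8, 0)
    parts += (0,) * (3 - len(parts))                       # pad short versions: "1.8" -> (1, 8, 0)
    return parts >= minimum

# An LSTM test could branch on this: below 1.8.0, warn and construct the
# LSTM without projection, since proj_size does not exist yet.
print(torch_version_at_least("1.7.0"))  # → False
print(torch_version_at_least("1.8.1"))  # → True
```

This sketch only handles plain numeric versions; pre-release strings like "1.8.0a0" would need extra parsing.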
Suggested actions
- Try to minimize the size of the model test cases even further.
- Consider minimizing the number of calls to torch.onnx.export until we get a nightly CI in place.
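One way to reduce repeated torch.onnx.export calls, as the second suggestion proposes, is to cache the exported file per model configuration so each distinct shape is exported only once per test session. `cached_export_path` is a hypothetical helper sketching the idea, with the actual export call stubbed out as a comment:

```python
import functools
import os
import tempfile

@functools.lru_cache(maxsize=None)
def cached_export_path(input_size, hidden_size, num_layers):
    """Return the ONNX file path for this LSTM configuration, exporting at most once."""
    path = os.path.join(tempfile.gettempdir(),
                        f"lstm_{input_size}_{hidden_size}_{num_layers}.onnx")
    # A real test would export here (assumption, not shown in the issue):
    #   torch.onnx.export(build_lstm(input_size, hidden_size, num_layers),
    #                     dummy_input, path)
    return path

# A second call with the same configuration hits the cache, skipping the export.
first = cached_export_path(8, 16, 1)
second = cached_export_path(8, 16, 1)
```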
These warnings are part of the TVM code. The first one relates to the fact that LSTM with projection is supported by pytorch only starting from version 1.8.0, while TVM works stably with pytorch 1.7.0. The second warning relates to differences between the LSTM variants supported by onnx and pytorch: in some cases the conversion from a pytorch LSTM to an ONNX one fails. But this is an external issue that cannot be solved on the TVM side, so I did not raise an issue ticket.
Context
CI is taking a while (see #8552 and build#1384), and I am tracking short-term fixes to alleviate this.
Summary of the issue
test_lstms.py was recently added in PR #8447. It takes 20+ minutes to run 8 network-sized tests. The tests use the following parameters, which don't seem minimal: