[Bug] [Relay] attribute track_running_stats of InstanceNorm lead to wrong inference results #14926
Labels
needs-triage, type: bug
For the layers InstanceNorm1d and InstanceNorm3d, if the attribute track_running_stats is set to True, TVM gives inference results that differ from PyTorch's.

Expected behavior
For the same input data, TVM and PyTorch give the same inference results.
Actual behavior
Steps to reproduce
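A minimal sketch of a reproduction (not the original script from the report): it builds an InstanceNorm1d layer with track_running_stats=True, runs it in PyTorch eval mode, converts the traced module to Relay via relay.frontend.from_pytorch, and compares the outputs. The input shape, the input name "input0", and the tolerances are illustrative assumptions.

```python
# Hypothetical reproduction sketch: compare PyTorch and TVM outputs for
# InstanceNorm1d with track_running_stats=True.
import numpy as np
import torch
import tvm
from tvm import relay
from tvm.contrib import graph_executor

# InstanceNorm1d with running statistics enabled; eval() makes PyTorch
# normalize with the tracked running mean/var instead of per-instance stats.
model = torch.nn.InstanceNorm1d(3, affine=True, track_running_stats=True).eval()

input_shape = (1, 3, 10)  # (N, C, L); illustrative shape
x = torch.randn(input_shape)

with torch.no_grad():
    torch_out = model(x).numpy()

# Convert the traced module to Relay and compile for CPU.
traced = torch.jit.trace(model, x)
mod, params = relay.frontend.from_pytorch(traced, [("input0", input_shape)])
with tvm.transform.PassContext(opt_level=3):
    lib = relay.build(mod, target="llvm", params=params)

runtime = graph_executor.GraphModule(lib["default"](tvm.cpu(0)))
runtime.set_input("input0", x.numpy())
runtime.run()
tvm_out = runtime.get_output(0).numpy()

# Expected to fail if the converter ignores track_running_stats and always
# normalizes with per-instance statistics.
np.testing.assert_allclose(torch_out, tvm_out, rtol=1e-5, atol=1e-5)
```

The same comparison can be repeated with torch.nn.InstanceNorm3d and a 5-D input (N, C, D, H, W) to cover the other layer mentioned above.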
Triage
cc @shingjan