Fix input mutation in branched networks #38

Open · wants to merge 1 commit into master
Conversation


@mbalesni commented on Jan 5, 2022

This fixes a bug in networks that apply in-place operations (ReLU, ELU, LeakyReLU) immediately after a branching point.

Currently, a layer output that is consumed by several branches can be mutated by an activation function, since these activations are constructed with inplace=True hardcoded.

Standalone reproduction of the bug:
https://colab.research.google.com/drive/1VBRYou450YNgJdwiDZVRsCfJKUs4FiWh#scrollTo=mKvLPpKz29D
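
For illustration, here is a minimal in-memory sketch of the same failure mode (not taken from the PR or the converter; the tensor shapes and branch ops are made up): an in-place ELU applied on one branch silently rewrites the tensor that a second branch still needs.

```python
import torch
import torch.nn as nn

with torch.no_grad():
    shared = torch.randn(1, 8)            # output of some layer, consumed by two branches
    snapshot = shared.clone()             # the values branch B *should* see

    out_a = nn.ELU(inplace=True)(shared)  # branch A: mutates `shared` in place
    out_b = shared * 2                    # branch B: silently operates on ELU'd values

    print(torch.equal(shared, snapshot))  # False -> branch B received a corrupted input
```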

Change: the activation functions' inplace flag now depends on whether the current node's input has a branching factor greater than one, i.e., whether it is consumed by more than one node.
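
A rough sketch of the idea (the helper names here are hypothetical, not the converter's actual internals): count how many graph nodes consume each tensor, and only allow inplace=True when every input of the node is consumed exactly once.

```python
from collections import Counter
import torch.nn as nn

def count_consumers(graph):
    """Count how many ONNX nodes read each tensor name."""
    usage = Counter()
    for node in graph.node:               # onnx GraphProto exposes a repeated .node field
        for name in node.input:
            usage[name] += 1
    return usage

def build_activation(op_type, node, usage):
    """Use an in-place activation only when no other branch shares the input."""
    safe_inplace = all(usage[name] == 1 for name in node.input)
    ops = {"Relu": nn.ReLU, "Elu": nn.ELU, "LeakyRelu": nn.LeakyReLU}
    return ops[op_type](inplace=safe_inplace)
```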

Tests: I verified that the change passes the existing tests and does not break conversion of the models fetched by download_fixtures.sh.

Example of an affected architecture: Openpilot's supercombo.onnx model:

  • the ELU layer (highlighted in red in the bottom-left corner) mutates the output of its input layer (Flatten)
  • another branch that starts from that same Flatten output then receives the mutated tensor, leading to incorrect predictions

[Image: example network architecture]

Let me know if there is any way I can improve it!
