
[MXNET-886] ONNX export: HardSigmoid, Less, Greater, Equal #12812

Merged: 2 commits merged into apache:master on Nov 29, 2018

Conversation

@vandanavk (Contributor) commented Oct 12, 2018

Description

Added support for exporting HardSigmoid, Less, Greater, Equal to ONNX.

v2: Addressed @anirudhacharya's comments on the tests for comparison operators.

Checklist

Essentials

Please feel free to remove inapplicable items for your PR.

  • The PR title starts with [MXNET-$JIRA_ID], where $JIRA_ID refers to the relevant JIRA issue created (except PRs with tiny changes)
  • Changes are complete (i.e. I finished coding on this PR)
  • All changes have test coverage:
  • Unit tests are added for small changes to verify correctness (e.g. adding a new operator)
  • Nightly tests are added for complicated/long-running ones (e.g. changing distributed kvstore)
  • Build tests will be added for build configuration changes (e.g. adding a new build option with NCCL)
  • Code is well-documented:
  • For user-facing API changes, API doc string has been updated.
  • For new C++ functions in header files, their functionalities and arguments are documented.
  • For new examples, README.md is added to explain what the example does, the source of the dataset, expected performance on the test set, and a reference to the original paper if applicable
  • Check the API doc at http://mxnet-ci-doc.s3-accelerate.dualstack.amazonaws.com/PR-$PR_ID/$BUILD_ID/index.html
  • To my best knowledge, examples are either not affected by this change, or have been fixed to be compatible with this change

Changes

  • Exported the 4 operators in _op_translations.py (a sketch of one such converter appears after this list)
  • Added tests for the 4 operators
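
For reference, each exported operator gets a translation function in _op_translations.py that maps the MXNet node's attributes onto the corresponding ONNX node. The sketch below is illustrative only and is written as if it lived inside _op_translations.py (so the mx_op registry, the get_inputs helper, and the onnx module are assumed to be available from that file); it is not necessarily the exact code merged in this PR.

```python
# Illustrative sketch, assuming this sits inside _op_translations.py.
@mx_op.register("hard_sigmoid")
def convert_hardsigmoid(node, **kwargs):
    """Map MXNet's hard_sigmoid operator to ONNX's HardSigmoid node."""
    name, input_nodes, attrs = get_inputs(node, kwargs)
    # HardSigmoid(x) = max(0, min(1, alpha * x + beta)); 0.2 and 0.5 are the defaults.
    alpha = float(attrs.get("alpha", 0.2))
    beta = float(attrs.get("beta", 0.5))
    hard_sigmoid_node = onnx.helper.make_node(
        "HardSigmoid",
        input_nodes,
        [name],
        alpha=alpha,
        beta=beta,
        name=name,
    )
    return [hard_sigmoid_node]
```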

Comments

@vandanavk vandanavk requested a review from szha as a code owner October 12, 2018 16:10
@vandanavk vandanavk changed the title ONNX export: HardSigmoid, Less, Greater, Equal [MXNET-886] ONNX export: HardSigmoid, Less, Greater, Equal Oct 12, 2018
@vandanavk vandanavk changed the title [MXNET-886] ONNX export: HardSigmoid, Less, Greater, Equal [MXNET-886] ONNX export: HardSigmoid, Less, Greater, Equal, InstanceNorm Oct 12, 2018
@vandanavk vandanavk force-pushed the onnx_op branch 2 times, most recently from d779756 to 9d40dff on October 12, 2018 18:45
@sandeep-krishnamurthy sandeep-krishnamurthy added the ONNX and pr-awaiting-review (PR is waiting for code review) labels Oct 12, 2018
@sandeep-krishnamurthy (Contributor) commented:

@Roshrini @anirudhacharya


if __name__ == '__main__':
test_greater()
Review comment (Member):

Do not call each of the test methods separately. These tests are written for the pytest framework; you can use unittest.main(), and pytest will collect the tests to run at runtime.
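
A minimal sketch of the suggested pattern (the test_* functions themselves are assumed to be defined in the same module):

```python
import unittest

# Hand control to the test runner instead of invoking each test_* function
# by hand; pytest (or unittest) will discover and collect the tests itself.
if __name__ == '__main__':
    unittest.main()
```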

"""Test for logical greater in onnx operators."""
input1 = np.array([[ 1, 1, 1], [ 1, 1, 1]]).astype("float32")
input2 = np.array([[ 0], [ 1]]).astype("float32")
op = np.array([[ 1, 1, 1], [ 0, 0, 0]]).astype("float32")
Review comment (Member):

Do not hard-code this operation; use np.greater. Also, please give this variable a different name, such as expected_value or numpy_op.
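
A sketch of that suggestion, using the same inputs as the excerpt above (the name expected_output is just one possible choice):

```python
import numpy as np

input1 = np.array([[1, 1, 1], [1, 1, 1]], dtype="float32")
input2 = np.array([[0], [1]], dtype="float32")
# Compute the expected result with numpy instead of hard-coding it.
expected_output = np.greater(input1, input2).astype("float32")
```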

"""Test for equal in onnx operators."""
input1 = np.array([[ 1, 1, 1], [ 1, 1, 1]]).astype("float32")
input2 = np.array([[ 0], [ 1]]).astype("float32")
op = np.array([[ 0, 0, 0], [ 1, 1, 1]]).astype("float32")
Review comment (Member):

Do not hard-code this operation; use np.equal. Also, please give this variable a different name, such as expected_value or numpy_op.

"""Test for lesser in onnx operators."""
input1 = np.array([[ 1, 1, 1], [ 1, 1, 1]]).astype("float32")
input2 = np.array([[ 0], [ 1]]).astype("float32")
op = np.array([[ 0, 0, 0], [ 0, 0, 0]]).astype("float32")
Review comment (Member):

Do not hard-code this operation; use np.less. Also, please give this variable a different name, such as expected_value or numpy_op.
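
The corresponding sketch for the lesser test (numpy's element-wise "less than" comparison is np.less):

```python
import numpy as np

input1 = np.array([[1, 1, 1], [1, 1, 1]], dtype="float32")
input2 = np.array([[0], [1]], dtype="float32")
# No element of input1 is less than the broadcast input2, so all zeros.
expected_output = np.less(input1, input2).astype("float32")
```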

@vandanavk vandanavk force-pushed the onnx_op branch 3 times, most recently from f9ba301 to ae66c1c on October 16, 2018 00:01
@Roshrini (Member) left a comment:

Thanks, Vandana, for adding these operators. I added a few comments.

Also, make sure you add the "tests/python-pytest/onnx/export/mxnet_export_test.py" file to CI.

python/mxnet/contrib/onnx/mx2onnx/_op_translations.py (outdated review thread, resolved)
python/mxnet/contrib/onnx/mx2onnx/export_onnx.py (outdated review thread, resolved)
@vandanavk (Contributor, Author):

Thanks @Roshrini. I made the changes you suggested and submitted them.

@Roshrini (Member):

Thanks for fixing :) LGTM

@anirudhacharya (Member) left a comment:

LGTM

tests/python-pytest/onnx/export/mxnet_export_test.py (outdated review thread, resolved)
@vandanavk (Contributor, Author):

@zhreshold for review

test_mod.init_params()
else:
test_mod.set_params(arg_params=arg_params, aux_params=aux_params, allow_missing=True)
# module bind method requires all data to have same batch size,
Review comment (Member):

Why not use sym.bind here instead?

Reply (Contributor, Author):

The tests for batchnorm and conv2d alone fail due to missing keys; I will debug further and update.

Reply (Contributor, Author):

@zhreshold Will debug this and submit a separate PR for this commit and InstanceNorm (which depends on this). This will make sure that HardSigmoid and the comparison operators can be reviewed/merged instead of waiting on this. Hope that's ok.

Review comment (Member):

@vandanavk Making sure that only high-quality code gets merged into master is critical. A pressing feature request isn't a good reason to bypass this restriction.

Reply (Contributor, Author):

Agreed, @zhreshold. I opened a WIP PR #12920 to continue debugging.

The changes in this PR for HardSigmoid and the comparison operators are not impacted by the changes in PR #12920.

mod.bind(for_training=False, data_shapes=data_shapes,
label_shapes=None)
mod.set_params(arg_params=self.arg_params, aux_params=self.aux_params)
# module bind method requires all data to have same batch size,
Review comment (Member):

This is duplicated in onnx_export.py.

@vandanavk vandanavk changed the title [MXNET-886] ONNX export: HardSigmoid, Less, Greater, Equal, InstanceNorm [MXNET-886] ONNX export: HardSigmoid, Less, Greater, Equal Oct 23, 2018
@vandanavk (Contributor, Author):

@zhreshold Please let me know if there are any other inputs on this PR. If this PR is good to go, I can rebase my other PRs on top of this.

@zhreshold (Member):

@vandanavk What is the PR that is going to correct the repetitive importing?

@vandanavk (Contributor, Author):

@zhreshold This PR #12878

@zhreshold (Member):

Please rebase onto the changes in #12878.

@vandanavk (Contributor, Author):

Rebased. Ready for review again

@vandanavk vandanavk force-pushed the onnx_op branch 4 times, most recently from 37aa0aa to cb5e291 on November 2, 2018 19:36
@ankkhedia (Contributor):

@zhreshold @sandeep-krishnamurthy @nswamy Could you please help with review/merge?

@@ -238,6 +238,63 @@ def test_square():

npt.assert_almost_equal(result, numpy_op)

@with_seed()
def test_greater():
Review comment (Member):

I am seeing a lot of repetition in the test code. Could you please modularize it? Happy to discuss offline.
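
One hypothetical way to modularize the comparison tests is to parametrize them over the MXNet/numpy operator pair. The sketch below is an illustration only: export_and_run stands in for whatever ONNX export/import helper the test file actually uses (its name and signature here are assumptions), while broadcast_greater, broadcast_equal, and broadcast_lesser are the MXNet broadcast comparison operators.

```python
import numpy as np
import pytest

def export_and_run(op_name, input1, input2):
    # Placeholder for the real MXNet -> ONNX export/import round trip
    # performed by the existing helpers in mxnet_export_test.py.
    raise NotImplementedError

@pytest.mark.parametrize("op_name, np_op", [
    ("broadcast_greater", np.greater),
    ("broadcast_equal", np.equal),
    ("broadcast_lesser", np.less),
])
def test_comparison_operators(op_name, np_op):
    input1 = np.array([[1, 1, 1], [1, 1, 1]], dtype="float32")
    input2 = np.array([[0], [1]], dtype="float32")
    # Expected result computed with numpy rather than hard-coded.
    expected_output = np_op(input1, input2).astype("float32")
    result = export_and_run(op_name, input1, input2)
    np.testing.assert_almost_equal(result, expected_output)
```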

Comment (Contributor):

@nswamy @vandanavk Was the concern addressed offline? What's the status of this PR?

@vandanavk vandanavk force-pushed the onnx_op branch 2 times, most recently from 77e0bf1 to 9d440cf on November 20, 2018 22:51
@nswamy nswamy added the pr-awaiting-merge label (Review and CI is complete. Ready to Merge) and removed the pr-awaiting-review label (PR is waiting for code review) Nov 20, 2018
@vandanavk vandanavk force-pushed the onnx_op branch 4 times, most recently from 5cc157e to de4c390 on November 26, 2018 17:48
@sandeep-krishnamurthy sandeep-krishnamurthy merged commit f876461 into apache:master Nov 29, 2018
Labels: ONNX, pr-awaiting-merge (Review and CI is complete. Ready to Merge)
8 participants