Make RandomHorizontalFlip torchscriptable #2282
Conversation
Changes LGTM. How common is the case where a transform would need to be saved in its own file? Also, what do you mean exactly by the custom `__str__` implementations?
I guess generally I would say they should only be `nn.Module`s if they would be `nn.Module`s without considering TorchScript. Not sure how that applies here exactly.
This is a good question. @vreis do you have any insights here? I would expect that most of the models in the future might have a transform attached to them, in which case the transforms wouldn't need to be saved in a standalone file.

That being said, it seemed easier to add support for torchscript by converting the transform to inherit from `nn.Module`. Without that, scripting the transform fails:

```
Traceback (most recent call last):
  File "test/test_transforms_tensor.py", line 33, in test_random_horizontal_flip
    scripted_fn = torch.jit.script(f)
  File "/Users/fmassa/anaconda3/lib/python3.6/site-packages/torch/jit/__init__.py", line 1306, in script
    qualified_name = _qualified_name(obj)
  File "/Users/fmassa/anaconda3/lib/python3.6/site-packages/torch/_jit_internal.py", line 687, in _qualified_name
    name = obj.__name__
AttributeError: 'RandomHorizontalFlip' object has no attribute '__name__'
```

Note that we do have …
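The `AttributeError` above arises because `torch.jit.script` looks up `obj.__name__`, which exists on classes and functions but not on plain instances. A minimal stdlib-only sketch of that failure mode (the `RandomHorizontalFlip` class here is a hypothetical stand-in for the pre-PR transform, not torchvision's actual code):

```python
class RandomHorizontalFlip:
    """Hypothetical plain (non-nn.Module) transform, as before this PR."""

    def __init__(self, p=0.5):
        self.p = p

    def __call__(self, img):
        return img  # flipping logic elided for brevity

f = RandomHorizontalFlip()

# The class itself carries a __name__, but the *instance* handed to
# torch.jit.script does not -- hence the AttributeError in the traceback.
assert RandomHorizontalFlip.__name__ == "RandomHorizontalFlip"
assert not hasattr(f, "__name__")
```

Inheriting from `nn.Module` sidesteps this, since `torch.jit.script` takes a dedicated module-recursion path for `nn.Module` instances instead of treating them as free functions.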
This looks good to me.

We don't have that as a strict requirement, but right now the transforms are tied to the dataset abstraction rather than the model, so I think it makes sense to generate a separate file. What would be nice is if the final torchscript model took pure jpegs and did everything needed to them (decoding, transforms, inference). But having separate files doesn't preclude us from doing that.
* Make RandomVerticalFlip torchscriptable
* Fix lint
This is the first PR in a series towards making the torchvision transforms torchscriptable.

The majority of the work has already been done in the `functional_tensor.py` file, which for now hasn't been properly exposed to the user. The objective of this PR is twofold:

* make `functional` and `transforms` seamlessly support Tensor arguments
* make `functional` and `transforms` be torchscriptable

In order to minimize code changes, I decided for now to move (one at a time) the Pillow transforms to a separate file, and in `functional.py` dispatch to either `functional_tensor.py` or `functional_pil.py`.

Another thing that might be worth discussing: the torchvision transforms now inherit from `nn.Module`. I think this makes things simpler wrt torchscript, and as such might be preferred over keeping them as before. In particular, it will make it possible to save the transforms into a standalone file, and to have custom `__str__` implementations.

cc @eellison for feedback on replacing standard classes with classes inheriting from `nn.Module` for more seamless torchscript support.

This is a re-do of #2278, but this time from a branch on `pytorch/vision`, so that I can try stacking PRs together.
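To illustrate what the `nn.Module` change enables, here is a hedged sketch (my own simplified version, not torchvision's actual implementation) of a tensor-only `RandomHorizontalFlip` that scripts cleanly and could be saved as a standalone file:

```python
import torch
import torch.nn as nn


class RandomHorizontalFlip(nn.Module):
    """Simplified sketch of an nn.Module-based transform (not torchvision's code)."""

    def __init__(self, p: float = 0.5):
        super().__init__()
        self.p = p

    def forward(self, img: torch.Tensor) -> torch.Tensor:
        # torch.rand(1) is in [0, 1), so p=1.0 always flips.
        if torch.rand(1).item() < self.p:
            return img.flip([-1])  # flip along the width dimension
        return img


t = RandomHorizontalFlip(p=1.0)
scripted = torch.jit.script(t)  # works: t is an nn.Module instance

x = torch.arange(6.0).reshape(1, 2, 3)
assert torch.equal(scripted(x), x.flip([-1]))

# scripted.save("random_hflip.pt")  # would serialize the transform standalone
```

The commented-out `save` call is the "standalone file" benefit mentioned above: a `ScriptModule` can be serialized and later reloaded with `torch.jit.load` without the original Python source.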