Optim-wip: Add JIT support to all transforms & some image parameterizations #821


Merged

Conversation


@ProGamerGov ProGamerGov commented Dec 13, 2021

  • Added JIT support to all transforms.

  • Added JIT tests for all applicable transforms.

  • There's currently a bug that prevents CenterCrop from working with JIT, so I added the @torch.jit.ignore decorator to its forward function as a workaround (see the first sketch after this list). This way, torch.jit.script won't throw an error if someone uses it on the module. JIT is overriding variables of type List[int] to type Tuple[int, int] for an unknown reason during function calls pytorch#69938

  • JIT is also told to ignore NChannelsToRGB due to its unsupported inner functions (the same workaround as in the first sketch after this list).

  • ToRGB didn't support JIT due to its use of named dimensions, so I implemented a second forward function that is used only under JIT (see the second sketch after this list).

  • Added JIT support to NaturalImage, FFTImage, & PixelImage.

  • Improved NaturalImage & FFTImage tests, and test coverage.

  • Added ImageParameterization instance support to NaturalImage (see the example at the end of this description). This improvement should make it easier to use parameterization enhancements like SharedImage, and will be helpful for custom parameterizations that don't use the standard input variable set (size, channels, batch, & init).

  • Added asserts to verify NaturalImage parameterization inputs are instances or types of ImageParameterization.

  • Some transform type hints had to be changed due to a bug in JIT's Union support: JIT RuntimeError: 'Union[Tensor, List[float], List[int]]' object is not subscriptable pytorch#69434

  • Fixed bug with RandomCrop transform.

  • Replaced the default RandomScale transform with an interpolation-based variant, and renamed the old variant to RandomScaleAffine.

  • CenterCrop / center_crop now add padding if the crop size is larger than the input dimensions (see the shape demo at the end of this description).

  • Added distributions support to both versions of RandomScale; Ludwig wanted this based on his original PR.

  • Changed NumSeqOrTensorType hint to NumSeqOrTensorOrProbDistType.

  • Added more comprehensive testing to applicable transforms. Test coverage reported by pytest should now be 100%, minus the version-specific tests.

  • Torch version checks required by transform module forward functions are performed in their __init__ functions so that they work with JIT (see the third sketch after this list).

  • Added the TransformationRobustness() transform:

    import torch
    import captum.optim as opt  # assuming captum.optim is imported as `opt`

    pad_module = torch.nn.ConstantPad2d(2, value=0.5)
    transforms = opt.transforms.TransformationRobustness(padding_transform=pad_module)
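
For reference, here is a minimal sketch of the @torch.jit.ignore workaround from the CenterCrop bullet above. The module below is a simplified, hypothetical stand-in rather than the actual implementation:

    import torch
    from typing import List


    class CropStandIn(torch.nn.Module):
        # Hypothetical stand-in for CenterCrop, for illustration only.
        def __init__(self, size: List[int]) -> None:
            super().__init__()
            self.size = size

        @torch.jit.ignore
        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # The real crop logic triggers the List[int] -> Tuple[int, int]
            # JIT bug (pytorch#69938), so this method is left as eager Python.
            return x


    # Scripting no longer raises; the ignored forward simply runs eagerly.
    scripted = torch.jit.script(CropStandIn([224, 224]))

The ToRGB "second forward" pattern might look roughly like the following, assuming a torch.jit.is_scripting() dispatch. The color transform itself is omitted and the method names are illustrative:

    import torch


    class ToRGBStandIn(torch.nn.Module):
        # Hypothetical sketch; the real ToRGB applies a color transform.
        def _forward_jit(self, x: torch.Tensor) -> torch.Tensor:
            # JIT-friendly path: positional dimensions only.
            return x

        @torch.jit.ignore
        def _forward_named(self, x: torch.Tensor) -> torch.Tensor:
            # Eager path using named dimensions, which TorchScript rejects.
            x = x.refine_names("B", "C", "H", "W")
            return x.rename(None)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            if torch.jit.is_scripting():
                return self._forward_jit(x)
            return self._forward_named(x)

And the __init__-time version checks might look roughly like this; the version parsing below is an assumption, not the PR's exact code:

    import torch


    class VersionAwareTransform(torch.nn.Module):
        def __init__(self) -> None:
            super().__init__()
            # Resolve the torch version check eagerly, in Python, so the
            # scripted forward only ever sees a plain bool attribute.
            version = tuple(int(v) for v in torch.__version__.split(".")[:2])
            self._use_new_path = version >= (1, 8)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            if self._use_new_path:
                return x  # path using the newer op
            return x  # fallback path for older torch versions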

JIT support for the InceptionV1 model was added in #655.

This PR only changes or adds around 711 lines of code; the other 1,306 lines come from tests that contain a lot of redundant, simple code.
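
For illustration, the ImageParameterization instance support mentioned above enables call patterns roughly like the following. The import path and keyword names here are assumptions based on the optim-wip branch layout, not confirmed API:

    # Assumed module path; adjust to wherever the parameterizations live.
    from captum.optim._param.image.images import FFTImage, NaturalImage

    # Previous behavior: pass a parameterization class.
    image_a = NaturalImage(size=(224, 224), parameterization=FFTImage)

    # New in this PR: pass an already-constructed instance, which makes
    # wrappers like SharedImage and custom parameterizations easier to plug in.
    fft_param = FFTImage(size=(224, 224))
    image_b = NaturalImage(parameterization=fft_param)

And a quick shape demo of the new CenterCrop padding behavior; the size keyword is an assumption, and opt is the same assumed alias as in the snippet above:

    import torch

    import captum.optim as opt

    x = torch.ones(1, 3, 16, 16)
    crop = opt.transforms.CenterCrop(size=32)  # crop size larger than the input
    out = crop(x)
    print(out.shape)  # the input is padded out to torch.Size([1, 3, 32, 32])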

@ProGamerGov
Contributor Author

Looks like there was an error with the Insights install in the tests. Everything optim-related is working fine, however.

@ProGamerGov ProGamerGov force-pushed the optim-wip-jit-transforms-support branch 2 times, most recently from ab73da6 to afc3434 on December 17, 2021 15:54
* JIT support for `center_crop`.
* Improve some transform tests.
* Fix `RandomCrop` transform bug.
@ProGamerGov ProGamerGov force-pushed the optim-wip-jit-transforms-support branch from 4a1ec1c to 9952809 on December 20, 2021 20:44
* Replace the affine `RandomScale` with an interpolation-based variant. Renamed the old variant to `RandomScaleAffine`.
* `CenterCrop` & `center_crop` now use padding if the crop size is larger than the input dimensions.
* Add distributions support to both versions of `RandomScale`.
* Improve transform tests.
* Add `torch.distributions.distribution.Distribution` to `NumSeqOrTensorType` type hint.
* Added `TransformationRobustness()` transform.
* Fixed bug with `center_crop` padding code, and added related tests to `center_crop` & `CenterCrop`.
@ProGamerGov ProGamerGov changed the title Optim-wip: Add JIT support to most transforms Optim-wip: Add JIT support to all transforms & some image paramterizations Dec 27, 2021
@ProGamerGov ProGamerGov force-pushed the optim-wip-jit-transforms-support branch from 6d832db to d2583d4 on December 27, 2021 22:19
* Add JIT support to `NaturalImage`, `FFTImage`, & `PixelImage`.
* Added proper JIT support for `ToRGB`.
* Improved `NaturalImage` & `FFTImage` tests, and test coverage.
* Added `ImageParameterization` instance support for `NaturalImage`. This improvement should make it easier to use parameterization enhancements like SharedImage, and will be helpful for custom parameterizations that don't use the standard input variable set (size, channels, batch, & init).
* Added asserts to verify `NaturalImage` parameterization inputs are instances or types of `ImageParameterization`.
@ProGamerGov ProGamerGov changed the title Optim-wip: Add JIT support to all transforms & some image paramterizations Optim-wip: Add JIT support to all transforms & some image parameterizations Dec 31, 2021
This should make it easier to work with the ToRGB module, as many PyTorch functions still don't support named dimensions.
* The maximum of 4 channels isn't required, as we ignore all channels after the third.
The `linear` mode only supports 3D inputs, and `trilinear` only supports 5D inputs. RandomScale only uses 4D inputs, so only `nearest`, `bilinear`, `bicubic`, & `area` are supported.
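
The mode restriction can be checked directly against torch.nn.functional.interpolate:

    import torch
    import torch.nn.functional as F

    x = torch.rand(1, 3, 32, 32)  # RandomScale operates on 4D NCHW inputs

    # These modes all accept 4D inputs:
    for mode in ("nearest", "bilinear", "bicubic", "area"):
        F.interpolate(x, scale_factor=0.5, mode=mode)

    # "linear" (3D only) and "trilinear" (5D only) raise on 4D inputs:
    # F.interpolate(x, scale_factor=0.5, mode="linear")  # NotImplementedError
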
…mationRobustness`

* Change `RandomRotation` type hint from `NumSeqOrTensorType` to `NumSeqOrTensorOrProbDistType`.
* Uncomment `RandomRotation` from `TransformationRobustness` & tests.

ProGamerGov commented Feb 15, 2022

I've been using the TransformationRobustness transform in all of the newer notebooks due to its ease of use, and because it dramatically improves visualization quality.

The optimizing with transparency tutorial notebook requires this PR for the TransformationRobustness transform and the fixes to RandomCrop.

@NarineK NarineK merged commit cd672a0 into pytorch:optim-wip May 8, 2022