
add reference test for normalize_image_tensor #7119

Merged — 4 commits into pytorch:main on Jan 23, 2023

Conversation

@pmeier (Collaborator) commented on Jan 20, 2023

@NicolasHug (Member) left a comment

Thanks @pmeier, LGTM.
The ElasticTransform failure seems unrelated (I've seen it fail locally as well, it might be a bit flaky?).

Review comment on test/test_prototype_transforms_functional.py (outdated, resolved)
@pmeier (Collaborator, Author) commented on Jan 23, 2023

(I've seen it fail locally as well, it might be a bit flaky?)

Yeah, we should add some tolerances. What worries me a little is the really high difference, albeit only for a single pixel:

  FAILED test/test_prototype_transforms_consistency.py::test_call_consistency[ElasticTransform-60] - AssertionError: Tensor image consistency check failed with: 
  
  Tensor-likes are not close!
  
  Mismatched elements: 1 / 318828 (0.0%)
  Greatest absolute difference: 255 at index (2, 0, 159, 72) (up to 1 allowed)
  Greatest relative difference: 1.0 at index (2, 0, 159, 72) (up to 0.1 allowed)

I'll look into it.

@NicolasHug (Member)

I don't know if you're still working on these assert_close utils, but it'd be really nice if they could support something like

assert_close(atol, rtol, its_OK_if_that_many_entries_are_off=2)

The name is obviously terrible, but I hope it illustrates the intent: often tests fail because just one or a few pixels don't satisfy the condition, and that's OK.
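As a rough illustration of the idea (this is not an existing PyTorch or torchvision API — the helper name and signature below are made up), a wrapper along these lines could allow a bounded number of mismatched entries:

```python
import torch

def assert_close_with_budget(actual, expected, *, rtol=0.0, atol=1.0, max_mismatches=2):
    """Hypothetical sketch: pass if at most `max_mismatches` entries
    violate the usual atol/rtol closeness condition."""
    mismatches = ~torch.isclose(
        actual.float(), expected.float(), rtol=rtol, atol=atol
    )
    num_bad = int(mismatches.sum())
    if num_bad > max_mismatches:
        raise AssertionError(
            f"{num_bad} / {actual.numel()} entries exceed "
            f"atol={atol}, rtol={rtol} (allowed: {max_mismatches})"
        )
```

A real implementation would presumably also match assert_close's device/dtype handling and report the worst offending indices in its error message.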

@pmeier (Collaborator, Author) commented on Jan 23, 2023

often tests fail because just one or a few pixels don't satisfy the condition, and that's OK.

You are right, but such tolerances are very vision-centric. Although not publicly documented, we made it really easy to extend assert_close. For example, we have

class ImagePair(TensorLikePair):

that we add to assert_close so we can do assert_close(tensor_image, pil_image). Plus, it already has support for MAE comparison. That is not 100% what you want, but it goes in the same direction: when comparing by MAE, a single pixel in a reasonably sized image makes almost no difference.
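To make the MAE point concrete (an illustrative sketch with made-up sizes, not the actual ImagePair implementation): a single pixel that is off by the full 255 contributes almost nothing to the mean absolute error of a reasonably sized image:

```python
import torch

def assert_mae_close(actual, expected, *, mae_tolerance=1.0):
    # Illustrative only: compare by mean absolute error instead of
    # per-element closeness.
    mae = (actual.float() - expected.float()).abs().mean().item()
    if mae > mae_tolerance:
        raise AssertionError(f"MAE {mae:.5f} exceeds tolerance {mae_tolerance}")

# One pixel off by 255 in a 3x320x320 uint8 image:
a = torch.zeros(3, 320, 320, dtype=torch.uint8)
b = a.clone()
b[0, 0, 0] = 255
assert_mae_close(a, b)  # MAE = 255 / 307200 ≈ 0.00083, far below 1.0
```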

@pmeier pmeier merged commit c206a47 into pytorch:main Jan 23, 2023
@pmeier pmeier deleted the normalize-correctness branch January 23, 2023 11:21
@NicolasHug (Member)

I wouldn't say they're vision-centric; in my scikit-learn days I would have killed for such a feature in NumPy's testing utils. I feel it's relevant in any scenario where you want to compare an output tensor/array against a reference one.

Re MAE: the mean is sensitive to outliers, so even if the tensor has a lot of values the MAE can still end up quite different, and it's not as easy to hard-code an expected MAE threshold.
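A tiny numeric sketch of that outlier concern (made-up values): a handful of extreme mismatches can push the MAE past a hand-picked threshold even though the vast majority of entries agree exactly:

```python
import torch

a = torch.zeros(1000)
b = a.clone()
b[:5] = 1000.0                     # 5 outliers out of 1000 entries (99.5% match)
mae = (a - b).abs().mean().item()  # 5 * 1000 / 1000 = 5.0
# An MAE threshold of, say, 1.0 would reject this pair, even though
# only 0.5% of the entries differ at all.
```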

@pmeier (Collaborator, Author) commented on Jan 23, 2023

Ok, let me clarify:

fail with just one or a few pixels [emphasis mine]

If you are talking about general tensors/arrays, I agree that this is not vision-specific. I personally only ever encountered this issue with images, so I never thought of it as a more general problem. Maybe open an issue on PyTorch core? I'll discuss with the others whether this is something we want to support in a public API. I'm currently leaning towards no, since we already have a sh*t ton of parameters, and adding more that also tie into the actual comparison won't make it any easier.

If you just want this for torchvision, I'm happy to implement a version for us that has it.

facebook-github-bot pushed a commit that referenced this pull request Jan 24, 2023
Reviewed By: YosuaMichael

Differential Revision: D42706905

fbshipit-source-id: 0a3c5357d5e59454f265ec36fe52eae32da0b59f
3 participants