Revision of SimCLR transforms #857
Conversation
Concerning my PR, I'm not yet sure if it's up to the repo's standards. Therefore, I have the following questions before I can complete my work.
Thanks.
"""Transforms for SimCLR during the fine-tuning stage. | ||
|
||
Transform:: | ||
|
||
Resize(input_height + 10, interpolation=3) | ||
transforms.CenterCrop(input_height), | ||
transforms.ToTensor() | ||
|
||
Example:: | ||
|
||
from pl_bolts.models.self_supervised.simclr.transforms import SimCLREvalDataTransform | ||
|
||
transform = SimCLREvalDataTransform(input_height=32) | ||
x = sample() | ||
(_, _, xk) = transform(x) | ||
""" | ||
So, I'm not too versed in SimCLR, but either this docstring is wrong or the __call__ is wrong, since it definitely does not return a 3-tuple.
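For context, a minimal sketch of the mismatch being pointed out (the class and attribute names here are illustrative assumptions, not the repo's exact code): __call__ applies a single pipeline and returns one tensor, while the docstring example above unpacks a 3-tuple.

```python
from torchvision import transforms

class FinetuneTransformSketch:
    """Illustrative stand-in for the fine-tune transform under review."""

    def __init__(self, input_height: int = 32) -> None:
        self.transform = transforms.Compose([
            transforms.Resize(input_height + 10, interpolation=3),
            transforms.CenterCrop(input_height),
            transforms.ToTensor(),
        ])

    def __call__(self, sample):
        # Returns a single view, not the (_, _, xk) 3-tuple the docstring shows.
        return self.transform(sample)
```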
Thanks for spotting this ❤️
Hi, I'm sorry it took me so long. Overall it looks good, I'd just take a bit harder look at SimCLRFinetuneTransform. ⚡
```
@@ -129,8 +119,24 @@ def __init__(
        )


@under_review()
class SimCLRFinetuneTransform:
```
Can this somehow also be a subclass of SimCLRTrainTransform? It seems like quite a bit of code is duplicated.
Yes, will do.
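One possible shape of that refactor, as a hedged sketch (the online_transform attribute and the elided pipelines are assumptions, not the merged code):

```python
from torchvision import transforms

class SimCLRTrainTransform:
    def __init__(self, input_height: int = 224) -> None:
        # The heavy augmentation pipeline is elided; only the part the
        # subclass reuses is shown here.
        self.online_transform = transforms.Compose([
            transforms.Resize(input_height + 10, interpolation=3),
            transforms.CenterCrop(input_height),
            transforms.ToTensor(),
        ])

    def __call__(self, sample):
        # In the train/eval case this returns multiple views (elided).
        raise NotImplementedError

class SimCLRFinetuneTransform(SimCLRTrainTransform):
    def __call__(self, sample):
        # Fine-tuning needs only a single deterministic view, so the
        # subclass overrides __call__ and reuses the parent's pipeline.
        return self.online_transform(sample)
```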
That usually concerns whether the documentation is up to date with respect to your changes. Quite a bit of the documentation is generated from docstrings, so you should be okay on that front. However, you can take a look yourself locally by running the docs build.
This depends on the specs of your machine, I'm afraid. On my local machine (32 GB RAM, i7-11850H, T1200 4 GB) it takes about 20 min to run everything, and 2 of the tests fail because my GPU runs out of memory 😂 Try to run as much as possible, and if you're unable to test something locally, it's fine to use CI to warn you about any failing tests. Which reminds me, you had one test failure on the PR.
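As an illustration of the kind of unit test being added here, a sketch along these lines should run quickly on CPU (the 3-tuple return and the output shapes are assumptions based on the discussion above):

```python
import torch
from PIL import Image

from pl_bolts.models.self_supervised.simclr.transforms import SimCLRTrainDataTransform


def test_simclr_train_transform_returns_three_views():
    transform = SimCLRTrainDataTransform(input_height=32)
    sample = Image.new("RGB", (32, 32))
    xi, xj, x_online = transform(sample)
    for view in (xi, xj, x_online):
        assert isinstance(view, torch.Tensor)
        assert view.shape == (3, 32, 32)
```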
Hi, @ArnolFokam! Sorry for revoking "the decision is yours", but it seems that if we bump torchvision, we also need to bump torch, since those versions are very much interlinked. And we don't want to do that, mostly because of PL: we want to be able to support as many versions as PL supports. I took the liberty of reverting that and silencing the warning. If the CI passes, I will approve it and merge it ⚡
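On silencing the warning, a sketch of one way to do it (the exact warning category and message filtered in the PR are assumptions):

```python
import warnings

from torchvision import transforms

with warnings.catch_warnings():
    # Some torch/torchvision combinations warn about integer `interpolation`
    # arguments; the message pattern here is illustrative.
    warnings.filterwarnings("ignore", category=UserWarning, message=".*interpolation.*")
    resize = transforms.Resize(42, interpolation=3)
```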
Co-authored-by: otaj <ota@lightning.ai>
Co-authored-by: arnol <fokammanuel1@students.wits.ac.za>
What does this PR do?
Fixes part of #839
- pl_bolts.models.self_supervised.simclr.transforms.GaussianBlur (removed in favor of torchvision's implementation; see the sketch after this list)
- pl_bolts.models.self_supervised.simclr.transforms.SimCLREvalDataTransform (add unit test)
- pl_bolts.models.self_supervised.simclr.transforms.SimCLRFinetuneTransform (add unit test and docstring)
- pl_bolts.models.self_supervised.simclr.transforms.SimCLRTrainDataTransform (add unit test)
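On the GaussianBlur removal: torchvision has shipped a built-in transforms.GaussianBlur since 0.8, so the custom implementation can be dropped. A sketch of SimCLR-style usage (the kernel size heuristic and probability follow the SimCLR paper's conventions, not necessarily this PR's exact values):

```python
from torchvision import transforms

input_height = 224
# SimCLR uses a kernel of roughly 10% of the image size, forced to be odd.
kernel_size = int(0.1 * input_height) // 2 * 2 + 1
gaussian_blur = transforms.RandomApply(
    [transforms.GaussianBlur(kernel_size=kernel_size, sigma=(0.1, 2.0))],
    p=0.5,
)
```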
Before submitting
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding 🙃