
Update basic_ae_module.py #570

Closed
wants to merge 1 commit

Conversation

dpkpathak

`AE` class instantiation is necessary to load a pretrained model, since `from_pretrained` is not a static method.

What does this PR do?

It updates a docstring in `basic_ae_module.py` for the `AE` class, correcting the required way to instantiate a pretrained autoencoder for CIFAR-10.
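The distinction behind the fix: in pl_bolts, `AE.from_pretrained` is an instance method, so it must be called on an instantiated `AE` object rather than on the class itself. The toy `Model` class below is a hypothetical sketch (not the real `AE` API) illustrating why calling an instance method directly on the class fails:

```python
# Hypothetical toy class (NOT the real pl_bolts `AE`) showing why an
# instance method like `from_pretrained` cannot be called on the class.

class Model:
    def __init__(self):
        self.weights = None

    def from_pretrained(self, name):
        # Instance method: it needs `self`, so it must be called on an
        # instance of `Model`, not on `Model` directly.
        self.weights = f"loaded:{name}"
        return self

# Correct: instantiate first, then load the pretrained weights.
model = Model().from_pretrained("cifar10-resnet18")
print(model.weights)  # -> loaded:cifar10-resnet18

# Incorrect: calling on the class passes the string as `self` and leaves
# `name` unfilled, raising a TypeError.
try:
    Model.from_pretrained("cifar10-resnet18")
except TypeError as e:
    print("TypeError:", e)
```

With the real `AE`, the corrected docstring pattern is analogous: construct the model first, then call `from_pretrained` on that instance (the exact constructor arguments depend on the pl_bolts version in use).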

Before submitting

  • Was this discussed/approved via a GitHub issue? (no need for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests? [not needed for typos/docs]
  • Did you verify new and existing tests pass locally with your changes?
  • If you made a notable change (that affects users), did you update the CHANGELOG?

PR review

  • Is this pull request ready for review? (if not, please submit in draft mode)

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

@codecov

codecov bot commented Feb 28, 2021

Codecov Report

Merging #570 (b046c1e) into master (1bec122) will decrease coverage by 0.52%.
The diff coverage is n/a.

Impacted file tree graph

@@            Coverage Diff             @@
##           master     #570      +/-   ##
==========================================
- Coverage   78.03%   77.51%   -0.53%     
==========================================
  Files         115      115              
  Lines        6701     6701              
==========================================
- Hits         5229     5194      -35     
- Misses       1472     1507      +35     
Flag        Coverage Δ
cpu         ?
pytest      ?
unittests   77.51% <ø> (ø)

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
...ts/models/autoencoders/basic_ae/basic_ae_module.py 88.15% <ø> (ø)
pl_bolts/utils/warnings.py 38.46% <0.00%> (-61.54%) ⬇️
...s/models/detection/components/_supported_models.py 71.42% <0.00%> (-28.58%) ⬇️
pl_bolts/datasets/mnist_dataset.py 36.36% <0.00%> (-9.10%) ⬇️
pl_bolts/models/detection/faster_rcnn/backbones.py 84.61% <0.00%> (-7.70%) ⬇️
pl_bolts/transforms/dataset_normalizations.py 80.00% <0.00%> (-5.00%) ⬇️
pl_bolts/callbacks/vision/image_generation.py 87.87% <0.00%> (-3.04%) ⬇️
..._bolts/models/self_supervised/simclr/transforms.py 75.71% <0.00%> (-2.86%) ⬇️
pl_bolts/datasets/ssl_amdim_datasets.py 73.68% <0.00%> (-2.64%) ⬇️
pl_bolts/models/self_supervised/swav/transforms.py 61.03% <0.00%> (-2.60%) ⬇️
... and 12 more

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 1bec122...1332697. Read the comment docs.

@akihironitta akihironitta added documentation Improvements or additions to documentation duplicate This issue or pull request already exists labels Mar 4, 2021
@akihironitta
Contributor

@dpkpathak The doc is being fixed in #557, so let me close this PR. Thank you for your contribution!

Labels
documentation (Improvements or additions to documentation), duplicate (This issue or pull request already exists), model