
fix: Fixed import for torch 1.10 release #129

Merged: 8 commits into pyronear:master (Feb 13, 2022)

Conversation

MateoLostanlen
Member

Small PR to fix our unit tests.

The function load_state_dict_from_url is now part of torch.hub, so we need to update the import accordingly.

We must therefore set the minimal version of pytorch to 1.10.

@MateoLostanlen MateoLostanlen requested review from frgfm and a team October 25, 2021 20:45
@MateoLostanlen MateoLostanlen self-assigned this Oct 25, 2021
@MateoLostanlen MateoLostanlen added the type: bug Something isn't working label Oct 25, 2021
@MateoLostanlen
Member Author

MateoLostanlen commented Oct 25, 2021

I wonder if setting the minimal version of pytorch to the latest release is too restrictive. @frgfm, what do you think?

It seems you have the same issue in holocron :)

@frgfm
Member

frgfm commented Oct 26, 2021

> I wonder if setting the minimal version of pytorch to the latest release is too restrictive. @frgfm, what do you think?
>
> It seems you have the same issue in holocron :)

Yup, I agree. Either I update it or do a conditional import by checking the library version 👍

Considering 0.10.0 is brand new, we can either:

  • update it because there are features that we consider critical in this release
  • add an upper constraint (<0.10.0)
  • make the import conditional

I would go for the latter in this situation 👍
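The third option above (a conditional import) can be sketched as follows. This is an illustrative, stdlib-only sketch of the version-gating decision; the 0.10.0 threshold and the two module paths come from this discussion, not from a tested torchvision compatibility matrix, and the helper names are hypothetical:

```python
# Illustrative sketch of a version-gated import decision, using only the
# standard library. Assumption (from this thread): torchvision stopped
# mirroring load_state_dict_from_url around 0.10.0, so newer installs
# must import it from torch.hub instead.

def parse_version(version):
    # "0.10.0" -> (0, 10, 0); good enough for a simple tuple comparison
    return tuple(int(part) for part in version.split(".")[:3])

def import_path_for(torchvision_version):
    # Decide where load_state_dict_from_url should be imported from
    if parse_version(torchvision_version) >= (0, 10, 0):
        return "torch.hub"
    return "torchvision.models.utils"

print(import_path_for("0.9.1"))   # torchvision.models.utils
print(import_path_for("0.10.0"))  # torch.hub
```

In real code the same comparison would gate a try/except-free import of the right module, keeping a single name for the rest of the codebase.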

@MateoLostanlen
Member Author

It seems to be the best solution indeed, I opened a PR on holocron to solve the problem.

Member

@frgfm frgfm left a comment


Actually the function has been available in PyTorch since 1.1.0 (https://github.com/pytorch/pytorch/blob/v1.1.0/torch/hub.py#L385), and torchvision was only mirroring it up until the last release! So we only need:

from torch.hub import load_state_dict_from_url
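As a minimal sketch of that fix, guarded only so the snippet also runs in environments without PyTorch installed (the project itself would use the bare import above):

```python
import importlib.util

# Sketch of the suggested fix: import the helper from torch.hub, where it
# has lived since torch 1.1.0. The availability probe below is purely for
# illustration, so the snippet degrades gracefully without PyTorch.
if importlib.util.find_spec("torch") is not None:
    from torch.hub import load_state_dict_from_url
    print("load_state_dict_from_url imported from torch.hub")
else:
    print("torch not installed; a real project would fail at import time here")
```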

@MateoLostanlen
Member Author

MateoLostanlen commented Nov 1, 2021

Hi @frgfm, thanks for catching this!

frgfm
frgfm previously approved these changes Nov 1, 2021
Member

@frgfm frgfm left a comment


Thanks for the edits!

@frgfm
Member

frgfm commented Nov 1, 2021

About the unittest crash: I'll make a release of Holocron rather soon, and we'll need to update the version constraint ;)

@MateoLostanlen
Member Author

> About the unittest crash: I'll make a release of Holocron rather soon, and we'll need to update the version constraint ;)

OK, let me know when it's done. I'll wait to go further with this PR and set the holocron minimal requirement (0.1.4, I guess).

@frgfm
Member

frgfm commented Nov 14, 2021

> About the unittest crash: I'll make a release of Holocron rather soon, and we'll need to update the version constraint ;)
>
> OK, let me know when it's done. I'll wait to go further with this PR and set the holocron minimal requirement (0.1.4, I guess).

To avoid waiting for the release, I suggest switching all pylocron requirements (here and here) from

pylocron>=0.1.3

to

-e git+https://github.com/frgfm/Holocron.git#egg=pylocron

This will install the latest version from the git master branch instead of the release 👍

@MateoLostanlen
Member Author

Hi @frgfm, unfortunately we can't use -e git+https://github.com/frgfm/Holocron.git#egg=pylocron in setup.py. A simple solution while waiting for the new holocron release is to install the requirements first and then the package, as I do here.

The problem seems to be solved for the moment.

Only one test fails, and it's the same recurring problem with openfire, which has fewer and fewer images available. That problem will be solved after #121 is merged.

@MateoLostanlen MateoLostanlen requested a review from frgfm February 12, 2022 19:48
Member

@frgfm frgfm left a comment


Thanks! I added a few suggestions

Review comments (since resolved) on: .github/workflows/builds.yml, requirements.txt, setup.py
@MateoLostanlen
Member Author

Hi @frgfm, I think there was a misunderstanding. I'm proposing a temporary fix here while you prepare your release, because the issue currently blocks the PRs on vision and engine. This change installs holocron from the repo without waiting for the release, so you can take your time :)

@MateoLostanlen
Member Author

Ah OK, you released 0.2.0; I'll make the change then.

@MateoLostanlen MateoLostanlen requested a review from frgfm February 13, 2022 17:09
@MateoLostanlen
Member Author

all good :)

Member

@frgfm frgfm left a comment


All good!

@frgfm frgfm changed the title fix import for torch 1.10 release fix: Fixed import for torch 1.10 release Feb 13, 2022
@frgfm frgfm added the module: models Related to models label Feb 13, 2022
@frgfm frgfm added this to the 0.2.0 milestone Feb 13, 2022
@MateoLostanlen MateoLostanlen merged commit ef8f137 into pyronear:master Feb 13, 2022
@MateoLostanlen MateoLostanlen deleted the fix_for_pytorch_1.10 branch March 17, 2022 19:45