
examples/run_summarization_no_trainer: fixed incorrect param to hasattr #18720

Merged: 3 commits merged into huggingface:main from rahular:sum_checkpoint_fix on Aug 24, 2022

Conversation

@rahular (Contributor) commented Aug 22, 2022

What does this PR do?

Fixes a small bug in examples/run_summarization_no_trainer.py which resulted in the script not checkpointing models even when the correct argument was passed on the CLI.
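
For context, a hedged illustration of the failure mode the PR title describes. The exact pre-fix line isn't quoted in this thread, so the snippet below only demonstrates how passing the wrong object to hasattr makes the check silently fail; the Namespace value is made up for illustration:

    import argparse

    # Hypothetical illustration, not the actual diff: hasattr needs the parent
    # object plus the attribute name as a string. Checking the wrong object
    # returns False here, so the checkpointing branch would never be entered.
    args = argparse.Namespace(checkpointing_steps="500")

    print(hasattr(args, "checkpointing_steps"))                      # True: the Namespace has the attribute
    print(hasattr(args.checkpointing_steps, "checkpointing_steps"))  # False: this checks the string "500"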

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a Github issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

@sgugger @patil-suraj

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint.

@LysandreJik (Member)

Would you like to take a look at this, @muellerzr?

@muellerzr (Contributor)

@rahular this isn't quite the right solution. What needs to be checked here is whether the checkpointing_steps value passed was either "epoch" or a digit; as written, this ignores the case where "epoch" was actually passed. With that in mind, do you want to give it another go? I'll be able to give a more detailed review in a few hours if you're stuck 😄

@rahular (Contributor, Author) commented Aug 24, 2022

Hi @muellerzr, `checkpointing_steps = args.checkpointing_steps` should already take care of the `epoch` case, if I'm not mistaken. Here's the flow, annotated in comments:

    if hasattr(args, "checkpointing_steps"):                        # check if `args` has `checkpointing_steps`
        checkpointing_steps = args.checkpointing_steps              # stores whatever was passed as a local variable (including `epoch`)  
        if args.checkpointing_steps.isdigit():                      # check if the passed argument is a digit
            checkpointing_steps = int(args.checkpointing_steps)     # overwrite the local variable with the digit after casting
    else:                                                           # local variable is `None` if `args` does not have `checkpointing_steps`
        checkpointing_steps = None

Let me know what you think.

@muellerzr (Contributor) commented Aug 24, 2022

@rahular good point! I think in that case something like the following chain may be how we want to do this:

    checkpointing_steps = args.checkpointing_steps
    if checkpointing_steps is not None and checkpointing_steps.isdigit():
        checkpointing_steps = int(checkpointing_steps)

Since it's always present (it's part of the argument parser), what we really care about is whether it's None and, if not, whether it's a digit. Does this make sense to you as well?

(And the default is None anyways)
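
The rationale above can be checked with a small sketch using plain argparse (a declaration modelled on, but not copied from, the script's parser): once the option is declared, the attribute always exists on the parsed namespace, so the hasattr guard adds nothing.

    import argparse

    # Minimal sketch, assuming a declaration similar to the no_trainer scripts.
    parser = argparse.ArgumentParser()
    parser.add_argument("--checkpointing_steps", type=str, default=None)
    args = parser.parse_args([])  # nothing passed on the CLI

    print(hasattr(args, "checkpointing_steps"))  # True: argparse always sets the attribute
    print(args.checkpointing_steps)              # None (the default), so only the None/isdigit checks matter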

@rahular (Contributor, Author) commented Aug 24, 2022

@muellerzr makes sense! I've simplified it and pushed.
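
For readers outside the diff, a hedged sketch of how the parsed value is typically consumed later in the no_trainer training loop: an integer triggers a checkpoint every N optimizer steps, while the string "epoch" triggers one at the end of each epoch. The helper name and signature below are assumptions for illustration, not code from this PR:

    import os
    from accelerate import Accelerator

    def maybe_save_checkpoint(accelerator: Accelerator, checkpointing_steps, completed_steps: int, output_dir=None):
        # Hypothetical helper: save a resumable checkpoint every N optimizer steps.
        # The "epoch" case is handled separately at the end of each training epoch.
        if isinstance(checkpointing_steps, int) and completed_steps % checkpointing_steps == 0:
            save_dir = f"step_{completed_steps}"
            if output_dir is not None:
                save_dir = os.path.join(output_dir, save_dir)
            accelerator.save_state(save_dir)  # Accelerate writes model/optimizer/scheduler state here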

@muellerzr self-requested a review, August 24, 2022 15:29

@muellerzr (Contributor) left a review comment

Great job!

Likely the rest of the no_trainer scripts will need this update as well. Would you like to tackle this? :) (This can be done in a separate PR)

@muellerzr (Contributor)

Also make sure to run `make style; make quality` from the base directory of your fork, to fix the style failure.

@rahular (Contributor, Author) commented Aug 24, 2022

@muellerzr Sure, I can take a look at the other no_trainer scripts as well!

@rahular (Contributor, Author) commented Aug 24, 2022

@muellerzr let's close this PR; I'll update the other scripts and open a new PR for them.

@muellerzr merged commit c55d6e4 into huggingface:main, Aug 24, 2022
@rahular deleted the sum_checkpoint_fix branch, August 24, 2022 21:19
oneraghavan pushed a commit to oneraghavan/transformers that referenced this pull request on Sep 26, 2022:
examples/run_summarization_no_trainer: fixed incorrect param to hasattr (huggingface#18720)

* fixed incorrect param to hasattr

* simplified condition checks

* code cleanup