Allow train.py-like config for eval.py #1351

Merged: 14 commits merged into mosaicml:main on Jul 23, 2024

Conversation

josejg
Contributor

@josejg commented Jul 12, 2024

Currently, train.py requires a config like:

model:
  <model_kwargs>
load_path: foobar
tokenizer:
  <tokenizer_kwargs>

whereas eval.py requires:

models:
  - model:
      <model_kwargs>
    tokenizer:
      <tokenizer_kwargs>
    load_path: foobar
    model_name: my_model

This PR allows the user to run eval.py using the train.py syntax, which is easier when editing YAML files directly.

I tried to make the implementation fully backwards compatible and to cover all edge cases (both keys specified, missing top-level keys like tokenizer, etc.).

EDIT: Run debug-eval-llama-3x-8b-g16-d10-B8JDJd completed successfully with a config that used model and tokenizer directly.
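
For illustration, a minimal sketch of the kind of conversion this PR enables. This is not the merged implementation; the function name, the 'default' model_name fallback, and the error handling for mixed syntaxes are assumptions made for this example.

def allow_train_style_config(cfg: dict) -> dict:
    # Hypothetical sketch, not the merged code: wrap a train.py-style config
    # (top-level model/tokenizer/load_path) into the models list eval.py expects.
    if 'model' not in cfg:
        # Already eval.py-style (or neither); leave it untouched.
        return cfg
    if 'models' in cfg:
        # Assumed behavior when both syntaxes are mixed.
        raise ValueError('Specify either `model` or `models`, not both.')
    entry = {'model': cfg.pop('model')}
    # Optional top-level keys move into the single model entry.
    for key in ('tokenizer', 'load_path'):
        if key in cfg:
            entry[key] = cfg.pop(key)
    # eval.py expects a model_name per entry; 'default' is an assumed fallback.
    entry.setdefault('model_name', 'default')
    cfg['models'] = [entry]
    return cfg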

@josejg requested a review from a team as a code owner July 12, 2024 00:56
@irenedea
Contributor

This looks reasonable to me. I want to see if Milo has any thoughts on how to make this nicer with the config_utils.py he added. (Pinged Milo because I can't add him as a reviewer for some reason.)

Contributor

@milocress left a comment

This looks good. I like that we don't have to modify the EvalConfig dataclass and that the transformation happens at a lower level. One question: should the logged config keep the original syntax of the file or not?

I lean towards yes, in which case you may want to package your transformation into a function that is passed to make_dataclass_and_log_config and applied only to the eval config, not to the logged config.

I think it's fairly important that the config that gets logged is exactly equal to the config specified in the file.

You can pass a function to make_dataclass_and_log_config, as specified here, which does your config transformation.

Here is a good example of a config transform; I think this PR can be structured in the same way.
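
As a rough sketch of how such a transform might be wired in at the eval.py entry point (the `transforms` parameter name, EVAL_CONFIG_KEYS, and the return order are assumptions about config_utils.py, not verified against the repo):

from llmfoundry.utils.config_utils import (
    EVAL_CONFIG_KEYS,
    EvalConfig,
    make_dataclass_and_log_config,
)

# Hedged sketch: the transform is applied when building EvalConfig, while the
# logged config keeps the file's original syntax. Names here are assumptions.
logged_cfg, eval_config = make_dataclass_and_log_config(
    cfg,  # the raw config loaded from the YAML file
    EvalConfig,
    EVAL_CONFIG_KEYS,
    transforms=[allow_train_style_config],
)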

Collaborator

@dakinggg left a comment

LGTM, could go either way on Milo's suggestion.

@josejg
Contributor Author

josejg commented Jul 17, 2024

I like @milocress's suggestion and will try to rework this as a config transform. A couple of clarifications:

  • Should I add it to the transform registry?
  • Would the logic be something like "if model is in the config, pass the transform to the make_dataclass_and_log_config function"?

@dakinggg
Collaborator

dakinggg commented Jul 17, 2024

The existing config transforms were added for train (and will all be applied for train), so I'd probably just make a function for now and not register it. We can revisit later if there is reason to.

And I'd suggest not conditionally applying the transform, but rather having the transform itself handle the "if model in config" check.
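
To make that concrete: applied unconditionally, a transform like the sketch in the PR description above wraps a train.py-style config and leaves an eval.py-style one alone. The model kwargs and the 'default' model_name below are placeholders.

# Hypothetical check using the allow_train_style_config sketch from above.
train_style = {'model': {'name': 'mpt_causal_lm'}, 'load_path': 'foobar'}
eval_style = {
    'models': [{
        'model': {'name': 'mpt_causal_lm'},
        'load_path': 'foobar',
        'model_name': 'default',
    }],
}

assert allow_train_style_config(dict(train_style)) == eval_style  # gets wrapped
assert allow_train_style_config(dict(eval_style)) == eval_style   # no-op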

Collaborator

@dakinggg left a comment

LGTM, please add a manual test run to the PR description showing that the functionality works.

scripts/eval/eval.py (review comment, outdated, resolved)
@josejg
Contributor Author

josejg commented Jul 17, 2024

Updated the PR with a run that completed successfully using this transformation.

@josejg enabled auto-merge (squash) July 22, 2024 23:42
@josejg merged commit eb41a6e into mosaicml:main Jul 23, 2024
9 checks passed
@josejg mentioned this pull request Jul 29, 2024