
Transformer divisibility error validation #3105

Merged
jeffkinnison merged 6 commits into master from transformer-divisibility-error on Feb 21, 2023

Conversation

jeffkinnison (Contributor)

Follow-up to #3066 that adds a validation check that encoder.hidden_size % encoder.num_heads == 0 for transformer encoders.
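The constraint this PR validates comes from how multi-head attention splits the hidden dimension across heads: each head gets hidden_size // num_heads dimensions, which only works when the division is exact. A minimal sketch of the check (the `is_divisible` helper here is a stand-in for the one used in the PR; its real signature may differ):

```python
# Hypothetical sketch of the divisibility constraint enforced by this PR.
# `is_divisible` is a stand-in for the project's helper of the same name.

def is_divisible(dividend: int, divisor: int) -> bool:
    """Return True if dividend is evenly divisible by divisor."""
    return dividend % divisor == 0

# A transformer encoder splits hidden_size across num_heads, so each head
# receives hidden_size // num_heads dimensions; an inexact split is invalid.
assert is_divisible(256, 8)       # 32 dims per head: valid
assert not is_divisible(256, 6)   # 256 / 6 is not an integer: invalid
```

Catching this at config-validation time surfaces the error before training starts, rather than as an opaque shape error deep inside the attention layer.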

@jeffkinnison jeffkinnison self-assigned this Feb 15, 2023
github-actions bot commented Feb 16, 2023

Unit Test Results

6 files ±0 · 6 suites ±0 · 5h 55m 39s ⏱️ (−5m 10s)
3,962 tests +6 · 3,925 ✔️ +6 · 37 💤 ±0 · 0 ±0
11,883 runs +59 · 11,772 ✔️ +58 · 111 💤 +1 · 0 ±0

Results for commit 8994163. ± Comparison against base commit c638c23.

♻️ This comment has been updated with latest results.

Comment on lines 353 to 360
```python
sequence_types = [SEQUENCE, TEXT, TIMESERIES]
for sequence_type in sequence_types:
    encoder = config.defaults.__getattribute__(sequence_type).encoder
    if encoder.type == "transformer" and not is_divisible(encoder.hidden_size, encoder.num_heads):
        raise ConfigValidationError(
            f"Default {sequence_type} transformer encoder requires encoder.hidden_size to be divisible by "
            f"encoder.num_heads. Found hidden_size {encoder.hidden_size} and num_heads {encoder.num_heads}."
        )
```
Contributor:

I think we might be able to get away without checking config.defaults as all of these auxiliary checks are run after defaults have been resolved into feature configs. WDYT?

Contributor Author:

Oh, cool! I'll remove the defaults block.

```python
}

with pytest.raises(ConfigValidationError):
    validate_config(config)
```
Contributor:

Use ModelConfig.from_dict(config) instead of validate_config, which I'm planning on removing in #3104

Contributor Author:

So all of the tests fail when I use ModelConfig.from_dict. Do we need to wait for the PR to land before updating?

Contributor:

@jeffkinnison you're totally right! Let me try to land that PR first.
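The pattern suggested above, with the config built through ModelConfig.from_dict so validation runs at construction time, can be sketched self-contained as follows. ConfigValidationError and ModelConfig here are minimal stand-ins for the real classes, and the config dict is illustrative, not taken from the PR's actual test:

```python
# Hypothetical, self-contained sketch of the suggested test pattern.
# The real ModelConfig.from_dict and ConfigValidationError live in the
# project's config modules; these stubs only mimic the divisibility check.
import pytest


class ConfigValidationError(Exception):
    pass


class ModelConfig:
    @classmethod
    def from_dict(cls, config: dict) -> "ModelConfig":
        # Stand-in constructor: run the transformer divisibility check.
        for feature in config.get("input_features", []):
            encoder = feature.get("encoder", {})
            if encoder.get("type") == "transformer":
                hidden, heads = encoder["hidden_size"], encoder["num_heads"]
                if hidden % heads != 0:
                    raise ConfigValidationError(
                        f"hidden_size {hidden} is not divisible by num_heads {heads}."
                    )
        return cls()


config = {
    "input_features": [
        {
            "name": "utterance",
            "type": "text",
            "encoder": {"type": "transformer", "hidden_size": 256, "num_heads": 9},
        }
    ],
    "output_features": [{"name": "label", "type": "category"}],
}

# 256 % 9 != 0, so construction should raise the validation error.
with pytest.raises(ConfigValidationError):
    ModelConfig.from_dict(config)
```

Moving the check into from_dict means every caller that constructs a config gets the error for free, instead of only callers who remember to invoke a separate validate_config step.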

@jeffkinnison jeffkinnison force-pushed the transformer-divisibility-error branch from 81d0be2 to e325b86 Compare February 17, 2023 23:01
@jeffkinnison jeffkinnison merged commit 6fd8cb0 into master Feb 21, 2023
@jeffkinnison jeffkinnison deleted the transformer-divisibility-error branch February 21, 2023 14:51