Make normalization capitalization insensitive.
akoumpa committed Dec 20, 2023
1 parent c1b0eb0 commit 4a4e725
Showing 1 changed file with 1 addition and 1 deletion.
```diff
@@ -1518,7 +1518,7 @@ def build_transformer_config(self) -> TransformerConfig:
     gated_linear_unit = activation.endswith('glu')
     activation_func = activation_to_func(activation)

-    normalization = self.cfg.get('normalization', 'layernorm')
+    normalization = self.cfg.get('normalization', 'layernorm').lower()
     layernorm_zero_centered_gamma = self.cfg.get('normalization', 'layernorm') == 'layernorm1p'
     if normalization == 'layernorm':
         normalization = 'LayerNorm'
```
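A minimal sketch of what the one-character change buys: lowering the configured value before comparing means spellings like `'LayerNorm'` or `'LAYERNORM'` now hit the `'layernorm'` branch instead of silently falling through. The standalone `resolve_normalization` helper and its plain `dict` config are hypothetical stand-ins for `self.cfg` in the real method, and the `rmsnorm` branch is an assumed illustration not shown in the diff.

```python
# Hypothetical stand-in for the config handling in build_transformer_config.
def resolve_normalization(cfg: dict) -> str:
    # .lower() is the fix from this commit: without it, a config value of
    # 'LayerNorm' would not match either lowercase branch below and the
    # raw string would be passed through unchanged.
    normalization = cfg.get('normalization', 'layernorm').lower()
    if normalization == 'layernorm':
        normalization = 'LayerNorm'
    elif normalization == 'rmsnorm':  # assumed extra branch for illustration
        normalization = 'RMSNorm'
    return normalization

# Any capitalization of the config value now resolves the same way.
print(resolve_normalization({'normalization': 'LayerNorm'}))  # LayerNorm
print(resolve_normalization({'normalization': 'LAYERNORM'}))  # LayerNorm
print(resolve_normalization({}))                              # LayerNorm (default)
```

Note that the diff lowers only the value used for the branch dispatch; the `layernorm_zero_centered_gamma` comparison on the following line still reads the raw config value, so a capitalized `'LayerNorm1P'` would not be caught by that check.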
