Commit d5817c0

bogdanminko and gemini-code-assist[bot] authored and committed
[Bugfix] Correct LayerNorm epsilon parameter in modernbert.py (vllm-project#27008)
Signed-off-by: bogdanm <152898065+bogdan01m@users.noreply.github.com> Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com> Signed-off-by: xuebwang-amd <xuebwang@amd.com>
1 parent 532a507 commit d5817c0

File tree

1 file changed (+5 −2 lines)


vllm/model_executor/models/modernbert.py

Lines changed: 5 additions & 2 deletions

```diff
@@ -39,9 +39,12 @@ def __init__(self, config: ModernBertConfig):
         self.tok_embeddings = VocabParallelEmbedding(
             config.vocab_size, config.hidden_size
         )
-        self.norm = nn.LayerNorm(
-            config.hidden_size, eps=config.layer_norm_eps, bias=config.norm_bias
+        eps = (
+            getattr(config, "norm_eps", None)
+            or getattr(config, "layer_norm_eps", None)
+            or 1e-5
         )
+        self.norm = nn.LayerNorm(config.hidden_size, eps=eps, bias=config.norm_bias)

     def get_input_embeddings(self, input_ids: torch.Tensor) -> torch.Tensor:
         return self.tok_embeddings(input_ids)
```
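The fix above chains two `getattr` lookups with `or`, so configs exposing `norm_eps` take precedence over those exposing `layer_norm_eps`, with `1e-5` as a last resort. A minimal sketch of that resolution logic in isolation (the helper name and the `SimpleNamespace` stand-ins for `ModernBertConfig` are hypothetical, not part of the patch):

```python
from types import SimpleNamespace


def resolve_layer_norm_eps(config) -> float:
    # Same fallback chain as the patch: prefer `norm_eps`, then
    # `layer_norm_eps`, then a conventional default of 1e-5.
    return (
        getattr(config, "norm_eps", None)
        or getattr(config, "layer_norm_eps", None)
        or 1e-5
    )


# Hypothetical configs exercising each branch of the chain.
print(resolve_layer_norm_eps(SimpleNamespace(norm_eps=1e-6)))         # 1e-06
print(resolve_layer_norm_eps(SimpleNamespace(layer_norm_eps=1e-12)))  # 1e-12
print(resolve_layer_norm_eps(SimpleNamespace()))                      # 1e-05
```

One caveat of using `or` rather than explicit `is None` checks: a config that set an epsilon attribute to `0.0` (or `None`) would be treated as falsy and skipped, which is presumably acceptable here since an epsilon of zero defeats its numerical-stability purpose.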

0 commit comments
