Commit 0380da9

vasqu authored and DarkLight1337 committed

[Ernie 4.5] Name Change for Base 0.3B Model (vllm-project#21735)

Signed-off-by: vasqu <antonprogamer@gmail.com>
Signed-off-by: DarkLight1337 <tlleungac@connect.ust.hk>
Co-authored-by: DarkLight1337 <tlleungac@connect.ust.hk>

1 parent 8090598, commit 0380da9

File tree: 4 files changed (+8, -8 lines)


docs/models/supported_models.md (5 additions, 5 deletions)

@@ -336,7 +336,7 @@ th {
 | `DeepseekV2ForCausalLM` | DeepSeek-V2 | `deepseek-ai/DeepSeek-V2`, `deepseek-ai/DeepSeek-V2-Chat`, etc. | | ✅︎ | ✅︎ |
 | `DeepseekV3ForCausalLM` | DeepSeek-V3 | `deepseek-ai/DeepSeek-V3-Base`, `deepseek-ai/DeepSeek-V3`, etc. | | ✅︎ | ✅︎ |
 | `Dots1ForCausalLM` | dots.llm1 | `rednote-hilab/dots.llm1.base`, `rednote-hilab/dots.llm1.inst`, etc. | | ✅︎ | ✅︎ |
-| `Ernie4_5_ForCausalLM` | Ernie4.5 | `baidu/ERNIE-4.5-0.3B-PT`, etc. | ✅︎ | ✅︎ | ✅︎ |
+| `Ernie4_5ForCausalLM` | Ernie4.5 | `baidu/ERNIE-4.5-0.3B-PT`, etc. | ✅︎ | ✅︎ | ✅︎ |
 | `Ernie4_5_MoeForCausalLM` | Ernie4.5MoE | `baidu/ERNIE-4.5-21B-A3B-PT`, `baidu/ERNIE-4.5-300B-A47B-PT`, etc. |✅︎| ✅︎ | ✅︎ |
 | `ExaoneForCausalLM` | EXAONE-3 | `LGAI-EXAONE/EXAONE-3.0-7.8B-Instruct`, etc. | ✅︎ | ✅︎ | ✅︎ |
 | `Exaone4ForCausalLM` | EXAONE-4 | `LGAI-EXAONE/EXAONE-4.0-32B`, etc. | ✅︎ | ✅︎ | ✅︎ |
@@ -634,10 +634,10 @@ Some models are supported only via the [Transformers backend](#transformers). Th
 |--------------|--------|--------|-------------------|-----------------------------|-----------------------------------------|---------------------|
 | `Emu3ForConditionalGeneration` | Emu3 | T + I | `BAAI/Emu3-Chat-hf` | ✅︎ | ✅︎ | ✅︎ |
 
-<sup>^</sup> You need to set the architecture name via `--hf-overrides` to match the one in vLLM.
-&nbsp;&nbsp;&nbsp;&nbsp;• For example, to use DeepSeek-VL2 series models:
-&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;`--hf-overrides '{"architectures": ["DeepseekVLV2ForCausalLM"]}'`
-<sup>E</sup> Pre-computed embeddings can be inputted for this modality.
+<sup>^</sup> You need to set the architecture name via `--hf-overrides` to match the one in vLLM.
+&nbsp;&nbsp;&nbsp;&nbsp;• For example, to use DeepSeek-VL2 series models:
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;`--hf-overrides '{"architectures": ["DeepseekVLV2ForCausalLM"]}'`
+<sup>E</sup> Pre-computed embeddings can be inputted for this modality.
 <sup>+</sup> Multiple items can be inputted per text prompt for this modality.
 
 !!! warning
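The `--hf-overrides` footnote in the hunk above passes a JSON object that patches fields of the Hugging Face model config, e.g. to substitute an `architectures` entry vLLM recognizes. A minimal sketch of that idea, assuming simple shallow-merge semantics; `apply_hf_overrides` is a hypothetical stand-in for illustration, not vLLM's actual implementation:

```python
import json


def apply_hf_overrides(config: dict, overrides_json: str) -> dict:
    """Shallow-merge a JSON overrides object into an HF-style config dict.

    Hypothetical simplification of what an --hf-overrides flag could do:
    top-level keys in the JSON replace the corresponding config fields.
    """
    merged = dict(config)
    merged.update(json.loads(overrides_json))
    return merged


# Example: swap the architecture name while leaving other fields untouched.
config = {"architectures": ["SomeOtherForCausalLM"], "hidden_size": 1024}
patched = apply_hf_overrides(
    config, '{"architectures": ["DeepseekVLV2ForCausalLM"]}'
)
```

After the call, `patched["architectures"]` holds the overridden name and the remaining fields are unchanged.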

tests/models/registry.py (1 addition, 1 deletion)

@@ -166,7 +166,7 @@ def check_available_online(
                                              trust_remote_code=True),
     "DeepseekV3ForCausalLM": _HfExamplesInfo("deepseek-ai/DeepSeek-V3", # noqa: E501
                                              trust_remote_code=True),
-    "Ernie4_5_ForCausalLM": _HfExamplesInfo("baidu/ERNIE-4.5-0.3B-PT",
+    "Ernie4_5ForCausalLM": _HfExamplesInfo("baidu/ERNIE-4.5-0.3B-PT",
                                             min_transformers_version="4.54"),
     "Ernie4_5_MoeForCausalLM": _HfExamplesInfo("baidu/ERNIE-4.5-21B-A3B-PT",
                                                min_transformers_version="4.54"),

vllm/model_executor/models/ernie45.py (1 addition, 1 deletion)

@@ -28,7 +28,7 @@
 from .utils import PPMissingLayer
 
 
-class Ernie4_5_ForCausalLM(LlamaForCausalLM):
+class Ernie4_5ForCausalLM(LlamaForCausalLM):
 
     def __init__(self, *, vllm_config: VllmConfig, prefix: str = ""):
         super().__init__(vllm_config=vllm_config, prefix=prefix)

vllm/model_executor/models/registry.py (1 addition, 1 deletion)

@@ -60,7 +60,7 @@
     "DeepseekV2ForCausalLM": ("deepseek_v2", "DeepseekV2ForCausalLM"),
     "DeepseekV3ForCausalLM": ("deepseek_v2", "DeepseekV3ForCausalLM"),
     "Dots1ForCausalLM": ("dots1", "Dots1ForCausalLM"),
-    "Ernie4_5_ForCausalLM": ("ernie45", "Ernie4_5_ForCausalLM"),
+    "Ernie4_5ForCausalLM": ("ernie45", "Ernie4_5ForCausalLM"),
     "Ernie4_5_MoeForCausalLM": ("ernie45_moe", "Ernie4_5_MoeForCausalLM"),
     "ExaoneForCausalLM": ("exaone", "ExaoneForCausalLM"),
     "Exaone4ForCausalLM": ("exaone4", "Exaone4ForCausalLM"),
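The registry hunk above shows why the rename must land in the registry, the test table, and the docs together: the mapping is keyed by the exact architecture string from the model's config, so `Ernie4_5_ForCausalLM` and `Ernie4_5ForCausalLM` are two unrelated keys. A hypothetical miniature of such a string-keyed, lazily-resolved registry (the stdlib targets are illustrative stand-ins, not vLLM's actual model modules):

```python
import importlib
from typing import Any

# Hypothetical registry: architecture name -> (module path, class name).
# Resolution is lazy, so a module is only imported when its key is requested.
_MODELS: dict[str, tuple[str, str]] = {
    "OrderedDictModel": ("collections", "OrderedDict"),
    "CounterModel": ("collections", "Counter"),
}


def resolve_architecture(arch: str) -> Any:
    """Import and return the class registered under `arch`.

    An unknown key fails outright: there is no fuzzy matching, which is
    why renaming an architecture string is a breaking change everywhere
    the old key appears.
    """
    try:
        module_name, class_name = _MODELS[arch]
    except KeyError:
        raise ValueError(f"unsupported architecture: {arch!r}") from None
    return getattr(importlib.import_module(module_name), class_name)
```

Looking up a registered key returns the class; looking up a stale key such as the pre-rename string raises immediately.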
