Actions: triple-Mu/vllm_official

pre-commit

8 workflow runs

pre-commit #8: [Bugfix] Fix 'ModuleNotFoundError: No module named 'intel_extension_f…
    Commit 022bcc7 pushed by triple-Mu to main, February 5, 2025 12:56, 4m 32s

pre-commit #7: [Doc] Replace ibm-fms with ibm-ai-platform (#12709)
    Commit bb392af pushed by triple-Mu to main, February 4, 2025 07:07, 4m 31s

pre-commit #6: Support Pixtral-Large HF by using llava multimodal_projector_bias con…
    Commit 5d98d56 pushed by triple-Mu to main, February 4, 2025 04:37, 4m 34s

pre-commit #5: Fix for attention layers to remain unquantized during moe_wn16 quant …
    Commit b998645 pushed by triple-Mu to main, February 3, 2025 06:18, 4m 33s

pre-commit #4: [doc][misc] clarify VLLM_HOST_IP for multi-node inference (#12667)
    Commit e643309 pushed by triple-Mu to main, February 3, 2025 02:00, 5m 26s

pre-commit #3: [Core] Make raw_request optional in ServingCompletion (#12503)
    Commit 2079e43 pushed by triple-Mu to main, January 28, 2025 14:46, 5m 23s

pre-commit #2: [ci/build] sync default value for wheel size (#12398)
    Commit e784c6b pushed by triple-Mu to main, January 24, 2025 14:52, 4m 30s

pre-commit #1: [Doc] Troubleshooting errors during model inspection (#12351)
    Commit d07efb3 pushed by triple-Mu to main, January 23, 2025 15:05, 5m 26s