
Conversation

@MengqingCao MengqingCao commented Apr 11, 2025

What this PR does / why we need it?

add hf-token for llama model download in #487
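
As a sketch, the change amounts to exporting the repository secret into the CI test step's environment, so that gated checkpoints such as Llama can be downloaded (assuming the secret is stored under the name `HF_TOKEN`, which the Hugging Face tooling reads from the environment; the `run` command here is a hypothetical placeholder, not taken from this PR):

```yaml
# Hypothetical GitHub Actions fragment: expose the Hugging Face token
# to the test step so gated models (e.g. Llama) can be fetched.
- name: Run vllm-project/vllm-ascend key feature test
  env:
    HF_TOKEN: ${{ secrets.HF_TOKEN }}  # repository secret, not the raw token
  run: |
    pytest tests/  # placeholder test command
```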

@MengqingCao
Collaborator Author

We'll test hf-token in branch v0.7.3-dev first.
If it doesn't break CI, we'll cherry-pick this PR to main.

@MengqingCao MengqingCao force-pushed the token branch 5 times, most recently from 41997d4 to faa04ce Compare April 14, 2025 11:04
@MengqingCao MengqingCao reopened this Apr 14, 2025
@MengqingCao MengqingCao force-pushed the token branch 5 times, most recently from d7e812e to 78efa5d Compare April 16, 2025 02:31
@wangxiyuan wangxiyuan changed the title [CI] add hf-token for llama model download [0.7.3][CI] add hf-token for llama model download Apr 18, 2025
Signed-off-by: MengqingCao <cmq0113@163.com>
- name: Run vllm-project/vllm-ascend key feature test
  env:
    HF_TOKEN: ${{ secrets.HF_TOKEN }}
Collaborator


HF_TOKEN is added at L51 globally. Do we still need this change?

Collaborator Author


No need here, I'll close this PR, thx!

@MengqingCao MengqingCao deleted the token branch July 8, 2025 02:27