Conversation

@chyundunovDatamonsters
Contributor

Description

Adding a ROCm-based vLLM deployment capability and adapting the DocSum deployment to use vLLM on ROCm.
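The deployment files themselves are not shown in this thread. As a rough illustration only, a vLLM service running on AMD ROCm in a Docker Compose deployment typically needs the ROCm device nodes passed through to the container; the image tag, model, port, and service name below are illustrative assumptions, not values taken from this PR.

```yaml
# Hypothetical sketch of a vLLM ROCm service entry for a DocSum-style
# Compose deployment. Image, model, and port are assumed for illustration.
services:
  docsum-vllm-service:
    image: opea/vllm-rocm:latest      # assumed tag; the PR's actual image may differ
    ports:
      - "8008:8000"
    devices:
      - /dev/kfd:/dev/kfd             # ROCm compute (KFD) driver node
      - /dev/dri:/dev/dri             # GPU render nodes
    group_add:
      - video                         # group with access to the GPU devices
    security_opt:
      - seccomp:unconfined
    environment:
      HUGGINGFACEHUB_API_TOKEN: ${HUGGINGFACEHUB_API_TOKEN}
    command: --model Intel/neural-chat-7b-v3-3 --port 8000
```

The `devices` and `group_add` entries are the standard way to expose AMD GPUs to a container per ROCm's container documentation; a Helm-based deployment (see the values-files discussion below) would carry the equivalent settings in its values files.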

Issues

Type of change

List the type of change below. Please delete options that are not relevant.

  • [x] New feature (non-breaking change which adds new functionality)

Dependencies

Tests

ezelanza and others added 2 commits April 5, 2025 00:41
Typo fix

Signed-off-by: Ezequiel Lanza <ezequiel.lanza@gmail.com>
Signed-off-by: Chingis Yundunov <YundunovCN@sibedge.com>
…oyment using vLLM ROCm

Signed-off-by: Chingis Yundunov <YundunovCN@sibedge.com>
@lianhao
Collaborator

lianhao commented Apr 7, 2025

@chyundunovDatamonsters again, please also update the values files.

@lianhao
Collaborator

lianhao commented Apr 7, 2025

@chensuyue @yongfengdu we need CI setup for AMD ROCm here too

@chensuyue
Collaborator

> @chensuyue @yongfengdu we need CI setup for AMD ROCm here too

Sure, I will follow this.

@yongfengdu yongfengdu merged commit 46e7a57 into opea-project:main Apr 24, 2025
45 of 49 checks passed
@lianhao lianhao added the rocm label Apr 25, 2025

5 participants