Unable to use ollama in the ipex-llm docker container #12654
Comments
I noticed there are some abnormal logs:
@ca1ic0 I can't reproduce your problem on your
I got this; it seems like the volume mapping `-v ~/.ollama/models:/root/models` was wrong, and ipex-llm itself works fine.
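For reference, a minimal sketch of starting the container with the host model cache mounted at Ollama's default path inside the container; the image name, container name, and the `OLLAMA_MODELS` override below are assumptions based on common ipex-llm/Ollama usage, not taken from this thread:

```bash
# Sketch: start the ipex-llm container with the host Ollama model cache mounted
# at Ollama's default location inside the container, and point OLLAMA_MODELS at it.
# Image tag, container name, and mount target are assumptions; adjust to your setup.
docker run -itd \
  --net=host \
  --device=/dev/dri \
  -v ~/.ollama/models:/root/.ollama/models \
  -e OLLAMA_MODELS=/root/.ollama/models \
  --name=ipex-llm-ollama \
  intelanalytics/ipex-llm-inference-cpp-xpu:latest
```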
It is weird; I reproduced the problem again. Is there any difference between your operations and mine?
This time I directly pulled qwen2.5 without mapping the models directory.
Do you still meet the error? I used your script to start Docker on root@calico-B450M-HDV-R4-0 and followed the steps below to run it, and got normal results.
Okay, now I know what the point is 😳. Actually it might be caused by the command executed in the container:
If I don't execute that command and instead directly start ollama in the ipex container, it works well.
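For comparison, a rough sketch of the "start ollama directly" path inside the container; the container name and the exact serve/pull commands below are illustrative assumptions rather than commands copied from this thread:

```bash
# Sketch: attach to the running container (name is an assumption).
docker exec -it ipex-llm-ollama bash

# Inside the container: start the Ollama server in the background,
# then pull and run qwen2.5 against it.
ollama serve > ollama.log 2>&1 &
ollama pull qwen2.5
ollama run qwen2.5
```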
Thanks for your feedback. We did not encounter any problems on an Arc A770 with an Intel CPU. Maybe it was caused by the AMD CPU.
On the host, I could use ollama and ipex with an Arc A750 GPU, but in the container I got a failure. The steps are:
0. start the container
output: