
inference.py: how to run inference on the CPU instead #32

Open
Yanlewen opened this issue Jul 24, 2024 · 1 comment
Labels
question Further information is requested

Comments

@Yanlewen

Question

I want to run inference.py on the CPU, but it keeps failing. Apart from setting CUDA_VISIBLE_DEVICES=-1, is there another way to change the relevant code directly so it runs correctly on the CPU?
This is the current error: RuntimeError: CUDA error: out of memory
CUDA kernel errors might be asynchronously reported at some other API call, so the stacktrace below might be incorrect.
For debugging consider passing CUDA_LAUNCH_BLOCKING=1.

Additional information

No response
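For reference, a minimal sketch (not the repository's actual code; `pick_device` is a hypothetical helper) of the device-selection pattern most PyTorch inference scripts use. Hiding every GPU with CUDA_VISIBLE_DEVICES=-1 makes torch.cuda.is_available() return False, so the device resolves to "cpu" without any code change:

```python
import os

def pick_device() -> str:
    # Hypothetical helper: when CUDA_VISIBLE_DEVICES=-1 every GPU is
    # hidden, so skip the CUDA probe entirely and use the CPU.
    if os.environ.get("CUDA_VISIBLE_DEVICES") == "-1":
        return "cpu"
    try:
        import torch  # imported lazily so this sketch runs without PyTorch
        return "cuda" if torch.cuda.is_available() else "cpu"
    except ImportError:
        return "cpu"

# The variable must be set before torch initializes CUDA.
os.environ["CUDA_VISIBLE_DEVICES"] = "-1"
print(pick_device())  # cpu
```

If the script instead hard-codes `.cuda()` or `torch.load(path)` on a GPU checkpoint, those calls would also need `map_location="cpu"` / `.to("cpu")` equivalents to run without a GPU.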

@Yanlewen Yanlewen added the question Further information is requested label Jul 24, 2024
@xiuqhou
Owner

xiuqhou commented Jul 24, 2024

Setting CUDA_VISIBLE_DEVICES=-1 normally makes inference run on the CPU; it works fine in my tests here. The error message shows the GPU is out of memory, which suggests the GPU inference path is still being used. Could CUDA_VISIBLE_DEVICES have been written incorrectly?
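One common pitfall the comment hints at: the variable must be visible to the Python process itself. The real invocation would be `CUDA_VISIBLE_DEVICES=-1 python inference.py`; the one-liner below just verifies that a same-line prefix actually reaches the child process (setting it in a different shell session, or misspelling the name, would not):

```shell
# Prefixing the assignment on the same command line exports it to that
# one child process only; the child should see the value "-1".
CUDA_VISIBLE_DEVICES=-1 python3 -c 'import os; print(os.environ["CUDA_VISIBLE_DEVICES"])'
```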
