set GPU #9644
It's not possible for us to guess the error you've encountered. Can you paste it here?
I set

self.providers = [('CUDAExecutionProvider', {

as in the official doc at https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html, and when I run it, initializing the InferenceSession raises this exception:

Invoked with: <onnxruntime.capi.onnxruntime_pybind11_state.InferenceSession object at 0x7fd1b2ae11b8>, ['CUDAExecutionProvider'], [{'device_id': 1, 'gpu_mem_limit': 2147483648}]
I tried the exact same program and don't see any issue.
@pranavsharma My onnxruntime version is 1.6.0 (GPU). I tested my code exactly as you advised.
Please use the latest version of ORT.
My CUDA is 10.2, and the official recommendation for CUDA 10.2 is ORT 1.6.0. I also built the master branch with CUDA 10.2, but hit the error described in issue #9631.
Unfortunately the documentation at this link points to the latest version. The following should work for 1.6.0.
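A minimal sketch of the 1.6.0-style setup, assuming provider options are attached via set_providers() after construction and that every option value must be passed as a string (the pybind binding expects a Dict[str, str], and integer values produce the "Invoked with: ..." TypeError quoted above); model.onnx is a placeholder path:

```python
# Provider options for ORT 1.6.0 -- note every value is a string,
# since the Python binding rejects ints/bools for provider options.
cuda_options = {
    'device_id': '0',
    'gpu_mem_limit': str(2 * 1024 * 1024 * 1024),
}

try:
    import onnxruntime as ort
    # On 1.6.0 the options go through set_providers(), not the constructor.
    session = ort.InferenceSession('model.onnx')  # placeholder model path
    session.set_providers(['CUDAExecutionProvider'], [cuda_options])
except Exception:
    # onnxruntime (or the model file) may be absent in this environment;
    # the string-valued option dict above is the relevant part of the sketch.
    pass
```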
Describe the bug
System information
I set the GPU as described at https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html:
import onnxruntime as ort

providers = [
    ('CUDAExecutionProvider', {
        'device_id': 0,
        'arena_extend_strategy': 'kNextPowerOfTwo',
        'gpu_mem_limit': 2 * 1024 * 1024 * 1024,
        'cudnn_conv_algo_search': 'EXHAUSTIVE',
        'do_copy_in_default_stream': True,
    }),
    'CPUExecutionProvider',
]
session = ort.InferenceSession(model_path, providers=providers)
but I get this error:
Invoked with: <onnxruntime.capi.onnxruntime_pybind11_state.InferenceSession object at 0x7fe93d9badc0>, [('CUDAExecutionProvider', {'device_id': 0, 'arena_extend_strategy': 'kNextPowerOfTwo', 'gpu_mem_limit': 2147483648, 'cudnn_conv_algo_search': 'EXHAUSTIVE', 'do_copy_in_default_stream': True}), 'CPUExecutionProvider'], []
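The (name, options) tuple form in the constructor is only understood by newer ORT releases, which is why 1.6.0 rejects the call above. A hedged sketch of a helper that converts the documented providers list into the parallel name/option lists that older set_providers() calls expect -- split_providers is a hypothetical name for illustration, not part of the ORT API:

```python
def split_providers(providers):
    """Split a mixed list like [('CUDAExecutionProvider', {...}), 'CPUExecutionProvider']
    into parallel lists of provider names and string-valued option dicts."""
    names, options = [], []
    for p in providers:
        if isinstance(p, tuple):
            name, opts = p
            names.append(name)
            # Older bindings require string values for every option.
            options.append({k: str(v) for k, v in opts.items()})
        else:
            names.append(p)
            options.append({})
    return names, options

names, options = split_providers([
    ('CUDAExecutionProvider', {'device_id': 0}),
    'CPUExecutionProvider',
])
```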