[Need Info][ONNXRUNTIME-SERVER] - Input JSON format for onnxruntime server #1628
I am using the onnxruntime server and have provided my model.onnx.
Now, when trying to get an inference, I am struggling with the input JSON payload format that should be sent to the server.
Could you please help out with an example request payload (in JSON) for the onnxruntime server API?
We are trying to achieve this:
Any help would be appreciated.
Thank you :)
Comments
The input JSON schema is defined in the proto file. If you are looking for a real sample, please refer to this integration test data.
@ac4922 does this resolve your question? If so, please close the issue.
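To make the schema pointer above concrete, here is a minimal sketch of building such a request body. It assumes the request follows the proto3 JSON mapping of a predict-request message whose `inputs` field maps a tensor name to a TensorProto-like object; the tensor names `"input"` and `"output"`, the shape, and the values are placeholders that must match the actual graph inputs/outputs of your model.onnx. Per the proto3 JSON mapping, the int64 `dims` are encoded as strings, and `dataType` 1 corresponds to FLOAT in the ONNX `TensorProto.DataType` enum.

```python
import json

# Hypothetical request body for an onnxruntime server predict call.
# "input" / "output" are assumed tensor names; replace them with the
# names reported by your model (e.g. via netron or onnx.load).
payload = {
    "inputs": {
        "input": {
            "dims": ["1", "4"],              # int64 shape, strings per proto3 JSON mapping
            "dataType": 1,                   # 1 = FLOAT (ONNX TensorProto.DataType)
            "floatData": [0.1, 0.2, 0.3, 0.4],  # tensor values, flattened row-major
        }
    },
    "outputFilter": ["output"],              # optional: restrict which outputs are returned
}

body = json.dumps(payload)
print(body)
```

This body would then be sent as an HTTP POST with `Content-Type: application/json` to the server's predict route (something like `/v1/models/<model>/versions/<version>:predict`, but check the docs for your server version for the exact path and port).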