
[Need Info][ONNXRUNTIME-SERVER] - Input JSON format for onnxruntime server #1628

Closed

ac4922 opened this issue Aug 15, 2019 · 2 comments

ac4922 commented Aug 15, 2019

I am using onnxruntime server and have provided my model.onnx.

When trying to run inference, I am struggling with the input JSON payload format that should be sent to the server.

Could you please help out with an example request payload (in JSON) for the onnxruntime server API?

We are trying to achieve this :

import numpy as np

a = [889., 1614., 0., 0., 0., 0., 0., 0., 0., 0.,
     0., 5., 3., 0., 0., 0., 0., 0., 3., 0.]
sess.run(None, {"features": np.array(a, dtype=np.float32)})

Any help would be appreciated.

Thank you :)

@faxu faxu added the question label Aug 15, 2019
NonStatic2014 (Contributor) commented:
The input JSON schema is defined in the proto file.

If you are looking for a concrete sample, please refer to the integration test data.
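As a rough sketch of what such a payload might look like: the server's request message wraps the model inputs as ONNX TensorProto objects, and under the standard proto3 JSON mapping that gives camelCase field names like dataType and floatData, with int64 dims encoded as strings. The exact field names, the outputFilter field, and the endpoint path below are assumptions drawn from the proto definitions and may differ between server builds — check the proto file and integration test data linked above.

```python
import json

# The values the original sess.run() call fed to the "features" input.
a = [889., 1614., 0., 0., 0., 0., 0., 0., 0., 0.,
     0., 5., 3., 0., 0., 0., 0., 0., 3., 0.]

# Sketch of a JSON request body following the proto3 JSON mapping of
# TensorProto: "dataType" 1 is FLOAT in onnx-ml.proto, "dims" gives the
# tensor shape (int64 values are serialized as strings in proto3 JSON),
# and "floatData" carries the raw values. Field names are assumptions
# based on the proto definitions, not verified against every server build.
payload = {
    "inputs": {
        "features": {
            "dims": ["20"],   # shape [20]
            "dataType": 1,    # TensorProto.DataType.FLOAT
            "floatData": a,
        }
    },
    "outputFilter": [],       # empty: return all model outputs (assumed field)
}

body = json.dumps(payload)
print(body)
```

The resulting body would then be POSTed to the server's prediction endpoint (something like `http://host:port/v1/models/<model>/versions/<version>:predict` in the documented layout, again an assumption) with `Content-Type: application/json`.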

faxu (Contributor) commented Aug 22, 2019:

@ac4922 does this resolve your question? If so, please close the issue.
