Output problem with ONNX inference #80
traditionally, Yolo repositories have the dirtiest code
Yes (although not only the YOLO ones, but AI repositories in general). I'm not even asking about the code, just how to use it.
are you serious? how are you going to use it without understanding the code?
Why would I need to understand this particular code just to use it? You only need to know what the input is and what the output is; everything else is unnecessary for the sole purpose of running the model. For now, I know what the input is, so I can run the inference. But I don't know how to interpret the output. I'm surprised this information isn't the first thing displayed on the main page of these repositories.
lmao ok
I have the same problem. |
5 for bbox (x_min, y_min, x_max, y_max, confidence), 1 for class, 15 for landmarks (x_1, y_1, landmark_confidence_1, ...). Not sure about the order.
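Based on the layout described above, a minimal NumPy sketch for splitting the `[1, 25200, 21]` tensor into its parts might look like this. Note the channel order is an assumption taken from the comment (it is explicitly unverified there), and the function name and confidence threshold are hypothetical:

```python
import numpy as np

def decode_detections(output, conf_threshold=0.5):
    """Split a [1, 25200, 21] ONNX output into bbox, score, class, landmarks.

    Assumed per-detection layout (order unverified, per the thread):
      0:4   -> x_min, y_min, x_max, y_max
      4     -> box confidence
      5     -> class
      6:21  -> 5 landmarks as (x, y, landmark_confidence) triplets
    """
    preds = output[0]                       # drop the batch dim -> (25200, 21)
    keep = preds[:, 4] >= conf_threshold    # filter on box confidence
    preds = preds[keep]
    boxes = preds[:, 0:4]
    scores = preds[:, 4]
    classes = preds[:, 5]
    landmarks = preds[:, 6:21].reshape(-1, 5, 3)  # 5 points, (x, y, conf) each
    return boxes, scores, classes, landmarks
```

In practice you would still need non-maximum suppression afterwards, since the 25200 rows are raw anchor predictions and many overlapping boxes will pass the confidence filter.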
The output tensor is float32[1,25200,21]. What is the meaning of the 21 values for each detection? I can't find this information, and without it I can't use the model. Please help.