Nvidia Triton Inference Server #733
Comments
Triton Server inference is not yet supported in the ultralytics repo. We'll work on it soon.
👋 Hello there! We wanted to give you a friendly reminder that this issue has not had any recent activity and may be closed soon, but don't worry - you can always reopen it if needed. If you still have any questions or concerns, please feel free to let us know how we can help. For additional resources and information, please see the links below:
Feel free to inform us of any other issues you discover or feature requests that come to mind in the future. Pull Requests (PRs) are also always welcomed! Thank you for your contributions to YOLO 🚀 and Vision AI ⭐
I'm also facing the same issue; do you have any updates or a timeline for this implementation? https://github.com/dinhkt/triton-trt-object-detection
@AyushExel @mltoml Triton Server is actually among our supported AutoBackend formats, but this is inherited from YOLOv5 as part of our work with OctoML and has not been tested yet for YOLOv8.
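For anyone experimenting before YOLOv8 Triton support is verified, most of the client-side work is preparing the input tensor the exported model expects. Below is a minimal, hedged sketch of typical YOLO-style preprocessing (letterbox to 640×640, BGR→RGB, NCHW float32 in [0, 1]); the shape, channel order, and padding value are common conventions, not confirmed details of the Ultralytics Triton path, and the nearest-neighbor resize is used only to avoid extra dependencies:

```python
import numpy as np

def letterbox(img, new_shape=640, pad_value=114):
    """Resize an HxWx3 uint8 image to new_shape x new_shape, preserving
    aspect ratio and padding the borders with pad_value. Uses a
    nearest-neighbor resize via index arrays (no OpenCV/Pillow needed)."""
    h, w = img.shape[:2]
    scale = min(new_shape / h, new_shape / w)
    nh, nw = round(h * scale), round(w * scale)
    # Map each output row/column back to its nearest source pixel.
    rows = (np.arange(nh) / scale).astype(int).clip(0, h - 1)
    cols = (np.arange(nw) / scale).astype(int).clip(0, w - 1)
    resized = img[rows][:, cols]
    out = np.full((new_shape, new_shape, 3), pad_value, dtype=img.dtype)
    top, left = (new_shape - nh) // 2, (new_shape - nw) // 2
    out[top:top + nh, left:left + nw] = resized
    return out

def to_triton_input(img):
    """HWC uint8 BGR -> 1x3xHxW float32 in [0, 1], the input layout
    commonly expected by YOLO ONNX exports (an assumption here)."""
    x = letterbox(img)[:, :, ::-1]                      # BGR -> RGB
    x = x.transpose(2, 0, 1)[None].astype(np.float32) / 255.0
    return np.ascontiguousarray(x)
```

The resulting array can then be wrapped in a Triton inference request (e.g. via the `tritonclient` package) against whatever input name the exported model declares.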
Yeah, @gaziqbal did very nice work integrating YOLOv5 detect with Triton; hopefully we'll see that replicated in YOLOv8 soon.
@AyushExel @glenn-jocher Would it be possible to reopen this issue?
@mltoml certainly! We apologize for any confusion - this issue can absolutely be reopened if you have further questions or if an update to the status of Triton support is desired. As Glenn mentioned previously, Triton server inference is supported as part of our work with OctoML and YOLOv5, but it has not been tested yet for YOLOv8. However, we are always looking to expand and improve the capabilities of our software, so we encourage you to stay tuned for future updates! In the meantime, please feel free to check out the Ultralytics Docs and HUB for additional resources, and don't hesitate to ask any further questions you may have. |
@glenn-jocher May I know if this inference is available for YOLOv8 yet? I don't see any further comments in this thread.
Yes, we have a PR open for this now. Please check the PR section and leave any comments in the Triton Server PR.
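For reference while that PR is in review: serving an exported model on Triton generally just requires the standard Triton model-repository layout. The sketch below uses placeholder names (`model_repository`, `yolov8`), and assumes an ONNX export, for which Triton can often auto-generate the model configuration:

```
model_repository/
└── yolov8/
    ├── config.pbtxt     # optional for ONNX backends; Triton can auto-complete it
    └── 1/
        └── model.onnx   # e.g. produced by: yolo export model=yolov8n.pt format=onnx
```

The server is then pointed at the repository root, e.g. `tritonserver --model-repository=/path/to/model_repository`.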
Search before asking
Question
Could you please share an implementation of YOLOv8 for running on the NVIDIA Triton Inference Server?
Additional
No response