inference time #3

Open
Ghustwb opened this issue Sep 6, 2018 · 2 comments

Comments

@Ghustwb

Ghustwb commented Sep 6, 2018

Thanks for your work. I ran it on a TX2, but I think the inference time is not normal: MobileNetV1-SSD can run at 38 fps on a Jetson TX2, while Tiny-DSOD only reaches 14 fps.
Is something wrong with my setup?
Thanks

@ujsyehao

Can you post your experiment record?

@kunalgoyal9

I am facing a similar issue. With the TensorFlow Object Detection API, inference on the same image takes around 0.034 s (~30 fps), but the same image with Tiny-DSOD takes 0.37 s (~2.7 fps).

PS: All cores are being used.
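
For reference, here is a minimal timing sketch of how I am measuring per-image latency. It assumes the standard Caffe Python interface (Tiny-DSOD is Caffe-based), an SSD-style input blob named `data`, and hypothetical file names `deploy.prototxt` / `tiny_dsod.caffemodel`; adjust to your actual setup.

```python
import time
import numpy as np
import caffe

# Hypothetical paths -- substitute the real Tiny-DSOD deploy files.
PROTOTXT = "deploy.prototxt"
WEIGHTS = "tiny_dsod.caffemodel"

caffe.set_mode_gpu()  # use caffe.set_mode_cpu() to compare the CPU path
net = caffe.Net(PROTOTXT, WEIGHTS, caffe.TEST)

# Fill the input blob with dummy data of the correct shape
# (assumes the input blob is named "data", as in the SSD deploy prototxts).
net.blobs["data"].data[...] = np.random.rand(
    *net.blobs["data"].data.shape).astype(np.float32)

# Warm up so one-time allocations are not counted.
for _ in range(10):
    net.forward()

runs = 100
start = time.time()
for _ in range(runs):
    net.forward()
elapsed = (time.time() - start) / runs
print("mean latency: %.4f s  (~%.1f fps)" % (elapsed, 1.0 / elapsed))
```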
