
Consider AI hardware sticks #15

Closed
roguedarkjedi opened this issue May 16, 2020 · 3 comments
Labels: documentation (Improvements or additions to documentation), enhancement (New feature or request), question (Further information is requested)

roguedarkjedi (Owner) commented May 16, 2020

So while this code is fairly well optimized for what it is, there's a very clear and obvious problem: this code base can only process one frame every 2 seconds (essentially running at 0.5 fps). This is with the full processing power of the Pi and OpenCV's optimizations, which don't necessarily target the dnn module (which we use almost exclusively).

This leads to the problem that, however fast we make the code, we'll always be processing dog images very slowly.

I believe I've hit the upper bound of what the Pi can do; I don't think I can process this any better. So after some investigation, it might be worth considering investing in an AI processor expansion.
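For reference, the numbers above come from something like the following OpenCV dnn timing loop (a minimal sketch, not our exact code; the model files and input size are placeholders):

```python
import time
import cv2

# Placeholder model files; the actual detector files in this repo may differ.
net = cv2.dnn.readNetFromTensorflow("frozen_inference_graph.pb", "graph.pbtxt")

frame = cv2.imread("dog.jpg")
blob = cv2.dnn.blobFromImage(frame, size=(300, 300), swapRB=True, crop=False)

start = time.monotonic()
net.setInput(blob)
detections = net.forward()
elapsed = time.monotonic() - start

# On the Pi this comes out to roughly 2 seconds per frame (~0.5 fps).
print(f"inference took {elapsed:.2f}s (~{1.0 / elapsed:.1f} fps)")
```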


roguedarkjedi (Owner, Author) commented May 16, 2020

Coral USB Board

https://coral.ai/products/accelerator/
https://coral.ai/docs/accelerator/get-started/#requirements

Pros:

  • Extremely fast; in theory we'll be getting massive speed increases
  • Will get us to about 20+ fps
  • Works out of the box with TensorFlow
  • Little additional setup needed
  • Only $60

Cons:

  • Cannot use OpenCV for inference; any image formats will need converting
  • Must use a different inference code path, since the existing code is built around OpenCV (see the inference sketch after the notes below)
  • Must recompile the TensorFlow models to the TensorFlow Lite format (a rough conversion sketch follows this list; the one for our current model can be found here)
  • Not necessarily built for the Raspberry Pi (though this might not be too big of an issue)
  • Runs hot, with a risk of burning
  • May need to consider cooling methods later
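Here's a rough sketch of the conversion step mentioned above, assuming a SavedModel export of our detector and recent TensorFlow 2.x converter APIs (exact flags vary by TF version); the Edge TPU needs full integer quantization, and the resulting .tflite still has to go through edgetpu_compiler:

```python
import numpy as np
import tensorflow as tf

def representative_dataset():
    # Placeholder calibration data; real calibration should use sample dog frames.
    for _ in range(100):
        yield [np.random.rand(1, 300, 300, 3).astype(np.float32)]

# "saved_model_dir" is a placeholder for a SavedModel export of our detector.
converter = tf.lite.TFLiteConverter.from_saved_model("saved_model_dir")
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8

with open("detect.tflite", "wb") as f:
    f.write(converter.convert())

# Then compile for the Edge TPU:
#   edgetpu_compiler detect.tflite   ->   detect_edgetpu.tflite
```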

Notes:

https://www.tensorflow.org/lite/guide/inference#load_and_run_a_model_in_python
https://medium.com/@aallan/hands-on-with-the-coral-usb-accelerator-a37fcb323553
https://www.pyimagesearch.com/2019/04/22/getting-started-with-google-corals-tpu-usb-accelerator/
https://www.pyimagesearch.com/2019/05/13/object-detection-and-image-classification-with-google-coral-usb-accelerator/
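And a minimal sketch of what the Edge TPU inference path could look like with tflite_runtime, following the guides above (the model file name and input handling are assumptions, not our existing code):

```python
import cv2
import numpy as np
from tflite_runtime.interpreter import Interpreter, load_delegate

# Placeholder model name; it must be a .tflite compiled for the Edge TPU.
interpreter = Interpreter(
    model_path="detect_edgetpu.tflite",
    experimental_delegates=[load_delegate("libedgetpu.so.1")],
)
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# OpenCV frames are BGR; the model expects RGB at its own input size.
frame = cv2.imread("dog.jpg")
_, height, width, _ = input_details[0]["shape"]
rgb = cv2.cvtColor(cv2.resize(frame, (width, height)), cv2.COLOR_BGR2RGB)
interpreter.set_tensor(input_details[0]["index"], np.expand_dims(rgb, axis=0))

interpreter.invoke()
boxes = interpreter.get_tensor(output_details[0]["index"])  # layout depends on the model
```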

roguedarkjedi (Owner, Author) commented May 16, 2020

roguedarkjedi added the documentation, enhancement, and question labels on May 17, 2020
roguedarkjedi self-assigned this on May 17, 2020
roguedarkjedi (Owner, Author) commented:
This is implemented and works great. Closing this issue, but will pin it because the notes are good.

roguedarkjedi pinned this issue on Jul 4, 2020