Support Raspberry Pi AI Kit with Hailo #11961
Comments
See #11663 (comment)
Not sure I would consider this device to be "niche". But let's see if any other Frigate users find and comment on this issue (or open dupes).
Relative to the heavily used detectors like the Coral and Intel OpenVINO, it is. Even TensorRT is relatively niche (< 10% of total users).
And the RPi in general is not a recommended platform for Frigate, especially the RPi 5, which is missing hardware H.264 decoders.
(Just a note: the AI Kit has only been available for a couple of weeks, and is only just now showing up in physical retail locations. Before it existed there were maybe 2 people on the planet I'd heard of using the Hailo.)

I've been running two Frigate instances on two Pi 5s for a few months now. One I've tested with up to 3 4K Annke cameras just fine; they output H.265 (the feed I use for detection is 480p, and I record the full 4K stream 24x7). The other is pulling in two H.264 streams from Hikvision cameras, and I've had zero issues on either. I've since moved the second one over to a CM4, just for convenience getting it into my rack. I'm running USB Corals currently, but I hope to use the Hailo-8L eventually :)

I know the Hailo devs haven't been focused as much on opening up access to their dev tooling until very recently, but I think they're keen on getting the Hailo-8L working great with Frigate too. It would be awesome if I could walk into Micro Center, walk out with a $130 little Pi+AI Kit, and have a perfect Frigate box for 2-4 cameras (maybe more? My systems aren't even hitting 50% CPU).
To be clear, closing this feature request just means I don't think this support is something the Frigate maintainers will work on, because it will affect so few users when there are literally hundreds of feature requests that would affect all or a majority of Frigate users. In any case, Frigate has a community-supported boards framework, so should the Hailo devs or anyone else want to contribute and maintain the detector code, then this would be supported.
Totally get that; just on the "affect so few users" point, it remains to be seen whether the AI Kit will spark the Pi + Hailo to become a widely available and widely supported solution for low-power inference, or whether Hailo and/or RPi will drop the ball and make it a flop. In the former case, I could see the combo being as popular as (if not more popular than) USB Coral + Pi, since builds can be a lot more compact with a little flat PCIe HAT.
Right, such options are already possible with another community-supported board, RKNN, in things like the OrangePi for example. Those also have much more robust hardware for encoding/decoding, though.
Just another perspective: I found this issue because I was searching to see how soon Frigate would support the Pi AI Kit. I was going to buy one to replace my NUC/Coral system. I was not expecting the answer to be "never".
I don't know why you are interpreting the answer as "never"; there are already a handful of community-supported detectors, and multiple people have expressed interest in creating a community-supported detector for this hardware. I also reiterated this in the above comment: #11961 (comment)
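For anyone curious what a community-supported detector contribution roughly involves, here is a minimal, illustrative sketch of the plugin shape. The class names, method signature, and the 20x6 detection layout are assumptions based on how Frigate's other detector plugins are structured, not the actual Hailo implementation:

```python
import numpy as np

# NOTE: illustrative only. Frigate's real base class lives in the
# frigate.detectors package; this stub just mirrors the general shape
# so the example is self-contained.
class DetectionApi:
    def detect_raw(self, tensor_input: np.ndarray) -> np.ndarray:
        raise NotImplementedError

class HailoDetectorSketch(DetectionApi):
    """Hypothetical Hailo-8L detector skeleton (names are assumptions)."""

    def __init__(self, model_path: str = "yolov8s.hef"):
        # A real implementation would load the compiled .hef model onto
        # the Hailo device here via the HailoRT runtime.
        self.model_path = model_path

    def detect_raw(self, tensor_input: np.ndarray) -> np.ndarray:
        # Frigate-style detectors return a fixed-size array of
        # [class_id, score, y_min, x_min, y_max, x_max] rows.
        detections = np.zeros((20, 6), np.float32)
        # ... run inference on the Hailo device and fill `detections` ...
        return detections
```

A contribution along these lines (plus the device runtime dependency in the build) is what the community-supported framework mentioned above is for.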
Just putting this here for posterity, in case other people find this thread. It seems that recently (i.e. July 2024) support for the Hailo-8L was added - yay! 😀 The commit mentions support for the "Raspberry Pi 5 with Hailo-8L AI Kit". (I assume they haven't tested with just using the Hailo-8L M.2 card on other PCs.) It also mentions the inference speeds; I'm not sure if those are some kind of fundamental limitation of the Hailo-8L, or something due to the models being used, or something that can be optimised for?
I implemented support for amd64 in a commit after that. It's possible the model used is not quantized; it's difficult to directly compare them.
Ahh - nice! I did find this in Hailo's docs: https://github.com/hailo-ai/hailo_model_zoo/blob/master/docs/OPTIMIZATION.rst From what I can see, we're just downloading the pre-compiled models provided by Hailo from S3, so I would assume they've already run the optimization process. (This comment seems to imply they are doing it - although maybe that's just how I'm reading it?) I don't have Hailo-8L hardware currently - but maybe it's something I can get, and compare it against my existing Coral TPUs =)
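On comparing against the Coral: a simple way to get comparable numbers is to time raw inference calls on each device with the same input size. A minimal timing sketch, where `run_inference` is a placeholder for whichever runtime's inference call you're testing:

```python
import statistics
import time

def benchmark(run_inference, warmup: int = 5, iterations: int = 50) -> float:
    """Return the median per-inference latency in milliseconds."""
    for _ in range(warmup):  # let caches and device clocks settle
        run_inference()
    samples = []
    for _ in range(iterations):
        start = time.perf_counter()
        run_inference()
        samples.append((time.perf_counter() - start) * 1000.0)
    return statistics.median(samples)

# Example with a dummy workload standing in for a real detector call:
latency_ms = benchmark(lambda: sum(range(10_000)))
print(f"median latency: {latency_ms:.3f} ms")
```

Using the median rather than the mean keeps a one-off slow call (e.g. a page fault or thermal hiccup) from skewing the comparison.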
Describe what you are trying to accomplish and why in non-technical terms
Would like to use the new official RPi AI Kit and Hailo with Frigate.
Describe the solution you'd like
Add support for the Hailo as a detector.
https://www.raspberrypi.com/documentation/accessories/ai-kit.html#object-detection
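For reference, enabling such a detector in the Frigate config would presumably look like the existing detectors do. The `hailo8l` type name and `device` value below are assumptions; check the Frigate docs for the actual keys:

```yaml
# Hypothetical Frigate config snippet; key names are assumptions.
detectors:
  hailo8l:
    type: hailo8l
    device: PCIe
```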
Describe alternatives you've considered
Can't get the Pineberry Ai board and Coral to work reliably, mostly due to the Coral libraries.
Additional context
Google seems to have abandoned the Coral libraries, so a more diverse set of supported boards would be ideal.