
Support Raspberry Pi AI Kit with Hailo #11961

Closed
askpatrickw opened this issue Jun 14, 2024 · 13 comments
Labels
enhancement New feature or request

Comments

@askpatrickw

Describe what you are trying to accomplish and why in non technical terms
Would like to use the new official RPI AI Kit & Hailo with Frigate.

Describe the solution you'd like
Add support for the Hailo as a detector.

https://www.raspberrypi.com/documentation/accessories/ai-kit.html#object-detection

Describe alternatives you've considered
Can't get the Pineberry AI board and Coral to work reliably, mostly due to the Coral libraries.

Additional context
Google seems to have abandoned the Coral libraries, so support for a more diverse set of boards would be ideal.

@askpatrickw askpatrickw added the enhancement New feature or request label Jun 14, 2024
@NickM-27
Sponsor Collaborator

See #11663 (comment)

@NickM-27 NickM-27 closed this as not planned Jun 14, 2024
@askpatrickw
Author

Not sure I would consider this device to be "niche". But let's see if any other Frigate users find and comment on this issue (or open dupes).

@NickM-27
Sponsor Collaborator

Relative to highly utilized detectors like the Coral and Intel OpenVINO, it is. Even TensorRT is relatively niche (< 10% of total users).

@NickM-27
Sponsor Collaborator

And the RPi in general is not a recommended platform for Frigate, especially the RPi 5, which is missing hardware H.264 decoders.

@geerlingguy

geerlingguy commented Jun 17, 2024

(Just a note: the AI Kit has only been available for a couple of weeks and is only just now showing up in physical retail locations. Before it existed, there were maybe two people on the planet I'd heard of using the Hailo.)

I've been running two Frigate instances on two Pi 5s for a few months now. On one, I've tested up to three 4K Annke cameras just fine; they output H.265 (the feed I use for detection is 480p, and I record the full 4K stream 24x7). The other is pulling in two H.264 streams from Hikvision cameras, and I've had zero issues on either. I've moved the second one over to a CM4, though, just for convenience getting it into my rack.

Running USB Corals currently, but I hope to use the Hailo-8L eventually :)

I know the Hailo devs haven't been focused as much on opening up access to their dev tooling until very recently, but I think they're keen on getting the Hailo-8L working great with Frigate too. It would be awesome if I could walk into Micro Center, walk out with a $130 little Pi+AI kit, and have a perfect Frigate box for 2-4 cameras (maybe more? My systems aren't even hitting 50% CPU).

@NickM-27
Sponsor Collaborator

To be clear, closing this feature request just means I don't think this support is something the Frigate maintainers will work on, because it will affect so few users when there are literally hundreds of feature requests that would affect all or a majority of Frigate users.

In any case, Frigate has a community-supported boards framework, so should the Hailo devs or anyone else want to contribute and maintain the detector code, this would be supported.

@geerlingguy

Totally get that; just on the "affect so few users" point, it remains to be seen whether the AI Kit will spark Pi + Hailo becoming a widely available, widely supported solution for low-power inference, or whether Hailo and/or RPi will drop the ball and make it a flop.

In the former case, I could see the combo being as popular as (if not more popular than) USB Coral + Pi, since builds can be a lot more compact with a little flat PCIe HAT.

@NickM-27
Sponsor Collaborator

NickM-27 commented Jun 17, 2024

Right, such options are already possible with another community-supported board, RKNN, in things like the OrangePi, for example. Those also have much more robust hardware for encoding/decoding, though.

@n2kbg

n2kbg commented Jun 28, 2024

Just another perspective: I found this issue because I was searching to see how soon Frigate would support the Pi AI Kit. I was going to buy one to replace my NUC/Coral system. I was not expecting the answer to be "never".

@NickM-27
Sponsor Collaborator

NickM-27 commented Jun 28, 2024

I don't know why you are interpreting the answer as "never"; there are already a handful of community-supported detectors, and multiple people have expressed interest in creating a community-supported detector for this hardware.

I also reiterated this in the above comment #11961 (comment)

@victorhooi

Just putting this for posterity, in case other people find this thread.

It seems that recently (i.e., July 2024) support for the Hailo-8L was added. Yay! 😀

The commit mentions support for the "Raspberry Pi 5 with Hailo-8L AI Kit". (I assume they haven't tested just using the Hailo-8L M.2 card on other PCs.)

It also mentions inference speeds of 17-21 ms for the SSD MobileNet v1 model. I believe the Coral TPU is around 10 ms, right?

I'm not sure whether that's some kind of fundamental limitation of the Hailo-8L, something due to the models being used, or something that can be optimised.
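For anyone landing here from search: once a detector is supported, it is enabled in Frigate's YAML config. The fragment below is only a hedged sketch following Frigate's usual detector config pattern; the exact `type`/`device` keys for the Hailo-8L should be verified against the current Frigate documentation.

```yaml
# Sketch, not authoritative: enable a Hailo-8L detector in Frigate's config.
# Key names follow the pattern used by Frigate's other detectors
# (e.g. edgetpu, openvino); check the docs for the released key names.
detectors:
  hailo8l:
    type: hailo8l
    device: PCIe
```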

@NickM-27
Sponsor Collaborator

I implemented support for amd64 in a commit after that. It's possible the model used is not quantized; it's difficult to directly compare them.

@victorhooi

Ahh - nice!

I did find this from Hailo's docs:

https://github.com/hailo-ai/hailo_model_zoo/blob/master/docs/OPTIMIZATION.rst

From what I can see, we're just downloading the pre-compiled models provided by Hailo from S3:

https://hailo-model-zoo.s3.eu-west-2.amazonaws.com/ModelZoo/Compiled/v2.11.0/hailo8l/ssd_mobilenet_v1.he

I would assume they've already run the optimize step on these public models, but I suppose it's worth checking.

(This comment seems to imply they are doing it, although maybe that's just how I'm reading it?)

I don't have Hailo-8L hardware currently, but maybe it's something I can get and compare against my existing Coral TPUs =).
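Since the thread is comparing per-inference latencies (roughly 10 ms for the Coral vs. 17-21 ms reported for the Hailo-8L), a like-for-like comparison needs both devices timed the same way. This is a minimal, generic timing harness sketch; the `infer` callable is a placeholder assumption to be replaced with a real single-image inference call for whichever accelerator is under test.

```python
import statistics
import time


def benchmark(infer, warmup=10, runs=100):
    """Time a zero-argument inference callable; return (median_ms, worst_ms).

    Warmup iterations are discarded so one-time setup costs (model load,
    first-run compilation) don't skew the per-inference numbers.
    """
    for _ in range(warmup):
        infer()
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        infer()
        samples.append((time.perf_counter() - start) * 1000.0)  # seconds -> ms
    return statistics.median(samples), max(samples)


# Stand-in workload for illustration; swap in e.g. a Coral or Hailo
# single-frame inference call to compare the two devices.
median_ms, worst_ms = benchmark(lambda: sum(range(10_000)))
print(f"median={median_ms:.2f} ms  worst={worst_ms:.2f} ms")
```

Reporting the median rather than the mean keeps one-off scheduling hiccups from distorting the comparison; the max is still worth printing since worst-case latency is what matters for keeping up with a camera's frame rate.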
