Replies: 2 comments
-
Hi, the general direction of camera stacks is to run more open source code on the Arm processors, which unfortunately makes this hard to avoid. It may help somewhat to set the main stream format to "YUV420", as this will save both memory and memory bandwidth; otherwise you may need to consider continuing with the legacy camera stack.
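For reference, requesting a "YUV420" main stream with Picamera2 would look something like the sketch below. The 1280x720 size is only an example value, not something stated in the thread:

```python
from picamera2 import Picamera2

picam2 = Picamera2()
# Ask for a YUV420 main stream instead of the default 32-bit RGB format;
# this roughly halves the memory and memory-bandwidth cost per frame.
config = picam2.create_video_configuration(
    main={"format": "YUV420", "size": (1280, 720)}  # size is an example
)
picam2.configure(config)
```

This only changes the format of the main stream handed to the application; the encoder path is configured separately.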
-
Thanks David, that's understood. "YUV420" doesn't seem to have a noticeable effect.
-
I've posted a similar question to the Raspberry Pi forum (https://forums.raspberrypi.com/viewtopic.php?p=2086996#p2086996), but it may belong here as well; I'm not sure.

I just installed the latest Bullseye Lite on an RPi Zero v1.3 and narrowed my testing down to the simple code below, which does nothing except start an H.264 capture, without storing the data anywhere. It spins up 6 processes for this (as per `htop`), and CPU utilization goes to 42%-70%; with the default framerate (25 or 30?) it reaches 85%. There must be something wrong. On Buster, `picamera` with similar code, extended to actually stream the data to the network, used something like 18% CPU. I must have overlooked something.
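The snippet itself isn't reproduced in this thread, but a minimal Picamera2 test that starts an H.264 encode and discards the output, which is assumed to be roughly what was measured here, could look like this (the resolution, bitrate, and 15 fps framerate are illustrative, not values from the report):

```python
import time

from picamera2 import Picamera2
from picamera2.encoders import H264Encoder

picam2 = Picamera2()
# Example configuration: 720p at a reduced 15 fps framerate
# (the report above notes CPU use rises further at the default rate).
config = picam2.create_video_configuration(
    main={"size": (1280, 720)},
    controls={"FrameRate": 15},
)
picam2.configure(config)

encoder = H264Encoder(bitrate=2_000_000)
# Send the encoded stream to /dev/null so nothing is actually stored.
picam2.start_recording(encoder, "/dev/null")
time.sleep(30)
picam2.stop_recording()
```

With a setup like this, the hardware ISP and H.264 encoder do the heavy lifting, so the remaining Arm-side CPU load comes from the open-source control loop and buffer handling that the first reply describes.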