H264 stream from MotionEye? #1259
motionEye can only stream mjpeg (it's in fact motion that only streams mjpeg). However, movies (the ones saved to storage) are actually created using efficient codecs (such as h264). How does your Synology store the mjpeg stream? I doubt it stores it as a series of jpegs; that would be wrong for many reasons.
I really like motionEye, but the problem is that some devices (TVs, satellite receivers, ...) don't support mjpeg.
Would it be possible to add a "Fast Network Camera" mode to motionEyeOS for all cameras, like this: […]
The motion detection could then run in motionEye (or, if a user prefers, ZoneMinder/Shinobi/Kerberos.io, etc.) on a more powerful computer in the network.
I would pay for that.
I am going to take a shot at it in this branch: https://github.com/fawick/motioneyeos/tree/fast_network_webcam When I have something worth showing, I'll send a PR. @ccrisan I might need some advice on how to best put controls for this in the settings GUI at some point later.
@fawick great initiative. I'm not sure how well such a setup would perform, but it's probably better than plain motion, and having h264 streams coming out of motionEyeOS would indeed be a win.

As you probably already know, the FNC mode turns off motion completely and runs streamEye instead. streamEye simply streams MJPEG generated by raspimjpeg.py, which could in theory be replaced by the recent official […]

The UI part is implemented in streameyectl.py and the […]

Now, in theory, one could write a drop-in replacement for the […]

With regards to […]

Either way, I'm more than happy to support this feature in any way that I can.
According to https://trac.ffmpeg.org/wiki/ffserver, ffserver is about to be deprecated! Further reading of the documentation at https://www.ffmpeg.org/ffmpeg-protocols.html#rtsp makes me think that this could be implemented using regular ffmpeg.
@ccrisan Thanks for the great feedback. Is my understanding correct that streamEye was written with the CSI camera in mind and is not capable of reading from a v4l2 source?

My main motivation for working on this issue is that for my Pi B+/USB webcam, the motion daemon can only obtain 3-4 FPS (even when no motion detection was enabled on the Pi to which the camera was connected), while I have no problem encoding up to 15 FPS with FFMPEG and h264_omx on the same board in real time at a CPU load of around 30%. I understand that your experiments showed that using FFMPEG instead of motion made things even worse, but I assume that was CPU-based MJPEG encoding instead of hardware-accelerated h264, right? I had the chance to play a bit with the […]

@d-faure: Thanks for the heads-up regarding the deprecation. I am quite relieved, actually, as […] Maybe RTSP is not the optimal protocol for this purpose, but I am not aware of another alternative. I'm happy to take suggestions, though. :-)

(*) Kudos to you, @ccrisan, the motioneyeos/thingos buildchain is absolutely fantastic, and working with it was the most pleasant experience I have had in software engineering for a long time. I feared that simply building […]
Has any of you tinkered with MPEG-DASH before? Or HLS?
@d-faure H264 RTSP can be implemented using raspivid and ffmpeg. I've done it before. Something like this:
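A minimal sketch of such a raspivid-to-ffmpeg pipeline (the resolution, frame rate, and addresses here are placeholder assumptions, not the exact command):

```
# raspivid emits raw H264 on stdout; ffmpeg wraps it in RTSP without re-encoding.
# An RTSP server must be listening at <rtsp-server-ip> to accept the stream.
raspivid -t 0 -w 1280 -h 720 -fps 25 -o - \
  | ffmpeg -f h264 -i - -c:v copy -f rtsp rtsp://<rtsp-server-ip>:5454/live
```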
You can replace […] Disclaimer: I whipped this code up quickly without any testing, just to give you an idea of how it can be done.
@fawick ffmpeg on its own is only able to handle a single broadcast destination... using the RTSP protocol. @jasaw I'd love to have this kind of stream available in streamEye.
Yes, you'll need the RTSP server running on the FNC device to relay the stream.
Perhaps a more reliable RTSP server here: https://github.com/kzkysdjpn/rtsp-server
@jasaw, @d-faure Thanks for your tips. I indeed successfully streamed h264 at 30fps from the USB webcam connected to the Pi (hardware-accelerated) to a linux/amd64 machine with these commands:

```
# on the linux/amd64 machine (with IP address 172.16.42.1)
ffmpeg -rtsp_flags listen -i rtsp://172.16.42.10:5454 -f null /dev/null

# on the Pi (usually running motioneye)
/etc/init.d/S85motioneye stop   # if one wants to reproduce this with an unmodified motioneye instance
ffmpeg -s 640x480 -i /dev/video0 -c:v h264_omx -f rtsp rtsp://172.16.42.1:5454
```

On the Pi, […] For 15fps the CPU load was around 35%, and for 5fps, 32%. I'd […]

So the next step would be to try out the suggested RTSP servers on the board and see whether I can fit them into the motioneyeos ecosystem.
Not necessarily. streamEye was designed with no particular source in mind and reads its frames from standard input. I've successfully streamed from a V4L2 camera with ffmpeg + streamEye.

Guys, I would appreciate it if we didn't add Perl as a dependency to motionEyeOS. Also, is it only me, or does having an RTSP server written in Perl these days sound a bit strange?
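For reference, a minimal sketch of such an ffmpeg + streamEye pipeline (the device, frame rate, and quality flags are assumptions):

```
# ffmpeg grabs frames from the V4L2 device and writes an MJPEG stream to stdout;
# streamEye reads the JPEG frames from stdin and serves them over HTTP on port 8081.
ffmpeg -f v4l2 -i /dev/video0 -r 15 -f mjpeg -q:v 5 - | streameye -p 8081
```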
@ccrisan Thanks for clarifying the streamEye design. Perl sure does sound old-fashioned. :-) I'd rather find an RTSP server in Go, actually, although Python would probably suit the rest of motionEye best.
I was wondering, as we are looking for a way to broadcast a video stream that is already hardware-encoded, whether ffmpeg is the right tool for that.
But at least I think I found a better RTSP relay server candidate: […]
The h264 stream is "already hardware-accelerated" only if the Raspberry Pi CSI camera and […]

Thanks for finding the RTSP server alternatives, I am going to check them out.
If the GStreamer-based RTSP server is too heavy for the host, a lighter solution could be to use live555 as described here: https://emtunc.org/blog/02/2016/setting-rtsp-relay-live555-proxy/
I used […]
Some time ago I also played around successfully with the H264 v4l2 RTSP Server from MPromonet. It's very versatile, as it can grab H264 streams as well as JPEG streams from v4l2 USB devices, and it can even handle two video sources or audio streams simultaneously. For sources that only support uncompressed pixel formats, the author also provides some tools for encoding them to H264 (hardware-accelerated on the Raspberry) and mirroring them to a virtual v4l2 device. So perhaps it's worth considering them for inclusion in motionEye(OS)? Implementing RTSP/H264 streaming would be a real performance amplifier for MEye, so I would much appreciate it, while I'm unfortunately not a coder/programmer myself.
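For context, a minimal invocation sketch of that server, v4l2rtspserver (the option values here are illustrative assumptions):

```
# Serve /dev/video0 over RTSP on port 8554; -W/-H/-F set capture width, height, fps.
v4l2rtspserver -W 1280 -H 720 -F 25 -P 8554 /dev/video0
# Clients can then connect to rtsp://<pi-ip>:8554/unicast
```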
Hi all, I too use motionEyeOS on my Raspberry Pi together with my Synology NAS (Surveillance Station) to record videos. First of all, I have to admit that the installation of motionEyeOS itself and the integration into my Synology was flawless (thumbs up). BUT, as the OP has stated, the recorded files are really large. To give the discussion some numbers, I re-encoded my recorded files via HandBrake into both an H264 and an H265 encoded file. Furthermore, I threw everything into a spreadsheet and made some calculations. Here are the results: […]

The original stream was set up with a resolution of 1280x720 and ~12FPS. Each file was a recording of ~30min. The whole 'night' of recording is ~140GB in total. As you can see, I have also taken h265 into account; I was just curious how big the difference would be between h264 and h265. Of course it would be interesting to know whether the effort to implement h265 is the same as for h264, or whether an h265 implementation could benefit from the h264 one, but that's just curiosity.

Coming back to the OP's topic: when using motionEyeOS with a Synology, it seems that the only way to save a lot of space is to export the recorded videos to a public share (from Surveillance Station), recode them, and import them back.
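As an illustration of such a recode step, a sketch with ffmpeg (file names and quality settings are assumptions; HandBrake's CLI would work just as well):

```
# Re-encode one recorded file to H264; repeat for each file exported to the share.
ffmpeg -i recording.avi -c:v libx264 -crf 23 -preset medium recording-h264.mp4
```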
@mikedolx so what would you propose?
Do we need rtsp output for this? For modern HTML5 browsers, wouldn't webm (vp8), mp4 (h264) or ogg (theora) files be the best? I have been fighting with ffmpeg for a while, live-converting an rtsp input stream to a file on disk and accessing that file via the browser, embedded in an HTML5 video tag. However:

- ogg/theora: choppy playback in some browsers for me, generally weird playback behaviour, acceptable transcoding CPU load
- webm/vp9: worked perfectly in the browser, but with no encoding hardware acceleration available, it's just not feasible (massive CPU load)
- mp4/h264: almost zero CPU load, since you can just use c:v copy for standard rtsp camera streams. HOWEVER, apparently mp4 is a tricky container format to get live-streaming right; I could never get it to behave like you would expect a live stream of your camera to behave. It probably involves building/shuffling an index around in real time, and constantly chopping "old" live-stream data off the front of the mp4 file. But all my googling and trying around so far has been unsuccessful.
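One possible way around the mp4 indexing problem (a sketch, not tested here): fragmented MP4 keeps an empty moov atom up front and appends self-contained fragments, so the file stays playable while it grows. The camera URL and output path below are placeholders.

```
# Copy an RTSP camera stream into a fragmented MP4 that a browser can read mid-file.
ffmpeg -rtsp_transport tcp -i rtsp://<camera-url> \
  -c:v copy -movflags +frag_keyframe+empty_moov+default_base_moof \
  /var/www/html/live.mp4
```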
@fawick Did you get anywhere with enabling the RTSP stream? I managed to cross compile […]

@ccrisan Any thoughts? As a bonus, with […]
I found a better way, which might be workable.
raspivid does the H264 encoding and pipes it to the live555 RTSP server, which can serve multiple clients simultaneously. To test this yourself, follow these steps:
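A sketch of one common way to wire this up (assumptions: live555's testH264VideoStreamer reads a file named test.h264, so a FIFO stands in for it; the original steps may have differed):

```
# Create a FIFO that live555's testH264VideoStreamer will read as "test.h264".
mkfifo test.h264
# raspivid writes raw H264 into the FIFO in the background.
raspivid -t 0 -w 1280 -h 720 -fps 25 -o test.h264 &
# Serve it; the stream appears at rtsp://<pi-ip>:8554/testStream.
./testH264VideoStreamer
```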
This is a much better solution: simultaneous H264/RTSP streams to multiple clients. […]

This works well as long as the CPU is not under heavy load; e.g. keep the mjpeg frame rate low to minimize CPU load, as the mjpeg stream is used for the web front end only.

@ccrisan @fawick @d-faure I would be happy to run with this setup. What do you guys think?
This is the best option by far: it supports live audio and video streaming over RTSP. It reads from a V4L2 video device and uses the OMX encoder, so technically we can use the RPi-Cam or any USB webcam. It also reads audio from an ALSA device, so it supports any USB mic or sound card. The GStreamer RTSP server is able to serve simultaneous RTSP streams to multiple clients. motion connects as an RTSP client and converts the stream to mjpeg, and motionEye displays the mjpeg on the web front end.

We could remove motion from the pipeline if we add RTSP streaming support to motionEye. WebRTC looks quite suitable, so probably transcode RTSP to WebRTC, but I'm no expert in web stuff. Here's what I did:
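A plausible sketch of such a pipeline, using gst-rtsp-server's test-launch example (the devices, caps and encoder elements are assumptions; omxh264enc comes from gst-omx, voaacenc from gst-plugins-bad):

```
# H264 video from V4L2 via the OMX encoder, plus AAC audio from ALSA, served
# over RTSP at rtsp://<pi-ip>:8554/test by gst-rtsp-server's test-launch binary.
./test-launch "( v4l2src device=/dev/video0 \
    ! video/x-raw,width=1280,height=720,framerate=20/1 \
    ! omxh264enc ! h264parse ! rtph264pay name=pay0 pt=96 \
    alsasrc device=hw:1 ! audioconvert ! voaacenc ! rtpmp4apay name=pay1 pt=97 )"
```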
This is the CPU and memory usage running on my Raspberry Pi 2, 1280 x 720 resolution, 20 fps: […]
I've evaluated several approaches, and here's a brief summary: […]

@ccrisan Which option do you prefer? Personally, I would go for the gstreamer option. Implementing this feature is looking like a big task, so we probably need to split it into several stages, and I'm going to need your help too, especially on the motionEye side of things like the user interface and configuration. I suggest we split the work into 3 stages: […]
Just wanted to say I'm looking forward to your progress on this, guys. I have been looking for a way to stream h264 from the Pi to ZoneMinder.
Yet another option. This option assumes all V4L2 video devices support h264 output; I think modern webcams and the bcm2835-v4l2 driver do. Audio can be any ALSA device. One instance of GStreamer acts as an RTSP server which can serve multiple RTSP clients. To minimize changes to motionEye, we use another instance of GStreamer as an RTSP client, converting h264 to jpeg and piping it to streamEye, so motionEye gets the jpeg stream from port 8081. This second instance of GStreamer can be removed in the future if motionEye supports RTSP streams, probably using WebRTC.

Steps to prepare the required software: […]

Steps to start the H264 live stream:
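A plausible sketch of the two instances (element names are standard GStreamer; the exact options are assumptions, and on a Pi the hardware omxh264dec could replace avdec_h264):

```
# Instance 1: RTSP server taking H264 straight from a camera that emits it
# (no re-encode), via gst-rtsp-server's test-launch binary.
./test-launch "( v4l2src device=/dev/video0 \
    ! video/x-h264,width=1280,height=720,framerate=20/1 \
    ! h264parse ! rtph264pay name=pay0 pt=96 )"

# Instance 2: RTSP client that decodes to JPEG and pipes into streamEye,
# so motionEye keeps reading mjpeg from port 8081.
gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test \
    ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert \
    ! jpegenc ! fdsink fd=1 | streameye -p 8081
```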
This is the CPU and memory usage running on a Raspberry Pi 2 with 1280 x 720 resolution at 20 fps: […]
Basic RTSP support has been implemented in the dev branch.
How do you use a motionEye camera with a Synology NAS?
I have motionEyeOS installed on my Raspberry Pi 3. Can motionEye generate an h264 stream, or only an mjpeg stream? I'm using motionEyeOS as a fast network camera and record using my Synology NAS, but the files are huge because of mjpeg.