Streaming ideas #126
In fact there will be the problem of missing moov atoms, so it's not so easy.
An MP4 is roughly a bundle of a raw data heap plus a dictionary (the moov atom) used to look up frames in that heap. The dictionary is generally written at the end of the file, so frames are not available until the MP4 is fully written. Our camera is such a case, so this method is void.
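This layout can be checked programmatically: an MP4 is a sequence of boxes, each starting with a 4-byte big-endian size and a 4-byte type tag. A minimal C sketch (ignoring 64-bit box sizes) that walks top-level box headers in a buffer and reports whether `moov` precedes `mdat`, i.e. whether the file is streaming-friendly:

```c
#include <stdint.h>
#include <string.h>
#include <stddef.h>

/* Read a 32-bit big-endian integer (MP4 box sizes are big-endian). */
static uint32_t be32(const unsigned char *p)
{
    return ((uint32_t)p[0] << 24) | ((uint32_t)p[1] << 16) |
           ((uint32_t)p[2] << 8) | (uint32_t)p[3];
}

/* Walk top-level MP4 boxes in buf; return 1 if a `moov` box appears
 * before `mdat`, 0 if `mdat` comes first (the streaming-unfriendly
 * layout described above), -1 if neither box is found or a header is
 * malformed. 64-bit (size == 1) boxes are not handled in this sketch. */
int moov_before_mdat(const unsigned char *buf, size_t len)
{
    size_t off = 0;
    while (off + 8 <= len) {
        uint32_t size = be32(buf + off);
        if (memcmp(buf + off + 4, "moov", 4) == 0) return 1;
        if (memcmp(buf + off + 4, "mdat", 4) == 0) return 0;
        if (size < 8) return -1;
        off += size;
    }
    return -1;
}
```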
OK, I got a poor man's MJPEG streamer for this hack idea:
```lua
--[[
Copyright 2018 Vladimir Dronnikov
GPL
]]

local NAL_START = "\x00\x00\x00\x01"
local SPS_PPS_640X360 = "\x00\x00\x00\x01\x67\x64\x00\x1E\xAD\x84\x01\x0C\x20\x08\x61\x00\x43\x08\x02\x18\x40\x10\xC2\x00\x84\x3B\x50\x50\x17\xFC\xB3\x70\x10\x10\x10\x20\x00\x00\x00\x01\x68\xEE\x3C\xB0"
local SPS_PPS_1280X720 = "\x00\x00\x00\x01\x67\x64\x00\x1F\xAD\x84\x01\x0C\x20\x08\x61\x00\x43\x08\x02\x18\x40\x10\xC2\x00\x84\x3B\x50\x28\x02\xDD\x37\x01\x01\x01\x02\x00\x00\x00\x01\x68\xEE\x3C\xB0"

local process = function()
  local buf = ""
  while true do
    -- buffer grew too large: flush and resync
    if #buf > 2000000 then buf = "" end
    local dat = io.read(4096)
    if not dat then break end
    buf = buf .. dat
    -- find the IDR signature
    local beg = buf:find("\x65\xb8", 1, true)
    if beg then
      if beg > 4 then
        -- io.stderr:write(("BEG @%d %02X%02X%02X%02X%02X%02X\n"):format(beg, buf:byte(beg-4), buf:byte(beg-3), buf:byte(beg-2), buf:byte(beg-1), buf:byte(beg), buf:byte(beg+1)))
        if buf:byte(beg - 4) == 0x00 and buf:byte(beg - 3) == 0x00 then
          buf = buf:sub(beg - 4)
          -- get the NAL unit size from the 32-bit big-endian length prefix
          local size = ((buf:byte(1) * 256 + buf:byte(2)) * 256 + buf:byte(3)) * 256 + buf:byte(4)
          -- io.stderr:write(("SIZE %d %d\n"):format(size, #buf))
          -- wait for the buffer to contain the whole NALU
          if #buf >= size + 4 then
            local frame = buf:sub(5, size + 4)
            -- io.stderr:write(("FRAME %d %d\n"):format(size, #frame))
            -- TODO: FIXME: daylight size of 1280x720 > 70000
            if size < 60000 then
              -- convert the frame to JPEG and dump it to stdout
              local fp = io.popen("ffmpeg -y -loglevel fatal -f h264 -i - -f image2pipe -", "w")
              fp:write(SPS_PPS_640X360)
              fp:write(NAL_START)
              fp:write(frame)
              fp:flush()
              fp:close()
            end
            -- drop the consumed length prefix plus NALU
            buf = buf:sub(size + 5)
          end
        else
          buf = buf:sub(beg + 2)
        end
      end
    end
  end
end

process()
```
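For reference, the key trick in the script above is replacing the 4-byte length prefix of an AVCC-style NALU with an Annex-B start code so ffmpeg can parse the raw stream. The same step as a standalone C sketch (function name and signature are illustrative, not from any camera source):

```c
#include <stdint.h>
#include <string.h>
#include <stddef.h>

/* Convert one length-prefixed (AVCC-style) NAL unit at `in` into
 * Annex-B form in `out` (4-byte start code + payload), as the Lua
 * script does by prepending NAL_START. Returns the number of bytes
 * written to `out`, or 0 if the input does not hold a complete NALU
 * or `out` is too small (it must hold the NALU size + 4 bytes). */
size_t avcc_to_annexb(const unsigned char *in, size_t in_len,
                      unsigned char *out, size_t out_cap)
{
    static const unsigned char start_code[4] = { 0x00, 0x00, 0x00, 0x01 };
    if (in_len < 4) return 0;
    uint32_t size = ((uint32_t)in[0] << 24) | ((uint32_t)in[1] << 16) |
                    ((uint32_t)in[2] << 8) | (uint32_t)in[3];
    if (in_len < 4 + (size_t)size || out_cap < 4 + (size_t)size) return 0;
    memcpy(out, start_code, 4);
    memcpy(out + 4, in + 4, size);
    return 4 + (size_t)size;
}
```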
Another approach is to stream the content of … For tests I made the …

To stream 320x192 YUV420P buffer …

To stream h264 … NB: sync broken

Experiment: …

The main task is to determine … @shadow-1 ^^^
@dvv just saw your excellent experiment! Check out the previous update I posted in another thread.
@andy2301 IIRC, initially I strace-d mp4record and found the boundaries by analysing the addresses in mmap calls. To tell the truth, I got tired of that. My task was to detect motion (so no need for h264 at all); first I moved to using the 320x192 YUV420P buffer, and then I decided to use an ad-hoc PIR sensor. I recall my latest thought on that was:
Analyse the numbers under "-----STREAM BUFFER-------------------------------------------------------------". They should point to base + head/tail.
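If those numbers really are the base plus the head/tail of a circular buffer (an assumption worth verifying), then the amount of fresh data between two polls is plain modular arithmetic. A sketch:

```c
#include <stddef.h>

/* Given a circular stream buffer of `size` bytes, compute how many
 * bytes the writer has produced between our read position (tail) and
 * the current write position (head). Offsets are relative to the
 * buffer base. Only the arithmetic is shown here; the actual head/
 * tail layout under /proc/umap is an assumption to be verified. */
size_t ring_available(size_t head, size_t tail, size_t size)
{
    return (head >= tail) ? head - tail : size - tail + head;
}
```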
@dvv Got it, thanks. I tried out your streaming ideas "To stream 320x192 YUV420P buffer" and "To stream h264", but I only got pictures full of mosaics. I suspect that I need to specify different parameters for the "grab" tool, as my stream buffer's base address seems to be different, as shown below with …
I have two questions:
Thanks again! Update: … But I still have the two questions. Especially for question 1: how was the magic number 19268 determined? I understand that 0x00004B44 is 19268 (the starting offset of the 720p region), and [0, 19268] might be the block with control info. But how did you get to know this 0x00004B44 number? Update 2: …
I managed to compile strace and ran it on … It looks like …
On YUV: to know the regions for YUV, do $ cat /proc/umap/vb and you'll see something like:
You'll then want to
NB: for motion-detection purposes alone, one may drop the V component of YUV, so the command above may transform to:
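The motion-detection shortcut above can be sketched as a frame differencer that looks only at the luma plane; the threshold and frame geometry here are illustrative, not taken from the camera:

```c
#include <stddef.h>
#include <stdlib.h>

/* Count pixels in the Y (luma) plane whose absolute difference from
 * the previous frame exceeds `thresh`. For YUV420P the Y plane is the
 * first width*height bytes of each frame, so the chroma planes can be
 * ignored entirely for motion detection, as suggested above. */
size_t y_plane_changes(const unsigned char *prev, const unsigned char *cur,
                       size_t npix, int thresh)
{
    size_t changed = 0;
    for (size_t i = 0; i < npix; i++) {
        int d = (int)cur[i] - (int)prev[i];
        if (abs(d) > thresh) changed++;
    }
    return changed;
}
```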
grab.c:

```c
// Copyright 2018 Vladimir Dronnikov <dronnikov@gmail.com>
// GPL
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <errno.h>
#include <fcntl.h>
#include <sys/mman.h>

#define HD_STREAM_START 0x8178c800
#define HD_STREAM_SIZE  1382400 // 1280 * 720 * 3 / 2 (YUV420P)
#define SD_STREAM_START 0x81a31000
#define SD_STREAM_SIZE  345600  // 640 * 360 * 3 / 2 (YUV420P)
#define JP_STREAM_START 0x81FC3000
#define JP_STREAM_SIZE  (0x81FD5000 - 0x81FC3000) // 0x12000 = 73728

#define STREAM_START JP_STREAM_START
#define STREAM_SIZE  JP_STREAM_SIZE

// write all of buf, retrying on EINTR and short writes
static ssize_t full_write(int fd, const void *buf, size_t len)
{
	ssize_t cc;
	ssize_t total = 0;
	while (len) {
		for (;;) {
			cc = write(fd, buf, len);
			if (cc >= 0 || EINTR != errno) {
				break;
			}
			errno = 0;
		}
		if (cc < 0) {
			if (total) {
				return total;
			}
			return cc;
		}
		total += cc;
		buf = ((const char *)buf) + cc;
		len -= cc;
	}
	return total;
}

int main(int argc, char *argv[])
{
	if (argc < 6) {
		fprintf(stderr, "grab FILE START SKIP +SIZE|END DELAY-MICROSECONDS\n");
		exit(1);
	}
	off_t start = strtoul(argv[2], NULL, 0);
	off_t skip = strtoul(argv[3], NULL, 0);
	size_t size;
	if (argv[4][0] == '+') {
		size = strtoul(argv[4], NULL, 0);
	} else {
		size = strtoul(argv[4], NULL, 0) - start + 1;
	}
	unsigned long delay = strtoul(argv[5], NULL, 0);
	fprintf(stderr, "mmap %s from 0x%08lx+0x%08lx, %lu bytes, delay %lu microseconds\n",
		argv[1], (unsigned long)start, (unsigned long)skip, (unsigned long)size, delay);
	int fd = open(argv[1], O_RDONLY);
	if (fd < 0) {
		perror("open");
		exit(2);
	}
	void *map = mmap(NULL, size, PROT_READ, MAP_PRIVATE, fd, start);
	if (MAP_FAILED == map) {
		perror("mmap");
		exit(4);
	}
	while (1) {
		if (-1 == full_write(1, (const char *)map + skip, size - skip)) {
			perror("write");
			break;
		}
		write(2, ".", 1);
		if (delay == 0) break;
		usleep(delay);
	}
	munmap(map, size);
	close(fd);
	return 0;
}
```
If you send me (say, to dronnikov@gmail.com) a gzipped snapshot of …
@dvv thank you. I sent a dump of …
My purpose is to be able to reconstruct the 1920x1080 video (on the Yi Home Camera 1080p) from … More than that, I'd hope to get a better understanding of the process of writing to … So, to sum up, the following info would be super helpful.
@andy2301 It looks like your stream is in … Try:

Success?
@andy2301 On your last (2): In my empirical opinion … NB: One of my attempts to dump the stream was to …
@dvv: Really a good idea, thank you for your work. Based on your idea: dirty, but some parts of the H264 stream can be received through RTSP.
\o/
Hi everyone, after some work I managed to use the SDK to compile @dvv's grab and test it on my Yi Dome 720p camera. I managed to get a 720p YUV stream at about 0.5 FPS (which is obviously too slow), so I tried to make the H264 stream work. After reading all of the issues regarding RTSP on the project (and trying to make it work without too much success: laggy or corrupted video), I noticed that one question hasn't been answered yet: "How can we know where the head/tail of the circular buffer is?"

I might be wrong, but the answer could be in the /proc/umap/h264e file: in a tiny moment of rage I executed cat on that file multiple times and something was changing. The changing values can be checked with this command: …

I tried @Necromix's vencrtsp_v2 too, without much success (laggy and corrupted video). Another thing I don't fully understand is how to calculate the … Here's my …
Edit: after some digging into /tmp/view it appears that the values
Btw, I'm now trying to create a stable RTSP stream using @andy2301's rtsp2303. I had to edit some values (offsets, adjustIdx, chunk size and FPS, IIRC) to make the stream work. It's not really stable (the program randomly hangs), but it's a step forward! Edit 2: I just locked myself out of the camera by executing rtsp2303 in init.sh without adding the "&" character at the end to make it run in the background. Damn. Edit 3: Added the audio to the …
I'm now working on a daemon which will populate a file in /tmp with the head/tail values, ready to be read by the RTSP server to sync the stream. I spent an afternoon optimizing it: the CPU usage with a check rate of 200 ms is about 0.5 to 1.5%, and the memory footprint is around 100 KB, though it can be reduced at the cost of accuracy. The first version needed at least 20 to 30% of CPU time and 1.5 MB of RAM. The quick jumps of the offsets you can see in the GIF could be the keyframes; they take more space than the smaller packets.
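The parsing step such a daemon needs can be sketched as follows; the `head=`/`tail=` field names are placeholders, since the actual format of /proc/umap/h264e is not documented here and must be checked on the camera:

```c
#include <stdio.h>

/* Pull two hex offsets out of one line of a /proc-style status file.
 * The "head="/"tail=" field names are hypothetical stand-ins for
 * whatever the real /proc/umap/h264e output looks like. Returns 1 on
 * success, 0 if the line does not match. */
int parse_head_tail(const char *line, unsigned long *head, unsigned long *tail)
{
    return sscanf(line, "head=%lx tail=%lx", head, tail) == 2;
}
```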
Hello everyone! Here's my analysis of /tmp/view on my Yi 1080P Dome. I'm new to the device (Christmas gift), so maybe it's not news for you guys but I wanted to share in case it helps at all. ;)
The Data #2/3/4 addresses slide around. I was able to use vencrtsp_v2 to stream both the SD and HD streams to VLC 3.0.4. I adjusted START_VIEW and SIZE_VIEW as appropriate, although my C is pretty rusty, so I only use half of the HD data (and it still plays!), since setting SIZE_VIEW larger than about 512K gives a segmentation fault for some reason. The Messages window in VLC is full of late-frame warnings and the video is delayed and jumpy, but the stream ran for 30+ minutes on each before I ended it. I believe VLC is very forgiving. I also tried streaming to MotionEye, but it fails to even start the stream. Edit: after reading more about H264, Data 2/3/4 are not NAL units.
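For background on that last point: an H264 NAL unit's type is carried in the low 5 bits of the first byte after the start code, which is also why the Lua hack earlier in this thread keys on the byte 0x65 (type 5, an IDR slice). A one-line helper makes the decoding explicit:

```c
/* The NAL unit type is the low 5 bits of the first byte after the
 * start code (forbidden_zero_bit and nal_ref_idc occupy the top 3
 * bits). Type 5 = IDR slice, 7 = SPS, 8 = PPS; hence 0x65, 0x67 and
 * 0x68 in the stream dumps discussed in this thread. */
int nal_unit_type(unsigned char first_byte)
{
    return first_byte & 0x1F;
}
```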
Hi @drlarsen77,
Hey guys, I have a Yi 1080p Dome and tried to look into running vencrtsp_v2, but honestly am not sure what I should be changing in that script in order to match viewd's output. Just the START_VIEW and SIZE_VIEW variables? vlc is able to connect to the running stream as-is (fresh git clone), but there's no video or audio at all. Also, is there a separate repository with the extra files needed for running make for vencrtsp_v2? I can't seem to find |
@bharris6 You need the Hi3518E SDK in order to cross-compile a binary that will work on your camera. Personally, I just created a vencrtsp_v2 folder under Hi3518E_SDK_V1.0.4.0/mpp/sample/ so that all the references lined up. If you set START_VIEW and SIZE_VIEW (be sure to convert from hex to dec) you'll be able to get a (glitchy) stream working in VLC. I find the SD stream works better, but they both work, since VLC is very forgiving. The HD stream's SIZE_VIEW is larger than the maximum allowed by some of the program's functions (you'll get a segmentation fault), but you can reduce it to something smaller. Anyway, that only gets you part of the way there. The vencrtsp_v2 program doesn't account for the cyclical video buffer, so all of that would need to be written: basically, for every read it needs to check the current offset and move accordingly. @TheCrypt0 is working on yi-hack-v4 and hasn't released it yet (I don't have it yet either). I believe he plans to have something soon. Once that is released, it will incorporate his idea into a functioning RTSP server.
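The wrap-around read that vencrtsp_v2 is missing amounts to splitting each copy in two when it crosses the end of the mmap'd region. A minimal sketch (the buffer layout is assumed, not taken from the program):

```c
#include <string.h>
#include <stddef.h>

/* Copy `n` bytes out of a circular buffer of `size` bytes starting at
 * offset `off`, splitting the copy in two when it wraps past the end.
 * This is the kind of read a ring-aware vencrtsp_v2 would need in
 * place of a plain linear memcpy over the mmap'd region.
 * Precondition: off < size and n <= size. */
void ring_read(const unsigned char *base, size_t size,
               size_t off, unsigned char *dst, size_t n)
{
    size_t first = (off + n <= size) ? n : size - off;
    memcpy(dst, base + off, first);
    if (n > first)
        memcpy(dst + first, base, n - first);
}
```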
@drlarsen77 OK, thanks! I'll keep my eyes peeled for V4, and in the meantime I will look into that SDK. I knew I was missing something!
I'm making a lot of progress with the RTSP server. Latest update:
Updates in the channel #rtsp-server of the Discord Server. Invite link. |
On camera:
On client:
This hack would effectively stream, but I do not know how to tell vlc the demuxer and codec parameters. Any vlc guru to help me?