Video streaming latency and some other streaming problems #149

Open
maxsmirnov92 opened this issue May 14, 2015 · 7 comments

@maxsmirnov92

I'm using the JavaCV 0.9 library with the FFmpegFrameRecorder class and the "ffm" format to stream video from an Android device to ffserver via a given feed file. I have a few problems, and it would be great if you could help me with them. And btw, sorry for my English :)

In all cases I'm currently using the "avi" format for tests in the ffserver conf, and I monitor the current settings and feed state via stat.html (since some settings are overridden by the recorder's start(), I can see that on this page).

  1. Playback latency. When I connect to the registered stream from VLC, it takes a long time (~10-30 seconds) to get an actual image. Sound is currently not recorded. I'm using the MPEG4 codec (probably the only working codec at the moment), a 480x320 frame size, and a 400 kbit/s target bitrate. The configured fps may be based on the average fps counted by the onPreviewFrame() callback ('auto' mode) or fixed at 1..30; in either case, before each record() call I verify that the current frame matches the target interval:
            private boolean allowRecord(double targetFps) {
                if (System.nanoTime() - recorderInterval >= (1000000000d / targetFps)) {
                    recorderInterval = System.nanoTime();
                    return true;
                }
                return false;
            }

so there can't be any 'redundant' frames to encode. The strange thing is that the lower the source fps (and consequently the bitrate), the longer it takes to get a current frame in VLC (at 3 fps, almost a minute of waiting!); during this time I get a black screen or a frozen first frame. And this delay persists for the entire streaming session.
For example, when the hardware fps is increased to ~21 by camera.setRecordingHint(true) (between the stopPreview() and startPreview() calls), the wait in VLC drops to about 20 seconds.
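
For reference, the throttling logic can be exercised in isolation; here is a minimal standalone sketch (the class name FrameThrottle is made up for illustration, and the first-frame initialization is adjusted so the sketch is deterministic):

```java
// Standalone sketch of the fps-throttling logic from allowRecord() above.
// FrameThrottle is a made-up class name; only java.lang is used.
public class FrameThrottle {

    // Start far in the past so the very first frame is always accepted.
    private long lastFrameNanos = Long.MIN_VALUE / 2;

    /** Returns true if at least 1/targetFps seconds passed since the last accepted frame. */
    public boolean allowRecord(double targetFps) {
        long now = System.nanoTime();
        if (now - lastFrameNanos >= (long) (1_000_000_000d / targetFps)) {
            lastFrameNanos = now;
            return true;
        }
        return false;
    }

    public static void main(String[] args) throws InterruptedException {
        FrameThrottle throttle = new FrameThrottle();
        // At 10 fps the minimum interval between accepted frames is 100 ms.
        boolean first = throttle.allowRecord(10);  // accepted
        boolean second = throttle.allowRecord(10); // immediate retry: dropped
        Thread.sleep(120);                         // wait longer than one interval
        boolean third = throttle.allowRecord(10);  // accepted again
        System.out.println(first + " " + second + " " + third);
    }
}
```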

I tried starting ffserver with different parameters in the .conf file (such as "FileMaxSize" (5M) in the feed section, or "VideoBufferSize" (5000), "VideoIntraOnly", and "VideoGopSize" (12)), but the result is almost the same. Maybe I'm missing something, I don't know.
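
For reference, the feed/stream sections I'm testing with look roughly like this (paths, names, and values here are illustrative, not my exact config):

```
# Illustrative ffserver.conf fragment (values are examples only)
<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 5M
</Feed>

<Stream test2.avi>
Feed feed1.ffm
Format avi
VideoCodec mpeg4
VideoSize 480x320
VideoFrameRate 20
VideoBitRate 400
VideoBufferSize 5000
VideoGopSize 12
NoAudio
</Stream>
```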

So, what are the possible reasons for the playback latency, and how can I fix it?

  2. After I disconnect VLC from the stream, a second attempt to connect fails, and right after that I get the following errors in the ffmpeg log on the server:

Wed May 13 15:57:44 2015 [avi @ 0xb35254c]Too large number of skipped frames 526349 > 60000
Wed May 13 15:57:44 2015 Error writing frame to output for stream 'test2.avi': Invalid argument
...

After restarting the stream via FFmpegFrameRecorder, these errors disappear and VLC successfully connects to the stream (with the same unacceptable latency). So on every client connection to the stream, I have to send a command to the device to restart streaming.
Unfortunately, this can also be reproduced while VLC is already connected to the stream: the image freezes.

What does the "Too large number of skipped frames" error mean, and is it FFmpegFrameRecorder's fault?

  3. I tried using different codecs as a possible fix for these two problems.

H263. The same "too large number of skipped frames" problem reproduces, with comparable latency (12 seconds vs. 20 seconds at 20 fps and 480x320; the lower the settings, the bigger the latency). A few times I got incorrect playback speed followed by freezes, but now it seems to be OK. The very strange thing is that when I set 480x320 on the recorder, ffserver shows 704x576; with 320x240 it shows 352x288, so in VLC I see a scaled image with distorted proportions.
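
Could this be because H.263 only supports a fixed set of picture sizes (SQCIF, QCIF, CIF, 4CIF, 16CIF)? If the encoder rounds the requested frame up to the nearest standard size, that would explain both numbers. This is only a guess on my part; a sketch of that mapping:

```java
// Sketch of the guessed behavior: map a requested frame size to the
// smallest standard H.263 picture format that fits it.
public class H263Sizes {

    // Standard H.263 picture formats (width x height), smallest first.
    private static final int[][] SIZES = {
        {128, 96}, {176, 144}, {352, 288}, {704, 576}, {1408, 1152}
    };

    /** Returns the smallest standard H.263 size that fits the requested frame. */
    public static int[] roundUp(int w, int h) {
        for (int[] s : SIZES) {
            if (s[0] >= w && s[1] >= h) return s;
        }
        return SIZES[SIZES.length - 1];
    }

    public static void main(String[] args) {
        int[] a = roundUp(480, 320); // observed: 480x320 becomes 704x576
        int[] b = roundUp(320, 240); // observed: 320x240 becomes 352x288
        System.out.println(a[0] + "x" + a[1] + " " + b[0] + "x" + b[1]);
    }
}
```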

H264 (with option "preset" = "ultrafast"). 176x144 frame, 20 fps. Once I successfully connected to the stream from VLC within a few seconds and got the image without any delay. But all the following times I wasn't so lucky: a 1-minute delay with 176x144 and 54 seconds with 480x320. An interesting detail: after start() with this codec, the ffserver "q" (quality) parameter automatically changed to -1-1 (with H263 and MPEG4 it was 2-31).

MJPEG. My colleague thinks that using this codec will solve the problem. But when I set it (avcodec.AV_CODEC_ID_MJPEG), the recorder throws an exception ("could not open codec"). Is there any way to recompile your libraries with MJPEG support, and does it make sense to do so?

  4. Preview frames periodically stop arriving (no onPreviewFrame() callbacks) between stopStreaming() and startStreaming() when the setRecordingHint() option is used. It's an Android bug, just saying :)

  5. I did not succeed in recording audio together with video frames: sooner or later it throws an exception with "av_interleaved_write_frame() error" (I don't remember the error code). Caused by incorrect timestamps, I think. I tried inserting checks against the "last used" timestamp, but it doesn't help.
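
The check I tried is essentially this (a simplified standalone sketch; TimestampGuard is a made-up name, the real fields appear in the code below):

```java
// Simplified standalone sketch of the "last used timestamp" guard:
// drop a sample when the recorder's timestamp has not advanced since
// the previous write. TimestampGuard is a made-up name for illustration.
public class TimestampGuard {

    private long lastUsedTimestamp = -1;

    /** Returns true if the sample at recorderTimestamp should be written. */
    public boolean shouldWrite(long recorderTimestamp) {
        if (recorderTimestamp == lastUsedTimestamp) {
            return false; // same timestamp as the previous sample: skip it
        }
        lastUsedTimestamp = recorderTimestamp;
        return true;
    }

    public static void main(String[] args) {
        TimestampGuard guard = new TimestampGuard();
        System.out.println(guard.shouldWrite(0));    // first sample: written
        System.out.println(guard.shouldWrite(0));    // duplicate: skipped
        System.out.println(guard.shouldWrite(1000)); // advanced: written
    }
}
```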

  6. "Fatal signal 11" is thrown frequently. But it's not a big problem, because my app handles it.

And here are some pieces of code related to the subject.

private void setPreviewCallback() {
    if (camera != null && isCameraLocked()) {
        synchronized (camera) {
            camera.setPreviewCallback(null);
            if (callbackBufferQueueSize > 0) {
                for (int i = 0; i < callbackBufferQueueSize; i++)
                    camera.addCallbackBuffer(allocatePreviewCallbackBuffer());
                logger.debug("setting preview callback with buffer...");
                camera.setPreviewCallbackWithBuffer(previewCallback);
            } else {
                expectedCallbackBufSize = 0;
                logger.debug("setting preview callback...");
                camera.setPreviewCallback(previewCallback);
            }
        }
    }
}

private boolean isPreviewStated = false;

private boolean startPreview() {
    if (!isPreviewStated) {
        if (camera != null && isCameraLocked()) {
            synchronized (camera) {
                logger.debug("starting preview...");
                try {
                    camera.startPreview();
                    isPreviewStated = true;
                    previewCallback.notifyPreviewStarted();
                    return true;
                } catch (RuntimeException e) {
                    logger.error("a RuntimeException occurred during startPreview()", e);
                }
            }
        }
    } else {
        return true;
    }
    return false;
}

private boolean stopPreview() {
    if (isPreviewStated) {
        if (camera != null && isCameraLocked()) {
            synchronized (camera) {
                logger.debug("stopping preview...");
                try {
                    camera.stopPreview();
                    isPreviewStated = false;
                    // previewCallback.resetFpsCounter();
                    return true;
                } catch (RuntimeException e) {
                    logger.error("a RuntimeException occurred during stopPreview()", e);
                }
            }
        }
    } else {
        return true;
    }
    return false;
}

private volatile IplImage yuvIplImage;

private volatile FFmpegFrameRecorder ffmpegRecorder;
private volatile boolean isFfmpegRecorderRecording = false;

private final static int FFMPEG_RECORDER_FPS_DEVIATION_LOW = 2;
private final static int FFMPEG_RECORDER_FPS_DEVIATION_HIGH = 4;
private final static int FFMPEG_RECORDER_FPS_CHECK_TIMEOUT = 15000;
private volatile boolean ffmpegAutoFps = false;

private synchronized boolean prepareFFmpegRecorder(StreamingSettings streamingSettings) {

    // check args

    if (streamingSettings == null) {
        logger.error("streamingSettings is null");
        return false;
    }

    if (streamingSettings.getFormat() == null || streamingSettings.getFormat().isEmpty()) {
        logger.error("format is null or empty");
        return false;
    }

    if (streamingSettings.isSuppressVideo()
            && (streamingSettings.isSuppressAudio() || streamingSettings.getAudioChannels() == StreamingSettings.AUDIO_CHANNELS_NONE)) {
        logger.error("video and audio are suppressed");
        return false;
    }

    if (!streamingSettings.isSuppressVideo()
            && (streamingSettings.getVideoCodec() == VIDEO_CODEC.NONE || streamingSettings.getVideoCodec() == null)) {
        logger.error("videoCodec is null or not specified");
        return false;
    }

    if (!streamingSettings.isSuppressAudio()
            && (streamingSettings.getAudioCodec() == AUDIO_CODEC.NONE || streamingSettings.getAudioCodec() == null)) {
        logger.error("audioCodec is null or not specified");
        return false;
    }

    if (streamingSettings.getVideoFrameWidth() <= 0 || streamingSettings.getVideoFrameHeight() <= 0) {
        logger.error("incorrect video frame size: " + streamingSettings.getVideoFrameWidth() + "x"
                + streamingSettings.getVideoFrameHeight());
        return false;
    }

    final int videoFrameRate;

    if (streamingSettings.getVideoFrameRate() > 0 && streamingSettings.getVideoFrameRate() <= StreamingSettings.VIDEO_FRAME_RATE_MAX) {
        videoFrameRate = streamingSettings.getVideoFrameRate();
    } else if (streamingSettings.getVideoFrameRate() == StreamingSettings.VIDEO_FRAME_RATE_AUTO) {

        if (previewCallback.getPreviousFps() == 0) {
            // wait for count
            try {
                Thread.sleep(2000);
            } catch (InterruptedException e) {
                logger.error("an InterruptedException occurred during sleep()", e);
                Thread.currentThread().interrupt();
            }
        }

        if (previewCallback.getPreviousFps() > 0) {
            videoFrameRate = previewCallback.getPreviousFps();
            ffmpegAutoFps = true;
        } else
            videoFrameRate = StreamingSettings.VIDEO_FRAME_RATE_MAX;
    } else {
        logger.error("incorrect video frame rate: " + streamingSettings.getVideoFrameRate());
        return false;
    }

    if (!NetworkHelper.checkAddr(streamingSettings.getAddress())) {
        logger.error("incorrect address: " + streamingSettings.getAddress());
        return false;
    }

    InetAddress inetAddress = NetworkHelper.getInetAddressByDomain(streamingSettings.getAddress());
    if (inetAddress == null) {
        inetAddress = NetworkHelper.getInetAddressByIp(streamingSettings.getAddress());
        if (inetAddress == null) {
            logger.error("can't get inet address by name: " + streamingSettings.getAddress());
            return false;
        }
    }

    // if (NetworkHelper.isReachable(NetworkHelper.getInetAddressByIp(streamingSettings.getAddress()),
    // HostPingTask.DEFAULT_PING_COUNT,
    // NetworkHelper.safeLongToInt(HostPingTask.DEFAULT_TIMEOUT / 1000)) == NetworkHelper.PING_TIME_NONE) {
    // logger.error("host " + streamingSettings.getAddress() + " is unreachable");
    // return false;
    // }

    // check state

    if (camera == null) {
        logger.error("camera is null");
        return false;
    }

    if (!isSurfaceCreated()) {
        logger.error("surface is not created");
        return false;
    }

    if (isFfmpegRecorderRecording) {
        logger.error("FFmpegRecorder is already recording");
        return false;
    }

    // set camera params

    if ((this.previousPreviewFormat = getCameraPreviewFormat()) == null) {
        logger.error("current preview format is null");
        return false;
    }

    if (!setCameraPreviewFormat(IMAGE_FORMAT.NV21)) {
        logger.error("can't set preview format: " + IMAGE_FORMAT.NV21);
        return false;
    }

    if ((this.previousPreviewSize = getCameraPreviewSize()) == null) {
        logger.error("current preview size is null");
        return false;
    }

    if (!isPreviewSizeSupported(streamingSettings.getVideoFrameWidth(), streamingSettings.getVideoFrameHeight())) {
        logger.error("preview size " + streamingSettings.getVideoFrameWidth() + "x" + streamingSettings.getVideoFrameHeight()
                + " is not supported, source preview frames will be resized");

        Size highPreviewSize = findHighSize(getSupportedPreviewSizes());

        if (highPreviewSize == null) {
            logger.error("high preview size is null");
            return false;
        }

        if (!setCameraPreviewSize(highPreviewSize)) {
            logger.error("can't set high preview size: " + highPreviewSize.width + "x" + highPreviewSize.height);
            return false;
        }

    } else {

        if (!setCameraPreviewSize(streamingSettings.getVideoFrameWidth(), streamingSettings.getVideoFrameHeight())) {
            logger.error("can't set preview size: " + streamingSettings.getVideoFrameWidth() + "x"
                    + streamingSettings.getVideoFrameHeight());
            return false;
        }

    }

    if (callbackBufferQueueSize > 0) {
        stopPreview();

        setRecordingHint(true);

        startPreview();
        setPreviewCallback();
    }

    // prepare recorder and buffer

    ffmpegRecorder = new FFmpegFrameRecorder("http://" + inetAddress.getHostAddress() + ":" + streamingSettings.getPort() + "/"
            + streamingSettings.getFeedName(), streamingSettings.getVideoFrameWidth(), streamingSettings.getVideoFrameHeight(),
            streamingSettings.getAudioChannels() != StreamingSettings.AUDIO_CHANNELS_NONE ? streamingSettings.getAudioChannels() : 0);

    ffmpegRecorder.setFormat(streamingSettings.getFormat());
    ffmpegRecorder.setInterleaved(true);

    if (!streamingSettings.isSuppressVideo()) {

        logger.debug("setting video parameters...");

        ffmpegRecorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
        if (streamingSettings.getVideoGopSize() > 0)
            ffmpegRecorder.setGopSize(streamingSettings.getVideoGopSize());
        ffmpegRecorder.setVideoCodec(streamingSettings.getVideoCodec().getValue());
        if (streamingSettings.getVideoCodec() == VIDEO_CODEC.H264) {
            ffmpegRecorder.setVideoOption("preset", "ultrafast");
            ffmpegRecorder.setVideoOption("tune", "zerolatency");
        }
        if (streamingSettings.getVideoBitrate() > 0)
            ffmpegRecorder.setVideoBitrate(streamingSettings.getVideoBitrate());
        ffmpegRecorder.setFrameRate(videoFrameRate);

        ffmpegRecorder.setVideoOption("nobuffer", "ultrafast"); // NB: probably a mistake; "nobuffer" is an fflags value, not an option name

    } else {
        ffmpegRecorder.setImageWidth(-1);
        ffmpegRecorder.setImageHeight(-1);
        ffmpegRecorder.setPixelFormat(-1);
        ffmpegRecorder.setGopSize(-1);
        ffmpegRecorder.setVideoCodec(avcodec.AV_CODEC_ID_NONE);
        ffmpegRecorder.setVideoBitrate(-1);
        ffmpegRecorder.setFrameRate(-1);
    }

    if (!streamingSettings.isSuppressAudio() && streamingSettings.getAudioChannels() != StreamingSettings.AUDIO_CHANNELS_NONE) {

        logger.debug("setting audio parameters...");

        ffmpegRecorder.setAudioCodec(streamingSettings.getAudioCodec().getValue());
        if (streamingSettings.getAudioBitrate() > 0)
            ffmpegRecorder.setAudioBitrate(streamingSettings.getAudioBitrate());
        if (streamingSettings.getAudioSampleRate() > 0)
            ffmpegRecorder.setSampleRate(streamingSettings.getAudioSampleRate());
    } else {
        ffmpegRecorder.setAudioChannels(StreamingSettings.AUDIO_CHANNELS_NONE);
        ffmpegRecorder.setAudioCodec(avcodec.AV_CODEC_ID_NONE);
        ffmpegRecorder.setAudioBitrate(-1);
        ffmpegRecorder.setSampleRate(-1);
    }

    logger.debug("FFmpegFrameRecorder parameters : [size: " + ffmpegRecorder.getImageWidth() + "x" + ffmpegRecorder.getImageHeight()
            + "], [format: " + ffmpegRecorder.getFormat() + "], [pixel format: " + ffmpegRecorder.getPixelFormat()
            + "], [interleaved: " + ffmpegRecorder.isInterleaved() + "], [video gop size: " + ffmpegRecorder.getGopSize()
            + "], [video codec: " + ffmpegRecorder.getVideoCodec() + "], [video bitrate: " + ffmpegRecorder.getVideoBitrate()
            + "], [video frame rate: " + ffmpegRecorder.getFrameRate() + "], [audio channels: " + ffmpegRecorder.getAudioChannels()
            + "], [audio codec: " + ffmpegRecorder.getAudioCodec() + "], [audio bitrate: " + ffmpegRecorder.getAudioBitrate()
            + "], [audio sample rate: " + ffmpegRecorder.getSampleRate() + "]");

    yuvIplImage = IplImage.create(streamingSettings.getVideoFrameWidth(), streamingSettings.getVideoFrameHeight(),
            opencv_core.IPL_DEPTH_8U, 2);

    return true;
}

private synchronized void releaseFfmpegRecorder() {
    logger.debug("releaseFfmpegRecorder()");

    if (ffmpegRecorder != null) {

        try {
            ffmpegRecorder.release();
        } catch (FrameRecorder.Exception e) {
            logger.error("a FrameRecorder.Exception occurred during release()", e);
        }

        ffmpegRecorder = null;
    }

    if (yuvIplImage != null) {
        yuvIplImage.release();
        yuvIplImage = null;
    }

    if (previousPreviewFormat != null) {
        if (!setCameraPreviewFormat(previousPreviewFormat)) {
            logger.error("can't set preview format: " + previousPreviewFormat);
        }
        previousPreviewFormat = null;
    }

    if (previousPreviewSize != null) {
        if (!setCameraPreviewSize(previousPreviewSize)) {
            logger.error("can't set preview size: " + previousPreviewSize);
        }
        previousPreviewSize = null;
    }

    if (callbackBufferQueueSize > 0) {
        stopPreview();

        setRecordingHint(false);

        startPreview();
        setPreviewCallback();
    }
}

/** used to restore preview format after stop streaming */
private IMAGE_FORMAT previousPreviewFormat;

/** used to restore preview size after stop streaming */
private Size previousPreviewSize;

private volatile long startStreamingTime;
private volatile long ffmpegLastUsedTimestamp = 0;

public synchronized boolean startStreaming(StreamingSettings streamingSettings) {
    logger.debug("startStreaming(), streamingSettings=" + streamingSettings);

    if (currentCameraState != CAMERA_STATE.IDLE) {
        logger.error("current camera state is not IDLE! state is " + currentCameraState);
        return currentCameraState == CAMERA_STATE.STREAMING;
    }

    if (prepareFFmpegRecorder(streamingSettings)) {

        try {
            executor.submit(new Callable<Boolean>() {

                @Override
                public Boolean call() throws Exception {
                    // try {
                    ffmpegRecorder.start();
                    // } catch (FrameRecorder.Exception e) {
                    // logger.error("a FrameRecorder.Exception occured during start()", e);
                    // }
                    return null;
                }
            }).get(EXECUTOR_CALL_TIMEOUT, TimeUnit.SECONDS);

        } catch (Exception e) {
            logger.error("an Exception occurred during get()", e);
            releaseFfmpegRecorder();
            return false;
        }

        if (!streamingSettings.isSuppressAudio() && streamingSettings.getAudioChannels() != StreamingSettings.AUDIO_CHANNELS_NONE)
            startAudioRecordThread(streamingSettings);

        startStreamingTime = System.currentTimeMillis();
        isFfmpegRecorderRecording = true;

        setCurrentCameraState(CAMERA_STATE.STREAMING);
        logger.debug("streaming has been started");

        return true;
    }

    return false;
}

public synchronized void stopStreaming() {
    logger.debug("stopStreaming()");

    if (isFfmpegRecorderRecording) {

        stopAudioRecordThread();

        isFfmpegRecorderRecording = false;
        ffmpegAutoFps = false;

        startStreamingTime = 0;
        ffmpegLastUsedTimestamp = 0;
        previewCallback.recorderInterval = 0;

        try {
            executor.submit(new Callable<Boolean>() {

                @Override
                public Boolean call() throws Exception {
                    // try {
                    ffmpegRecorder.stop();
                    // } catch (FrameRecorder.Exception e) {
                    // logger.error("a FrameRecorder.Exception occured during stop()", e);
                    // }
                    return null;
                }
            }).get(EXECUTOR_CALL_TIMEOUT, TimeUnit.SECONDS);

        } catch (Exception e) {
            logger.error("an Exception occurred during get()", e);
            ffmpegRecorder = null;
        }

        setCurrentCameraState(CAMERA_STATE.IDLE);
        logger.debug("streaming has been stopped");

        releaseFfmpegRecorder();

    } else {
        logger.debug("ffmpegRecorder is not recording");
    }
}

private AudioRecordThread audioRecordThread;

private boolean isAudioRecordThreadRunning() {
    return (audioRecordThread != null && audioRecordThread.isAlive());
}

private void stopAudioRecordThread() {
    logger.debug("stopAudioRecordThread()");

    if (!isAudioRecordThreadRunning())
        return;

    audioRecordThread.interrupt();
    if (audioRecordThread.isAlive()) {
        try {
            audioRecordThread.join();
        } catch (InterruptedException e) {
            logger.error("an InterruptedException occurred during join()", e);
        }
    }
    audioRecordThread = null;
}

private void startAudioRecordThread(StreamingSettings settings) {
    logger.debug("startAudioRecordThread()");

    if (settings == null)
        return;

    if (isAudioRecordThreadRunning())
        return;

    audioRecordThread = new AudioRecordThread(settings.getAudioSampleRate(), settings.getAudioChannels());
    audioRecordThread.setName(AudioRecordThread.class.getSimpleName());
    audioRecordThread.start();
}

// ---------------------------------------------
// audio thread, gets and encodes audio data
// ---------------------------------------------
private class AudioRecordThread extends Thread {

    private final int sampleAudioRateHz;

    private final int bufferSize;
    private final short[] audioData;

    private final AudioRecord audioRecord;

    @SuppressWarnings("deprecation")
    public AudioRecordThread(int sampleAudioRateHz, int audioChannels) {
        this.sampleAudioRateHz = sampleAudioRateHz <= 0 ? StreamingSettings.DEFAULT_SAMPLE_AUDIO_RATE_HZ : sampleAudioRateHz;

        final int channelConfiguration;

        switch (audioChannels) {
        case StreamingSettings.AUDIO_CHANNELS_MONO:
            channelConfiguration = AudioFormat.CHANNEL_CONFIGURATION_MONO;
            break;
        case StreamingSettings.AUDIO_CHANNELS_STEREO:
            channelConfiguration = AudioFormat.CHANNEL_CONFIGURATION_STEREO;
            break;
        default:
            channelConfiguration = AudioFormat.CHANNEL_CONFIGURATION_DEFAULT;
            break;
        }

        this.bufferSize = AudioRecord.getMinBufferSize(this.sampleAudioRateHz, channelConfiguration, AudioFormat.ENCODING_PCM_16BIT);
        this.audioData = new short[bufferSize];

        audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, this.sampleAudioRateHz, channelConfiguration,
                AudioFormat.ENCODING_PCM_16BIT, this.bufferSize);
    }

    @Override
    public void run() {

        try {
            android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
        } catch (Exception e) {
            logger.error("an Exception occurred during setThreadPriority()", e);
        }

        logger.debug("starting audio recording...");
        audioRecord.startRecording();
        logger.debug("audio recording has been started");

        int bufferReadResult = 0;

        // Audio Capture/Encoding Loop
        while (!isInterrupted()) {

            // Read from audioRecord
            bufferReadResult = audioRecord.read(audioData, 0, audioData.length);
            if (bufferReadResult > 0) {

                if (isFfmpegRecorderRecording && ffmpegRecorder != null && ffmpegRecorder.getAudioChannels() > 0
                        && ffmpegRecorder.getAudioCodec() > 0) {

                    // long timestamp = startStreamingTime > 0 ? (1000 * (System.currentTimeMillis() -
                    // startStreamingTime)) : 0;

                    // if (timestamp > ffmpegRecorder.getTimestamp()) {
                    // logger.warn("(audio) setting timestamp " + timestamp + "...");
                    // ffmpegRecorder.setTimestamp(timestamp);
                    // }
                    // else {
                    // logger.error("(audio) current timestamp " + timestamp + " equals or less setted "
                    // + ffmpegRecorder.getTimestamp());
                    // continue;
                    // }

                    if (ffmpegLastUsedTimestamp == ffmpegRecorder.getTimestamp()) {
                        logger.warn("(audio) last used timestamp " + ffmpegLastUsedTimestamp + " equals the current timestamp "
                                + ffmpegRecorder.getTimestamp());
                        continue;
                    }

                    logger.debug("(audio) writing sample (timestamp:" + ffmpegRecorder.getTimestamp() + ") ...");
                    try {
                        ffmpegLastUsedTimestamp = ffmpegRecorder.getTimestamp();
                        ffmpegRecorder.record(ShortBuffer.wrap(audioData, 0, bufferReadResult));
                    } catch (FFmpegFrameRecorder.Exception e) {
                        logger.error("an FFmpegFrameRecorder.Exception occurred during record()", e);

                        // stopStreaming();
                        //
                        // synchronized (ffmpegRecorderErrorListeners) {
                        // if (ffmpegRecorderErrorListeners.size() > 0) {
                        // for (OnFFmpegRecorderErrorListener l : ffmpegRecorderErrorListeners) {
                        // l.onFFmpegRecorderError(e);
                        // }
                        // }
                        // }

                    }
                }
            }
        }

        logger.debug("stopping audio recording...");
        audioRecord.stop();
        audioRecord.release();
        logger.debug("audio recording has been stopped");
    }
}


    @Override
    public void onPreviewFrame(final byte[] data, final Camera camera) {

        try {

            if (isFfmpegRecorderRecording && yuvIplImage != null && ffmpegRecorder != null && ffmpegRecorder.getVideoCodec() > 0) {

                if (previewFormat == null || previewWidth <= 0 || previewHeight <= 0) {
                    logger.error("preview format or preview size not set");
                    return;
                }

                int deviation = callbackBufferQueueSize > 0 ? FFMPEG_RECORDER_FPS_DEVIATION_HIGH : FFMPEG_RECORDER_FPS_DEVIATION_LOW;

                if (ffmpegAutoFps && Math.abs(averageFps - ffmpegRecorder.getFrameRate()) > deviation) {

                    if (startStreamingTime > 0 && (System.currentTimeMillis() - startStreamingTime > FFMPEG_RECORDER_FPS_CHECK_TIMEOUT)) {

                        String msg = "fps mode is auto and target recorder fps (" + ffmpegRecorder.getFrameRate()
                                + ") does not match the actual average (" + averageFps + ")!";
                        logger.warn(msg);

                        previousFps = averageFps;

                        stopStreaming();

                        synchronized (ffmpegRecorderErrorListeners) {
                            if (ffmpegRecorderErrorListeners.size() > 0) {
                                for (OnFFmpegRecorderErrorListener l : ffmpegRecorderErrorListeners) {
                                    l.onFFmpegRecorderError(new IllegalStateException(msg));
                                }
                            }
                        }

                    }

                    return;

                } else if (!allowRecord(ffmpegRecorder.getFrameRate())) {
                    return;
                }

                // synchronized (yuvIplImage) {

                // logger.debug("preview size: " + previewWidth + "x" + previewHeight +
                // ", yuvIplImage size: " +
                // yuvIplImage.width()
                // + "x" + yuvIplImage.height());

                ByteBuffer imageBuffer = yuvIplImage.getByteBuffer(); // resets the buffer

                if (previewHeight > yuvIplImage.height()) {

                    final int startY = previewWidth * (previewHeight - yuvIplImage.height()) / 2;
                    final int lenY = previewWidth * yuvIplImage.height();
                    imageBuffer.put(data, startY, lenY);
                    final int startVU = previewWidth * previewHeight + previewWidth * (previewHeight - yuvIplImage.height()) / 4;
                    final int lenVU = previewWidth * yuvIplImage.height() / 2;
                    imageBuffer.put(data, startVU, lenVU);

                } else if (previewHeight == yuvIplImage.height()) {

                    imageBuffer.put(data);

                } else {

                    // TODO
                }

                long timestamp = startStreamingTime > 0 ? (1000 * (System.currentTimeMillis() - startStreamingTime)) : 0;

                if (timestamp > ffmpegRecorder.getTimestamp()) {
                    // logger.info("(video) setting current timestamp " + timestamp + "...");
                    ffmpegRecorder.setTimestamp(timestamp);
                }
                // else {
                // logger.error("(video) current timestamp " + timestamp + " equals or less setted " +
                // ffmpegRecorder.getTimestamp());
                // return;
                // }

                if (ffmpegLastUsedTimestamp == ffmpegRecorder.getTimestamp()) {
                    logger.warn("(video) last used timestamp " + ffmpegLastUsedTimestamp + " equals the current timestamp "
                            + ffmpegRecorder.getTimestamp());
                    return;
                }

                logger.debug("(video) writing frame " + yuvIplImage.width() + "x" + yuvIplImage.height() + " (timestamp:"
                        + ffmpegRecorder.getTimestamp() + ") ...");
                try {
                    ffmpegLastUsedTimestamp = ffmpegRecorder.getTimestamp();
                    ffmpegRecorder.record(yuvIplImage); // , avutil.AV_PIX_FMT_YUV420P
                } catch (FrameRecorder.Exception e) {
                    logger.error("a FrameRecorder.Exception occurred during record()", e);

                    previousFps = averageFps;

                    stopStreaming();

                    synchronized (ffmpegRecorderErrorListeners) {
                        if (ffmpegRecorderErrorListeners.size() > 0) {
                            for (OnFFmpegRecorderErrorListener l : ffmpegRecorderErrorListeners) {
                                l.onFFmpegRecorderError(e);
                            }
                        }
                    }

                    return;

                }

                // }

            }

        } finally {
            if (camera != null && isCameraLocked() && callbackBufferQueueSize > 0) {
                camera.addCallbackBuffer(data);
            }
        }
    }

Thanks in advance!

@saudet
Member

saudet commented May 17, 2015

  1. This is probably related to buffering. There are some tips about this here, for example:
    http://comments.gmane.org/gmane.comp.video.ffmpeg.libav.user/3888
    But I think that's for the client side... Anyway, we should be able to set such options with FFmpegFrameRecorder.setOption(). Let us know if you make any progress! Thanks

  2. AVI wasn't meant for streaming. You're probably going to need to use MKV or something... BTW, how does ffserver behave? Does it work properly even with AVI?

  1. Changing codecs probably won't fix that, no. You need to try changing the format (to MKV, for example). Anyway, MJPEG should be in there. What error do you get? Sure, we can recompile everything from scratch:
    https://github.com/bytedeco/javacpp-presets/#build-instructions

  4. Good to know! Would you have a patch to contribute for the RecordActivity.java sample? Thanks in advance!

  1. There are probably some synchronization issues involved here. Changing the format might help: for one thing, AVI isn't very good at that.

  6. Is this something that happens with ffserver too?
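
The inverse relationship between source bitrate and startup delay reported in point 1 is exactly what client-side buffering predicts: a player that waits for a fixed amount of buffered data takes longer to fill its buffer when the stream's effective bitrate is lower. A back-of-the-envelope sketch (the buffer size here is illustrative, not VLC's actual default):

```java
// Illustrative only: startup delay for a client that buffers a fixed
// number of bytes before starting playback.
class StartupDelay {
    /** Seconds needed to receive bufferBytes at bitrateBitsPerSecond. */
    static double seconds(long bufferBytes, long bitrateBitsPerSecond) {
        return bufferBytes * 8.0 / bitrateBitsPerSecond;
    }
}
```

At the 400 kbit/s target, a hypothetical 250 KB client buffer fills in 5 seconds; if dropping to 3 fps cuts the effective bitrate to around 40 kbit/s, the same buffer takes 50 seconds — consistent with the near-minute wait observed at low frame rates.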

@maxsmirnov92

  1. Now I'm using version 0.11.

I set these 3 parameters:
ffmpegRecorder.setVideoOption("max_picture_buffer", "51200");
ffmpegRecorder.setVideoOption("probesize", "192");
ffmpegRecorder.setVideoOption("max_index_size", "51200");

but I can test it now, because my current Android device has an x86 architecture. I placed the native libraries from JavaCV 0.11 into the libs/armeabi and libs/x86 folders: the ARM device works correctly (as it did on 0.9, with the same unacceptable playback latency), but on my x86 device the JVM throws an exception:
java.lang.NoClassDefFoundError: org.bytedeco.javacv.OpenCVFrameConverter$ToIplImage

How can I fix it?

  1. I changed the "avi" format to "mkv" in ffserver.conf and got this error when starting ffserver:
    "/etc/ffserver.conf:190: Unknown Format: 'mkv'"

The sample config lists the following supported formats:

Format of the stream : you can choose among:
mpeg : MPEG-1 multiplexed video and audio
mpegvideo : only MPEG-1 video
mp2 : MPEG-2 audio (use AudioCodec to select layer 2 and 3 codec)
ogg : Ogg format (Vorbis audio codec)
rm : RealNetworks-compatible stream. Multiplexed audio and video.
ra : RealNetworks-compatible stream. Audio only.
mpjpeg : Multipart JPEG (works with Netscape without any plugin)
jpeg : Generate a single JPEG image.
asf : ASF compatible streaming (Windows Media Player format).
swf : Macromedia Flash compatible stream
avi : AVI format (MPEG-4 video, MPEG audio sound)

I used avi because it seems to be the one used with the MPEG-4 codec... So which format is suitable for streaming in that case?
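
For what it's worth, ffmpeg registers its MKV muxer under the name matroska rather than mkv, so a Stream section along these lines might work (the feed name is taken from the config above; bitrate and size mirror the settings mentioned earlier and are otherwise placeholders):

```
<Stream test.mkv>
Feed feed1.ffm
Format matroska
VideoCodec mpeg4
VideoFrameRate 25
VideoSize 480x320
VideoBitRate 400
NoAudio
</Stream>
```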

  1. It throws an exception at start() if avcodec.AV_CODEC_ID_MJPEG is set:
    "java.util.concurrent.ExecutionException: org.bytedeco.javacv.FrameRecorder$Exception: avcodec_open2() error -22: Could not open video codec."

  2. I have somewhat different code than RecordActivity.java (I didn't run that example). One possible reason for preview frames dropping is calling setPreviewCallback() or setPreviewCallbackWithBuffer() before startPreview() (it doesn't reproduce every time)

  3. Yes, most likely it's a synchronization problem...

  4. No, it doesn't affect ffserver. FFmpegFrameRecorder can crash randomly, but there are no errors in the ffserver log or anything like that.

@maxsmirnov92

I changed the format to "matroska" and now the playback latency is much more acceptable: 3-5 seconds with the MPEG4 / libx264 / H263 codecs. The options I set don't have a large effect on it. And after re-connecting from VLC, there are no "skipped frames" errors in the log. When streaming starts I receive the warning "[matroska @ 0xa9dec8c] Codec for stream 0 does not use global headers but container format requires global headers" with any of these codecs, but it doesn't really matter.
The synchronization problem when recording audio still remains. Same exception:
"org.bytedeco.javacv.FrameRecorder$Exception: av_interleaved_write_frame() error -104 while writing interleaved audio frame."

@saudet

saudet commented May 23, 2015

  1. x86 works fine in the emulator here, so I don't believe this is the issue...

  2. This looks like an old list of codecs... The original MPEG was good for streaming :)

  1. You're going to need to change the pixel format as well if you want to use MJPEG:
    [mjpeg @ 0x7fbe000f6200] Specified pixel format yuv420p is invalid or not supported

  4. Still, if you have a patch to enhance the RecordActivity.java sample, it would be great! Thanks

  1. I meant with your application: it doesn't look like it's recording audio and video in the required proportion of frames.

  6. Sorry, I meant ffmpeg. What happens if you use ffmpeg on the command line to stream audio and video to ffserver?

And what happens if you use ffplay instead of VLC?
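
On point 5: one way to keep the audio stream advancing in step with the frames actually written is to derive audio timestamps from the number of samples recorded rather than from wall-clock time — a sketch with hypothetical names, not JavaCV API:

```java
// Hypothetical helper: audio timestamps (in microseconds) derived from the
// running sample count, so the audio clock reflects exactly what was written
// and stays in proportion with the video frames.
class AudioClock {
    private final int sampleRate;
    private long samplesWritten = 0;

    AudioClock(int sampleRate) {
        this.sampleRate = sampleRate;
    }

    /** Timestamp (µs) at which the next write starts; then advances by n samples. */
    long advance(long n) {
        long ts = samplesWritten * 1_000_000L / sampleRate;
        samplesWritten += n;
        return ts;
    }
}
```

With a 44100 Hz source, writing 4410-sample chunks yields timestamps 0, 100000, 200000 µs, ... regardless of when the callbacks happen to fire.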

@saudet

saudet commented Oct 12, 2015

Any updates on this? Have you gotten everything running alright?

@maxsmirnov92

The last time I tested this was 3 months ago. Video streaming worked well enough with the matroska format, but without sound recording, and the recorder could not be initialized with the MJPEG codec. I don't work on this project anymore. Thanks for your help anyway.

@saudet

saudet commented May 19, 2016

FYI, I've found the issue with MJPEG (#410).
