
Error -104 while writing interleaved audio frame #267

Open · jmbernal opened this issue Nov 4, 2015 · 19 comments

@jmbernal commented Nov 4, 2015

Hello all.

I am trying to stream video with FFmpeg from Android (using RecordActivity.java) to ffserver.

My ffserver.conf is:

HTTPPort 8090
HTTPBindAddress 0.0.0.0
MaxHTTPConnections 2000
MaxClients 5
MaxBandwidth 10000
CustomLog -

<Feed feed1.ffm>
    File /tmp/feed1.ffm
    FileMaxSize 20M
</Feed>

<Stream live.flv>
    Feed feed1.ffm
    Format mpeg
    AudioBitRate 64
    AudioChannels 2
    AudioSampleRate 44100
    VideoBitRate 128
    VideoFrameRate 30
    VideoSize 320x240
    NoAudio
</Stream>

Then, when I call recorderStreaming.recordSamples(audioData), this exception is thrown:

11-04 12:21:13.222 19213-19715/recordactivity.javacv.bytedeco.org.recordactivity W/System.err: org.bytedeco.javacv.FrameRecorder$Exception: av_interleaved_write_frame() error -104 while writing interleaved audio frame.
11-04 12:21:13.222 19213-19715/recordactivity.javacv.bytedeco.org.recordactivity W/System.err:     at org.bytedeco.javacv.FFmpegFrameRecorder.record(FFmpegFrameRecorder.java:965)
11-04 12:21:13.222 19213-19715/recordactivity.javacv.bytedeco.org.recordactivity W/System.err:     at org.bytedeco.javacv.FFmpegFrameRecorder.recordSamples(FFmpegFrameRecorder.java:930)
11-04 12:21:13.222 19213-19715/recordactivity.javacv.bytedeco.org.recordactivity W/System.err:     at org.bytedeco.javacv.FFmpegFrameRecorder.recordSamples(FFmpegFrameRecorder.java:803)
11-04 12:21:13.222 19213-19715/recordactivity.javacv.bytedeco.org.recordactivity W/System.err:     at com.example.videostreaming.RecordActivity$AudioRecordRunnable.run(RecordActivity.java:398)
11-04 12:21:13.222 19213-19715/recordactivity.javacv.bytedeco.org.recordactivity W/System.err:     at java.lang.Thread.run(Thread.java:841)

This is my full RecordActivity.java:

package com.example.videostreaming;

import ...;

public class RecordActivity extends Activity implements OnClickListener {
    private final static String CLASS_LABEL = "RecordActivity";
    private final static String LOG_TAG = CLASS_LABEL;
    private PowerManager.WakeLock mWakeLock;
    private final static String ffmpeg_link = "http://192.168.26.162:8090/feed1.ffm";
    private String ffmpeg_file = Environment.getExternalStorageDirectory().getPath() + "/DCIM/stream.mpeg";
    long startTime = 0;
    boolean recording = false;
    private FFmpegFrameRecorder recorderStreaming;
    private FFmpegFrameRecorder recorderSDCard;
    private boolean isPreviewOn = false;
    private final static String videoFormat= "mpeg";
    private int sampleAudioRateInHz = 44100;
    private int imageWidth = 320;
    private int imageHeight = 240;
    private int videoFrameRate = 25;
    /* audio data getting thread */
    private AudioRecord audioRecord;
    private AudioRecordRunnable audioRecordRunnable;
    private Thread audioThread;
    volatile boolean runAudioThread = true;

    /* video data getting thread */
    private Camera cameraDevice;
    private CameraView cameraView;

    private Frame yuvImage = null;

    /* layout setting */
    private final int bg_screen_bx = 232;
    private final int bg_screen_by = 128;
    private final int bg_screen_width = 700;
    private final int bg_screen_height = 500;
    private final int bg_width = 1123;
    private final int bg_height = 715;
    private final int live_width = 640;
    private final int live_height = 480;
    private int screenWidth, screenHeight;
    private Button btnRecorderControl;

    /* The number of seconds in the continuous record loop (or 0 to disable loop). */
    //final int RECORD_LENGTH = 10;
    final int RECORD_LENGTH = 0;
    Frame[] images;
    long[] timestamps;
    ShortBuffer[] samples;
    int imagesIndex, samplesIndex;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);

        setContentView(R.layout.content_record);

        PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
        mWakeLock = pm.newWakeLock(PowerManager.PROXIMITY_SCREEN_OFF_WAKE_LOCK, CLASS_LABEL);
        mWakeLock.acquire();

        initLayout();
    }


    @Override
    protected void onResume() {
        super.onResume();
        Log.i(LOG_TAG, "ON_Resume");
        if (mWakeLock == null) {
            PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
            mWakeLock = pm.newWakeLock(PowerManager.PROXIMITY_SCREEN_OFF_WAKE_LOCK, CLASS_LABEL);
            mWakeLock.acquire();
        }
    }

    @Override
    protected void onPause() {
        super.onPause();
        Log.i(LOG_TAG, "ON_Pause");
        /*if (mWakeLock != null) {
            mWakeLock.release();
            mWakeLock = null;
        }*/
    }

    @Override
    protected void onStop() {
        super.onStop();
        Log.i(LOG_TAG, "ON_Stop");
    }

    @Override
    protected void onRestart() {
        super.onRestart();
        Log.i(LOG_TAG, "ON_Restart");
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        Log.i(LOG_TAG, "ON_Destroy");
        recording = false;

        if (cameraView != null) {
            cameraView.stopPreview();
        }

        if (cameraDevice != null) {
            cameraDevice.stopPreview();
            cameraDevice.release();
            cameraDevice = null;
        }

        if (mWakeLock != null) {
            mWakeLock.release();
            mWakeLock = null;
        }
    }


    private void initLayout() {

        /* get size of screen */
        Display display = ((WindowManager) getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay();
        screenWidth = display.getWidth();
        screenHeight = display.getHeight();
        RelativeLayout.LayoutParams layoutParam = null;
        LayoutInflater myInflate = null;
        myInflate = (LayoutInflater) getSystemService(Context.LAYOUT_INFLATER_SERVICE);
        RelativeLayout topLayout = new RelativeLayout(this);
        setContentView(topLayout);
        LinearLayout preViewLayout = (LinearLayout) myInflate.inflate(R.layout.content_record, null);
        layoutParam = new RelativeLayout.LayoutParams(screenWidth, screenHeight);
        topLayout.addView(preViewLayout, layoutParam);

        /* add control button: start and stop */
        btnRecorderControl = (Button) findViewById(R.id.recorder_control);
        btnRecorderControl.setText("Start");
        btnRecorderControl.setOnClickListener(this);

        /* add camera view */
        int display_width_d = (int) (1.0 * bg_screen_width * screenWidth / bg_width);
        int display_height_d = (int) (1.0 * bg_screen_height * screenHeight / bg_height);
        int prev_rw, prev_rh;
        if (1.0 * display_width_d / display_height_d > 1.0 * live_width / live_height) {
            prev_rh = display_height_d;
            prev_rw = (int) (1.0 * display_height_d * live_width / live_height);
        } else {
            prev_rw = display_width_d;
            prev_rh = (int) (1.0 * display_width_d * live_height / live_width);
        }
        layoutParam = new RelativeLayout.LayoutParams(prev_rw, prev_rh);
        layoutParam.topMargin = (int) (1.0 * bg_screen_by * screenHeight / bg_height);
        layoutParam.leftMargin = (int) (1.0 * bg_screen_bx * screenWidth / bg_width);

        cameraDevice = Camera.open();
        Log.i(LOG_TAG, "cameara open");
        cameraView = new CameraView(this, cameraDevice);
        topLayout.addView(cameraView, layoutParam);
        Log.i(LOG_TAG, "cameara preview start: OK");
    }

    //---------------------------------------
    // initialize ffmpeg_recorder
    //---------------------------------------
    private void initRecorder() {

        Log.w(LOG_TAG, "init recorderStreaming");

        if (RECORD_LENGTH > 0) {
            imagesIndex = 0;
            images = new Frame[RECORD_LENGTH * videoFrameRate];
            timestamps = new long[images.length];
            for (int i = 0; i < images.length; i++) {
                images[i] = new Frame(imageWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
                timestamps[i] = -1;
            }
        } else if (yuvImage == null) {
            yuvImage = new Frame(imageWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
            Log.i(LOG_TAG, "create yuvImage");
        }

        Log.i(LOG_TAG, "ffmpeg_url: " + ffmpeg_link);
        recorderStreaming = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
        recorderStreaming.setFormat(videoFormat);
        recorderStreaming.setSampleRate(sampleAudioRateInHz);
        recorderStreaming.setFrameRate(videoFrameRate);

        recorderSDCard = new FFmpegFrameRecorder(ffmpeg_file, imageWidth, imageHeight, 1);
        recorderSDCard.setFormat("flv");
        recorderSDCard.setSampleRate(sampleAudioRateInHz);
        recorderSDCard.setFrameRate(videoFrameRate);

        Log.i(LOG_TAG, "recorderStreaming initialize success");

        audioRecordRunnable = new AudioRecordRunnable();
        audioThread = new Thread(audioRecordRunnable);
        runAudioThread = true;
    }

    public void startRecording() {

        initRecorder();

        try {
            recorderStreaming.start();
            recorderSDCard.start();
            startTime = System.currentTimeMillis();
            recording = true;
            audioThread.start();

        } catch (FFmpegFrameRecorder.Exception e) {
            e.printStackTrace();
        }
    }

    public void stopRecording() {

        runAudioThread = false;
        try {
            audioThread.join();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        audioRecordRunnable = null;
        audioThread = null;

        if (recorderStreaming != null && recorderSDCard != null && recording) {
            if (RECORD_LENGTH > 0) {
                Log.v(LOG_TAG, "Writing frames");
                try {
                    int firstIndex = imagesIndex % samples.length;
                    int lastIndex = (imagesIndex - 1) % images.length;
                    if (imagesIndex <= images.length) {
                        firstIndex = 0;
                        lastIndex = imagesIndex - 1;
                    }
                    if ((startTime = timestamps[lastIndex] - RECORD_LENGTH * 1000000L) < 0) {
                        startTime = 0;
                    }
                    if (lastIndex < firstIndex) {
                        lastIndex += images.length;
                    }
                    for (int i = firstIndex; i <= lastIndex; i++) {
                        long t = timestamps[i % timestamps.length] - startTime;
                        if (t >= 0) {
                            if (t > recorderStreaming.getTimestamp()) {
                                recorderStreaming.setTimestamp(t);
                            }
                            if (t > recorderSDCard.getTimestamp()) {
                                recorderSDCard.setTimestamp(t);
                            }
                            recorderStreaming.record(images[i % images.length]);
                            recorderSDCard.record(images[i % images.length]);
                        }
                    }

                    firstIndex = samplesIndex % samples.length;
                    lastIndex = (samplesIndex - 1) % samples.length;
                    if (samplesIndex <= samples.length) {
                        firstIndex = 0;
                        lastIndex = samplesIndex - 1;
                    }
                    if (lastIndex < firstIndex) {
                        lastIndex += samples.length;
                    }
                    for (int i = firstIndex; i <= lastIndex; i++) {
                        recorderStreaming.recordSamples(samples[i % samples.length]);
                        recorderSDCard.recordSamples(samples[i % samples.length]);
                    }
                } catch (FFmpegFrameRecorder.Exception e) {
                    Log.v(LOG_TAG, e.getMessage());
                    e.printStackTrace();
                }
            }

            recording = false;
            Log.v(LOG_TAG, "Finishing recording, calling stop and release on recorderStreaming");
            try {
                recorderStreaming.stop();
                recorderStreaming.release();
                recorderSDCard.stop();
                recorderSDCard.release();
            } catch (FFmpegFrameRecorder.Exception e) {
                e.printStackTrace();
            }
            recorderStreaming = null;
            recorderSDCard = null;
        }
    }

    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {

        if (keyCode == KeyEvent.KEYCODE_BACK) {
            if (recording) {
                stopRecording();
            }

            finish();

            return true;
        }

        return super.onKeyDown(keyCode, event);
    }


    //---------------------------------------------
    // audio thread, gets and encodes audio data
    //---------------------------------------------
    class AudioRecordRunnable implements Runnable {

        @Override
        public void run() {
            android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

            // Audio
            int bufferSize;
            ShortBuffer audioData;
            int bufferReadResult;

            bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
            audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);

            if (RECORD_LENGTH > 0) {
                samplesIndex = 0;
                samples = new ShortBuffer[RECORD_LENGTH * sampleAudioRateInHz * 2 / bufferSize + 1];
                for (int i = 0; i < samples.length; i++) {
                    samples[i] = ShortBuffer.allocate(bufferSize);
                }
            } else {
                audioData = ShortBuffer.allocate(bufferSize);
            }

            Log.d(LOG_TAG, "audioRecord.startRecording()");
            audioRecord.startRecording();

            /* ffmpeg_audio encoding loop */
            while (runAudioThread) {
                if (RECORD_LENGTH > 0) {
                    audioData = samples[samplesIndex++ % samples.length];
                    audioData.position(0).limit(0);
                }
                //Log.v(LOG_TAG,"recording? " + recording);
                bufferReadResult = audioRecord.read(audioData.array(), 0, audioData.capacity());
                audioData.limit(bufferReadResult);
                if (bufferReadResult > 0) {
                    Log.v(LOG_TAG, "bufferReadResult: " + bufferReadResult);
                    // If "recording" isn't true when start this thread, it never get's set according to this if statement...!!!
                    // Why?  Good question...
                    if (recording) {
                        if (RECORD_LENGTH <= 0) try {
                            recorderStreaming.recordSamples(audioData);
                            recorderSDCard.recordSamples(audioData);
                            //Log.v(LOG_TAG,"recording " + 1024*i + " to " + 1024*i+1024);
                        } catch (FFmpegFrameRecorder.Exception e) {
                            Log.v(LOG_TAG, e.getMessage());
                            e.printStackTrace();
                        }
                    }
                }
            }
            Log.v(LOG_TAG, "AudioThread Finished, release audioRecord");

            /* encoding finish, release recorderStreaming */
            if (audioRecord != null) {
                audioRecord.stop();
                audioRecord.release();
                audioRecord = null;
                Log.v(LOG_TAG, "audioRecord released");
            }
        }
    }

    //---------------------------------------------
    // camera thread, gets and encodes video data
    //---------------------------------------------
    class CameraView extends SurfaceView implements SurfaceHolder.Callback, Camera.PreviewCallback {

        private SurfaceHolder mHolder;
        private Camera mCamera;

        public CameraView(Context context, Camera camera) {
            super(context);
            Log.w("camera", "camera view");
            mCamera = camera;
            mHolder = getHolder();
            mHolder.addCallback(CameraView.this);
            mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
            mCamera.setPreviewCallback(CameraView.this);
        }

        @Override
        public void surfaceCreated(SurfaceHolder holder) {
            try {
                stopPreview();
                mCamera.setPreviewDisplay(holder);
            } catch (IOException exception) {
                mCamera.release();
                mCamera = null;
            }
        }

        public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
            Camera.Parameters camParams = mCamera.getParameters();
            List<Camera.Size> sizes = camParams.getSupportedPreviewSizes();
            // Sort the list in ascending order
            Collections.sort(sizes, new Comparator<Camera.Size>() {

                public int compare(final Camera.Size a, final Camera.Size b) {
                    return a.width * a.height - b.width * b.height;
                }
            });

            // Pick the first preview size that is equal or bigger, or pick the last (biggest) option if we cannot
            // reach the initial settings of imageWidth/imageHeight.
            for (int i = 0; i < sizes.size(); i++) {
                if ((sizes.get(i).width >= imageWidth && sizes.get(i).height >= imageHeight) || i == sizes.size() - 1) {
                    imageWidth = sizes.get(i).width;
                    imageHeight = sizes.get(i).height;
                    Log.v(LOG_TAG, "Changed to supported resolution: " + imageWidth + "x" + imageHeight);
                    break;
                }
            }
            camParams.setPreviewSize(imageWidth, imageHeight);

            Log.v(LOG_TAG, "Setting imageWidth: " + imageWidth + " imageHeight: " + imageHeight + " videoFrameRate: " + videoFrameRate);

            camParams.setPreviewFrameRate(videoFrameRate);
            Log.v(LOG_TAG, "Preview Framerate: " + camParams.getPreviewFrameRate());

            mCamera.setParameters(camParams);
            startPreview();
        }

        @Override
        public void surfaceDestroyed(SurfaceHolder holder) {
            try {
                mHolder.addCallback(null);
                mCamera.setPreviewCallback(null);
            } catch (RuntimeException e) {
                // The camera has probably just been released, ignore.
            }
        }

        public void startPreview() {
            if (!isPreviewOn && mCamera != null) {
                isPreviewOn = true;
                mCamera.startPreview();
            }
        }

        public void stopPreview() {
            if (isPreviewOn && mCamera != null) {
                isPreviewOn = false;
                mCamera.stopPreview();
            }
        }

        @Override
        public void onPreviewFrame(byte[] data, Camera camera) {
            if (audioRecord == null || audioRecord.getRecordingState() != AudioRecord.RECORDSTATE_RECORDING) {
                startTime = System.currentTimeMillis();
                return;
            }
            if (RECORD_LENGTH > 0) {
                int i = imagesIndex++ % images.length;
                yuvImage = images[i];
                timestamps[i] = 1000 * (System.currentTimeMillis() - startTime);
            }
            /* get video data */
            if (yuvImage != null && recording) {
                ((ByteBuffer) yuvImage.image[0].position(0)).put(data);

                if (RECORD_LENGTH <= 0) try {
                    Log.v(LOG_TAG, "Writing Frame");
                    long t = 1000 * (System.currentTimeMillis() - startTime);
                    if (t > recorderStreaming.getTimestamp()) {
                        recorderStreaming.setTimestamp(t);
                    }
                    recorderStreaming.record(yuvImage);
                    if (t > recorderSDCard.getTimestamp()) {
                        recorderSDCard.setTimestamp(t);
                    }
                    recorderSDCard.record(yuvImage);
                } catch (FFmpegFrameRecorder.Exception e) {
                    Log.v(LOG_TAG, e.getMessage());
                    e.printStackTrace();
                }
            }
        }
    }

    @Override
    public void onClick(View v) {
        if (!recording) {
            startRecording();
            Log.w(LOG_TAG, "Start Button Pushed");
            btnRecorderControl.setText("Stop");
        } else {
            // This will trigger the audio recording loop to stop and then set isRecorderStart = false;
            stopRecording();
            Log.w(LOG_TAG, "Stop Button Pushed");
            btnRecorderControl.setText("Start");
        }
    }
}

If I use the ffmpeg command-line tool, I can stream video and play it with ffplay.

Can someone help me?
Thanks in advance.

@saudet (Member) commented Nov 6, 2015

Could you try with an unmodified version of the RecordActivity.java sample?

If you still get an error with the original version, could you call FFmpegLogCallback.set() and copy/paste what you get in the Android log? Thanks
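
A minimal sketch of what enabling that callback could look like, assuming the JavaCV build in use provides org.bytedeco.javacv.FFmpegLogCallback (placing it at the top of initRecorder() is just one option):

import org.bytedeco.javacv.FFmpegLogCallback;

private void initRecorder() {
    // Forward native FFmpeg (av_log) messages into the Java-side logger so the
    // underlying cause of av_interleaved_write_frame() errors shows up in logcat.
    FFmpegLogCallback.set();

    Log.w(LOG_TAG, "init recorderStreaming");
    // ... rest of initRecorder() unchanged ...
}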

@jmbernal (Author):

Thanks, saudet.

The original RecordActivity.java? It writes the video to the SD card, but I need to send it to ffserver. If I change ffmpeg_link = "/mnt/sdcard/stream.flv" to ffmpeg_link = "http://192.168.26.162:8090/feed1.ffm", it doesn't work either. Now I only get an error when pressing the Stop button, like this:

11-10 15:54:53.969 30796-30796/org.bytedeco.javacv.recordactivity W/System.err: org.bytedeco.javacv.FrameRecorder$Exception: av_interleaved_write_frame() error -104 while writing interleaved audio frame.
11-10 15:54:53.969 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at org.bytedeco.javacv.FFmpegFrameRecorder.record(FFmpegFrameRecorder.java:970)
11-10 15:54:53.969 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at org.bytedeco.javacv.FFmpegFrameRecorder.recordSamples(FFmpegFrameRecorder.java:935)
11-10 15:54:53.969 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at org.bytedeco.javacv.FFmpegFrameRecorder.recordSamples(FFmpegFrameRecorder.java:808)
11-10 15:54:53.979 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at org.bytedeco.javacv.recordactivity.RecordActivity.stopRecording(RecordActivity.java:253)
11-10 15:54:53.979 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at org.bytedeco.javacv.recordactivity.RecordActivity.onClick(RecordActivity.java:482)
11-10 15:54:53.979 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at android.view.View.performClick(View.java:4640)
11-10 15:54:53.979 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at android.view.View$PerformClick.run(View.java:19425)
11-10 15:54:53.979 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at android.os.Handler.handleCallback(Handler.java:733)
11-10 15:54:53.979 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at android.os.Handler.dispatchMessage(Handler.java:95)
11-10 15:54:53.979 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at android.os.Looper.loop(Looper.java:146)
11-10 15:54:53.979 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at android.app.ActivityThread.main(ActivityThread.java:5593)
11-10 15:54:53.979 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at java.lang.reflect.Method.invokeNative(Native Method)
11-10 15:54:53.979 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at java.lang.reflect.Method.invoke(Method.java:515)
11-10 15:54:53.979 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1283)
11-10 15:54:53.979 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1099)
11-10 15:54:53.979 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at dalvik.system.NativeStart.main(Native Method)
11-10 15:54:53.989 30796-30796/org.bytedeco.javacv.recordactivity W/System.err: Warning: [aac @ 0x632372f0] 1 frames left in the queue on closing
11-10 15:54:53.989 30796-30796/org.bytedeco.javacv.recordactivity W/System.err: org.bytedeco.javacv.FrameRecorder$Exception: av_interleaved_write_frame() error -104 while writing interleaved audio frame.
11-10 15:54:53.989 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at org.bytedeco.javacv.FFmpegFrameRecorder.record(FFmpegFrameRecorder.java:970)
11-10 15:54:53.989 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at org.bytedeco.javacv.FFmpegFrameRecorder.recordSamples(FFmpegFrameRecorder.java:938)
11-10 15:54:53.989 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at org.bytedeco.javacv.FFmpegFrameRecorder.stop(FFmpegFrameRecorder.java:662)
11-10 15:54:53.989 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at org.bytedeco.javacv.recordactivity.RecordActivity.stopRecording(RecordActivity.java:264)
11-10 15:54:53.989 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at org.bytedeco.javacv.recordactivity.RecordActivity.onClick(RecordActivity.java:482)
11-10 15:54:53.989 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at android.view.View.performClick(View.java:4640)
11-10 15:54:53.989 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at android.view.View$PerformClick.run(View.java:19425)
11-10 15:54:53.989 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at android.os.Handler.handleCallback(Handler.java:733)
11-10 15:54:53.989 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at android.os.Handler.dispatchMessage(Handler.java:95)
11-10 15:54:53.999 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at android.os.Looper.loop(Looper.java:146)
11-10 15:54:53.999 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at android.app.ActivityThread.main(ActivityThread.java:5593)
11-10 15:54:53.999 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at java.lang.reflect.Method.invokeNative(Native Method)
11-10 15:54:53.999 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at java.lang.reflect.Method.invoke(Method.java:515)
11-10 15:54:53.999 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1283)
11-10 15:54:53.999 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1099)
11-10 15:54:53.999 30796-30796/org.bytedeco.javacv.recordactivity W/System.err:     at dalvik.system.NativeStart.main(Native Method)
11-10 15:54:53.999 30796-30796/org.bytedeco.javacv.recordactivity W/RecordActivity: Stop Button Pushed

Before that, this is the log up to the point where I press the Start button:

11-10 15:54:47.463 30796-30796/org.bytedeco.javacv.recordactivity I/RecordActivity: ffmpeg_url: http://192.168.26.162:8090/live.flv
11-10 15:54:47.493 30796-30796/org.bytedeco.javacv.recordactivity D/dalvikvm: Trying to load lib /data/app-lib/org.bytedeco.javacv.recordactivity-1/libswresample.so 0x420bde50
11-10 15:54:47.493 30796-30796/org.bytedeco.javacv.recordactivity D/dalvikvm: Added shared lib /data/app-lib/org.bytedeco.javacv.recordactivity-1/libswresample.so 0x420bde50
11-10 15:54:47.493 30796-30796/org.bytedeco.javacv.recordactivity D/dalvikvm: No JNI_OnLoad found in /data/app-lib/org.bytedeco.javacv.recordactivity-1/libswresample.so 0x420bde50, skipping init
11-10 15:54:47.493 30796-30796/org.bytedeco.javacv.recordactivity D/dalvikvm: Trying to load lib /data/app-lib/org.bytedeco.javacv.recordactivity-1/libjniswresample.so 0x420bde50
11-10 15:54:47.493 30796-30796/org.bytedeco.javacv.recordactivity D/dalvikvm: Added shared lib /data/app-lib/org.bytedeco.javacv.recordactivity-1/libjniswresample.so 0x420bde50
11-10 15:54:47.533 30796-30796/org.bytedeco.javacv.recordactivity D/dalvikvm: Trying to load lib /data/app-lib/org.bytedeco.javacv.recordactivity-1/libavcodec.so 0x420bde50
11-10 15:54:47.533 30796-30796/org.bytedeco.javacv.recordactivity W/linker: libavcodec.so has text relocations. This is wasting memory and is a security risk. Please fix.
11-10 15:54:47.533 30796-30796/org.bytedeco.javacv.recordactivity D/dalvikvm: Added shared lib /data/app-lib/org.bytedeco.javacv.recordactivity-1/libavcodec.so 0x420bde50
11-10 15:54:47.533 30796-30796/org.bytedeco.javacv.recordactivity D/dalvikvm: No JNI_OnLoad found in /data/app-lib/org.bytedeco.javacv.recordactivity-1/libavcodec.so 0x420bde50, skipping init
11-10 15:54:47.533 30796-30796/org.bytedeco.javacv.recordactivity D/dalvikvm: Trying to load lib /data/app-lib/org.bytedeco.javacv.recordactivity-1/libjniavcodec.so 0x420bde50
11-10 15:54:47.533 30796-30796/org.bytedeco.javacv.recordactivity D/dalvikvm: Added shared lib /data/app-lib/org.bytedeco.javacv.recordactivity-1/libjniavcodec.so 0x420bde50
11-10 15:54:47.633 30796-30796/org.bytedeco.javacv.recordactivity D/dalvikvm: Trying to load lib /data/app-lib/org.bytedeco.javacv.recordactivity-1/libavformat.so 0x420bde50
11-10 15:54:47.633 30796-30796/org.bytedeco.javacv.recordactivity D/dalvikvm: Added shared lib /data/app-lib/org.bytedeco.javacv.recordactivity-1/libavformat.so 0x420bde50
11-10 15:54:47.633 30796-30796/org.bytedeco.javacv.recordactivity D/dalvikvm: No JNI_OnLoad found in /data/app-lib/org.bytedeco.javacv.recordactivity-1/libavformat.so 0x420bde50, skipping init
11-10 15:54:47.633 30796-30796/org.bytedeco.javacv.recordactivity D/dalvikvm: Trying to load lib /data/app-lib/org.bytedeco.javacv.recordactivity-1/libjniavformat.so 0x420bde50
11-10 15:54:47.643 30796-30796/org.bytedeco.javacv.recordactivity D/dalvikvm: Added shared lib /data/app-lib/org.bytedeco.javacv.recordactivity-1/libjniavformat.so 0x420bde50
11-10 15:54:47.693 30796-30796/org.bytedeco.javacv.recordactivity D/dalvikvm: Trying to load lib /data/app-lib/org.bytedeco.javacv.recordactivity-1/libswscale.so 0x420bde50
11-10 15:54:47.693 30796-30796/org.bytedeco.javacv.recordactivity D/dalvikvm: Added shared lib /data/app-lib/org.bytedeco.javacv.recordactivity-1/libswscale.so 0x420bde50
11-10 15:54:47.693 30796-30796/org.bytedeco.javacv.recordactivity D/dalvikvm: No JNI_OnLoad found in /data/app-lib/org.bytedeco.javacv.recordactivity-1/libswscale.so 0x420bde50, skipping init
11-10 15:54:47.693 30796-30796/org.bytedeco.javacv.recordactivity D/dalvikvm: Trying to load lib /data/app-lib/org.bytedeco.javacv.recordactivity-1/libjniswscale.so 0x420bde50
11-10 15:54:47.693 30796-30796/org.bytedeco.javacv.recordactivity D/dalvikvm: Added shared lib /data/app-lib/org.bytedeco.javacv.recordactivity-1/libjniswscale.so 0x420bde50
11-10 15:54:47.723 30796-30796/org.bytedeco.javacv.recordactivity I/RecordActivity: recorder initialize success
11-10 15:54:47.793 30796-30796/org.bytedeco.javacv.recordactivity I/System.out: Output #0, flv, to 'http://192.168.26.162:8090/live.flv':
11-10 15:54:47.793 30796-30796/org.bytedeco.javacv.recordactivity I/System.out:     Stream #0:0
11-10 15:54:47.803 30796-30796/org.bytedeco.javacv.recordactivity I/System.out: : Video: flv1 (flv), yuv420p, 320x240, q=2-31, 400 kb/s
11-10 15:54:47.803 30796-30796/org.bytedeco.javacv.recordactivity I/System.out: , 
11-10 15:54:47.803 30796-30796/org.bytedeco.javacv.recordactivity I/System.out: 30 tbn, 
11-10 15:54:47.803 30796-30796/org.bytedeco.javacv.recordactivity I/System.out: 30 tbc
11-10 15:54:47.803 30796-30796/org.bytedeco.javacv.recordactivity I/System.out:     Stream #0:1
11-10 15:54:47.803 30796-30796/org.bytedeco.javacv.recordactivity I/System.out: : Audio: aac, 44100 Hz, mono, fltp, 64 kb/s
11-10 15:54:47.853 30796-30796/org.bytedeco.javacv.recordactivity W/RecordActivity: Start Button Pushed
11-10 15:54:47.913 30796-31353/org.bytedeco.javacv.recordactivity D/dalvikvm: GC_FOR_ALLOC freed 1974K, 6% free 54960K/58092K, paused 35ms, total 35ms
11-10 15:54:47.913 30796-31353/org.bytedeco.javacv.recordactivity D/RecordActivity: audioRecord.startRecording()

Any ideas?

@jmbernal (Author):

I also run ffserver -d and ffplay http://192.168.26.162:8090/live.flv. After pressing the Stop button, if I close the server, the ffplay window shows:
http://192.168.26.162:8090/live.flv: Invalid data found when processing input

@saudet (Member) commented Nov 11, 2015

It looks like your app isn't writing enough audio frames or enough video frames. This is only a guess, but it may help to write a dummy video frame just before calling stop().
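
A rough sketch of that suggestion against the stopRecording() method above; the blank Frame here is only an illustrative guess, not a verified fix:

// Inside stopRecording(), just before recorderStreaming.stop():
try {
    // Push one final (empty) video frame so the muxer has a video packet to
    // interleave against any buffered audio before the trailer is written.
    Frame dummy = new Frame(imageWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
    long t = 1000 * (System.currentTimeMillis() - startTime);
    if (t > recorderStreaming.getTimestamp()) {
        recorderStreaming.setTimestamp(t);
    }
    recorderStreaming.record(dummy);
    recorderSDCard.record(dummy);
} catch (FFmpegFrameRecorder.Exception e) {
    e.printStackTrace();
}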

@jmbernal (Author):

I have found one error in my code. I was streaming to http://192.168.26.162:8090/live.flv when I should be using http://192.168.26.162:8090/feed1.ffm. Now the log is a bit different.
With `recorder.setFormat("mpeg")`:

11-11 09:38:26.815 9116-9116/org.bytedeco.javacv.recordactivity I/RecordActivity: recorder initialize success
11-11 09:38:26.895 9116-9116/org.bytedeco.javacv.recordactivity I/System.out: Output #0, mpeg, to 'http://192.168.26.162:8090/feed1.ffm':
11-11 09:38:26.895 9116-9116/org.bytedeco.javacv.recordactivity I/System.out:     Stream #0:0
11-11 09:38:26.895 9116-9116/org.bytedeco.javacv.recordactivity I/System.out: : Video: mpeg1video, yuv420p, 320x240, q=2-31, 400 kb/s
11-11 09:38:26.895 9116-9116/org.bytedeco.javacv.recordactivity I/System.out: , 
11-11 09:38:26.895 9116-9116/org.bytedeco.javacv.recordactivity I/System.out: 30 tbn, 
11-11 09:38:26.895 9116-9116/org.bytedeco.javacv.recordactivity I/System.out: 30 tbc
11-11 09:38:26.895 9116-9116/org.bytedeco.javacv.recordactivity I/System.out:     Stream #0:1
11-11 09:38:26.895 9116-9116/org.bytedeco.javacv.recordactivity I/System.out: : Audio: mp2, 44100 Hz, mono, s16, 64 kb/s
11-11 09:38:26.985 9116-9116/org.bytedeco.javacv.recordactivity W/System.err: Warning: [mpeg @ 0x63a648a0] VBV buffer size not set, using default size of 130KB
11-11 09:38:26.985 9116-9116/org.bytedeco.javacv.recordactivity W/System.err: If you want the mpeg file to be compliant to some specification
11-11 09:38:26.985 9116-9116/org.bytedeco.javacv.recordactivity W/System.err: Like DVD, VCD or others, make sure you set the correct buffer size
11-11 09:38:26.985 9116-9116/org.bytedeco.javacv.recordactivity W/RecordActivity: Start Button Pushed
11-11 09:38:27.055 9116-9130/org.bytedeco.javacv.recordactivity D/dalvikvm: GC_FOR_ALLOC freed 1975K, 6% free 53271K/56404K, paused 43ms, total 43ms
11-11 09:38:27.065 9116-11941/org.bytedeco.javacv.recordactivity D/RecordActivity: audioRecord.startRecording()

In the ffplay window, I can see:

http://192.168.26.162:8090/live.mpeg: Invalid data found when processing input

If recorder.setFormat("flv")`

11-11 09:46:13.022 17900-17900/org.bytedeco.javacv.recordactivity I/RecordActivity: recorder initialize success
11-11 09:46:13.092 17900-17900/org.bytedeco.javacv.recordactivity I/System.out: Output #0, flv, to 'http://192.168.26.162:8090/feed1.ffm':
11-11 09:46:13.092 17900-17900/org.bytedeco.javacv.recordactivity I/System.out:     Stream #0:0
11-11 09:46:13.092 17900-17900/org.bytedeco.javacv.recordactivity I/System.out: : Video: flv1 (flv), yuv420p, 320x240, q=2-31, 400 kb/s
11-11 09:46:13.092 17900-17900/org.bytedeco.javacv.recordactivity I/System.out: , 
11-11 09:46:13.092 17900-17900/org.bytedeco.javacv.recordactivity I/System.out: 30 tbn, 
11-11 09:46:13.092 17900-17900/org.bytedeco.javacv.recordactivity I/System.out: 30 tbc
11-11 09:46:13.092 17900-17900/org.bytedeco.javacv.recordactivity I/System.out:     Stream #0:1
11-11 09:46:13.092 17900-17900/org.bytedeco.javacv.recordactivity I/System.out: : Audio: aac, 44100 Hz, mono, fltp, 64 kb/s
11-11 09:46:13.132 17900-17900/org.bytedeco.javacv.recordactivity W/RecordActivity: Start Button Pushed
11-11 09:46:13.132 17900-17900/org.bytedeco.javacv.recordactivity I/Choreographer: Skipped 64 frames!  The application may be doing too much work on its main thread.
11-11 09:46:13.162 17900-19069/org.bytedeco.javacv.recordactivity D/RecordActivity: audioRecord.startRecording()

In the ffplay window, I can see:

[flv @ 0x7f19e80008c0] Could not find codec parameters for stream 0 (Video: none, none, 64 kb/s): unknown codec
Consider increasing the value for the 'analyzeduration' and 'probesize' options
[flv @ 0x7f19e80008c0] Could not find codec parameters for stream 1 (Audio: none, 0 channels, 64 kb/s): unknown codec
Consider increasing the value for the 'analyzeduration' and 'probesize' options
http://192.168.26.162:8090/live.flv: could not find codec parameters

Which way should I choose?
Thanks.

@saudet (Member) commented Nov 12, 2015

Do you get the same thing when using ffmpeg on the command line?

@jmbernal (Author):

Absolutely not. If I use `ffserver -d -f /etc/ffserver.conf`, `ffmpeg -f v4l2 -s 320x240 -r 25 -i /dev/video0 -f alsa -ac 2 -i hw:0 http://localhost:8090/feed1.ffm`, and `ffplay http://localhost:8090/live.flv`, I can see the video correctly.

@saudet (Member) commented Nov 13, 2015

So let's see. Which codecs is ffmpeg using in that case? Have you tried using those same audio and video codecs with FFmpegFrameRecorder? And have you tried setFormat("ffm")?

Also, does the same thing happen with FFmpegFrameRecorder on the desktop? Or does it happen only on Android?
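
For illustration, a sketch of matching the feed's codecs on the Android side; the constants assume the org.bytedeco.javacpp.avcodec class bundled with JavaCV, and the codec IDs should mirror whatever AudioCodec/VideoCodec the ffserver.conf actually declares:

import org.bytedeco.javacpp.avcodec;

recorderStreaming = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
recorderStreaming.setFormat("ffm");                              // feed container expected by ffserver
recorderStreaming.setVideoCodec(avcodec.AV_CODEC_ID_MPEG1VIDEO); // match VideoCodec in ffserver.conf
recorderStreaming.setAudioCodec(avcodec.AV_CODEC_ID_MP2);        // match AudioCodec in ffserver.conf
recorderStreaming.setSampleRate(sampleAudioRateInHz);
recorderStreaming.setFrameRate(videoFrameRate);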

@jmbernal (Author):

The codecs in ffserver.conf? They are always the same:

<Stream live.mpeg>
    Feed feed1.ffm
    Format ffm
......
    AudioCodec mp2
    VideoCodec mpeg2video
......

Now I have tried recorder.setFormat("ffm"). I need to use VideoCodec mpeg1video. I can see just the first few seconds of the video. The server window shows:

Mon Nov 16 14:53:11 2015 192.168.26.211 - - [POST] "/feed1.ffm HTTP/1.1" 200 667648
Mon Nov 16 14:53:11 2015 [mpeg @ 0x3b14af0]Application provided invalid, non monotonically increasing dts to muxer in stream 0: 900000 >= 858000
Mon Nov 16 14:53:11 2015 Error writing frame to output for stream 'live.mpeg': Invalid argument

And the ffplay window shows:

[mpeg2video @ 0x7f9a9c004de0] Invalid frame dimensions 0x0. f=0/0   
    Last message repeated 1 times
Input #0, mpeg, from 'http://192.168.26.162:8090/live.mpeg':f=0/0   
  Duration: N/A, start: 0.000000, bitrate: 104921 kb/s
    Stream #0:0[0x1e0]: Video: mpeg1video, yuv420p(tv), 320x240 [SAR 1:1 DAR 4:3], 104857 kb/s, 30 fps, 30 tbr, 90k tbn, 30 tbc
    Stream #0:1[0x1c0]: Audio: mp2, 44100 Hz, mono, s16p, 64 kb/s
[mpeg1video @ 0x7f9a9c004de0] ac-tex damaged at 0 10=    0B f=0/0   
[mpeg1video @ 0x7f9a9c004de0] Warning MVs not available
[mpeg1video @ 0x7f9a9c004de0] concealing 168 DC, 168 AC, 168 MV errors in P frame

Now it's very close!

jmbernal reopened this Nov 16, 2015
@saudet (Member) commented Nov 17, 2015

You might want to try with a codec that is a bit more resistant to corrupted data than mpeg2video, such as h264 maybe...

@jmbernal (Author):

If I use VideoCodec h264 in ffserver.conf, I get this error when I start ffserver:

/etc/ffserver.conf:33: Invalid codec name: 'h264'
/etc/ffserver.conf:40: Setting default value for audio bit rate = 64000. Use NoDefaults to disable it.
/etc/ffserver.conf:40: Setting default value for video bit rate tolerance = 21333. Use NoDefaults to disable it.
/etc/ffserver.conf:40: Setting default value for video rate control equation = tex^qComp. Use NoDefaults to disable it.
/etc/ffserver.conf:40: Setting default value for video max rate = 128000. Use NoDefaults to disable it.
Error reading configuration file '/etc/ffserver.conf': Invalid argument

So I'm trying with h263 or mpeg1video, and strange things happen: I start ffplay and nothing happens. But if I stop streaming from Android, a video window (from ffplay) suddenly opens and I can see a few seconds of the video, not the full video. This window shows no error.

In the ffserver window, I can see:

Wed Nov 18 11:41:38 2015 192.168.26.211 - - New connection: POST /feed1.ffm
Wed Nov 18 11:41:48 2015 192.168.26.162 - - New connection: GET /live.mpeg
Wed Nov 18 11:41:48 2015 [mpeg @ 0x3902690]VBV buffer size not set, using default size of 130KB
If you want the mpeg file to be compliant to some specification
Like DVD, VCD or others, make sure you set the correct buffer size

@saudet (Member) commented Nov 19, 2015

Well, you should use a version of ffserver that supports h264.

@jmbernal (Author):

This is my `ffserver -version` output:

ffserver version N-76045-g97be5d4 Copyright (c) 2000-2015 the FFmpeg developers
built with gcc 4.8 (Ubuntu 4.8.4-2ubuntu1~14.04)
configuration: --extra-libs=-ldl --prefix=/opt/ffmpeg --enable-avresample --disable-debug --enable-nonfree --enable-gpl --enable-version3 --enable-libopencore-amrnb --enable-libopencore-amrwb --disable-decoder=amrnb --disable-decoder=amrwb --enable-libpulse --enable-libdcadec --enable-libfreetype --enable-libx264 --enable-libx265 --enable-libfdk-aac --enable-libvorbis --enable-libmp3lame --enable-libopus --enable-libvpx --enable-libspeex --enable-libass --enable-avisynth --enable-libsoxr --enable-libxvid --enable-libvo-aacenc --enable-libvidstab
libavutil      55.  4.100 / 55.  4.100
libavcodec     57.  6.100 / 57.  6.100
libavformat    57.  4.100 / 57.  4.100
libavdevice    57.  0.100 / 57.  0.100
libavfilter     6. 11.100 /  6. 11.100
libavresample   3.  0.  0 /  3.  0.  0
libswscale      4.  0.100 /  4.  0.100
libswresample   2.  0.100 /  2.  0.100
libpostproc    54.  0.100 / 54.  0.100

Which version should I use?

@saudet (Member) commented Nov 19, 2015

Maybe it wants libx264 as "codec name"?
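
If it helps, a hedged sketch of what that Stream block could look like with a libx264-enabled build; the directive values are copied from the configuration shown earlier in this thread, not a tested setup:

<Stream live.flv>
    Feed feed1.ffm
    Format flv
    VideoCodec libx264
    VideoBitRate 128
    VideoFrameRate 30
    VideoSize 320x240
    NoAudio
</Stream>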

@jmbernal (Author):

I do not understand. You mean `VideoCodec libx264` in ffserver.conf? If so, I have tried it and get the same result (a short video after closing the stream from Android).

@saudet (Member) commented Nov 20, 2015

So, what error message do you get in that case in Android?

@jmbernal (Author):

Sorry, saudet, but I cannot work on this anymore.
Thanks for your help.

@saudet (Member) commented Nov 26, 2015

No problem! Thanks for your time. I hope we can find a way to make this work better eventually. :)

@saudet (Member) commented Nov 26, 2015

One last thing I thought about, though: according to ffserver's documentation, the "ffm" format is not compatible between different versions of FFmpeg, so we might want to try the "ffm2" format instead, which is more stable: https://www.ffmpeg.org/ffserver.html
