
Turning On/Off Audio recording while Streaming #19

Closed
vaibhavbparikh opened this issue Aug 9, 2014 · 13 comments

@vaibhavbparikh

Hi,

I am streaming video from Android to Android with this library and it works great. However, I am facing an issue when streaming video only, without audio, over RTMP.

In my record activity, when I stop audio recording with a flag and then flip the flag back to start audio recording again, the audio and video streams fall out of sync: the video stream lags far behind the live audio stream, and both have a lot of glitches.

Any help would be much appreciated.

Here is my code:

public class MainActivity extends Activity implements OnClickListener {

    private final static String LOG_TAG = "MainActivity";

    private PowerManager.WakeLock mWakeLock;

    private String ffmpeg_link = "rtmp://192.168.2.220:1935/videochat/vb";

    private volatile FFmpegFrameRecorder recorder;
    boolean recording = false;
    long startTime = 0;

    private int sampleAudioRateInHz = 44100;
    private int imageWidth = 320;
    private int imageHeight = 240;
    private int frameRate = 30;

    private Thread audioThread;
    volatile boolean runAudioThread = true;
    private volatile AudioRecord audioRecord;
    private AudioRecordRunnable audioRecordRunnable;

    private CameraView cameraView;
    private IplImage yuvIplimage = null;

    private Button recordButton, audio_control;
    private LinearLayout mainLayout;

    private volatile boolean audioStatus = true;



    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
        setContentView(R.layout.activity_main);

        initLayout();
        initRecorder();
    }

    @Override
    protected void onResume() {
        super.onResume();

        if (mWakeLock == null) {
            PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
            mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK, LOG_TAG);
            mWakeLock.acquire();
        }
    }

    @Override
    protected void onPause() {
        super.onPause();

        if (mWakeLock != null) {
            mWakeLock.release();
            mWakeLock = null;
        }
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();

        recording = false;
    }

    private void initLayout() {

        mainLayout = (LinearLayout) this.findViewById(R.id.record_layout);

        recordButton = (Button) findViewById(R.id.recorder_control);
        audio_control = (Button) findViewById(R.id.audio_control);

        audio_control.setText("off");
        recordButton.setText("Start");

        audio_control.setOnClickListener(new OnClickListener() {

            @Override
            public void onClick(View v) {
                if (audio_control.getText().toString().equals("off")) {
                    audioStatus = false;
                    audio_control.setText("on");
                } else {
                    startTime = System.currentTimeMillis();
                    audioStatus = true;
                    audio_control.setText("off");
                }

            }
        });
        recordButton.setOnClickListener(this);

        cameraView = new CameraView(this);

        LinearLayout.LayoutParams layoutParam = new LinearLayout.LayoutParams(imageWidth, imageHeight);
        mainLayout.addView(cameraView, layoutParam);
        Log.v(LOG_TAG, "added cameraView to mainLayout");
    }

    private void initRecorder() {
        Log.w(LOG_TAG, "initRecorder");

        if (yuvIplimage == null) {
            yuvIplimage = IplImage.create(imageWidth, imageHeight, opencv_core.IPL_DEPTH_8U, 2);
            Log.v(LOG_TAG, "IplImage.create");
        }

        recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
        Log.v(LOG_TAG, "FFmpegFrameRecorder: " + ffmpeg_link + " imageWidth: " + imageWidth + " imageHeight " + imageHeight);
        recorder.setInterleaved(false);
        recorder.setFormat("flv");
        Log.v(LOG_TAG, "recorder.setFormat(\"flv\")");

        recorder.setSampleRate(sampleAudioRateInHz);
        Log.v(LOG_TAG, "recorder.setSampleRate(sampleAudioRateInHz)");

        // re-set in the surface changed method as well
        recorder.setFrameRate(frameRate);
        Log.v(LOG_TAG, "recorder.setFrameRate(frameRate)");

        // Create audio recording thread
        audioRecordRunnable = new AudioRecordRunnable();
        audioThread = new Thread(audioRecordRunnable);
    }

    // Start the capture
    public void startRecording() {
        try {
            recorder.start();
            startTime = System.currentTimeMillis();
            recording = true;
            audioThread.start();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    public void stopRecording() {
        // This should stop the audio thread from running
        runAudioThread = false;

        if (recorder != null && recording) {
            recording = false;
            Log.v(LOG_TAG, "Finishing recording, calling stop and release on recorder");
            try {
                recorder.stop();
                recorder.release();
            } catch (Exception e) {
                e.printStackTrace();
            }
            recorder = null;
        }
    }

    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {
        // Quit when back button is pushed
        if (keyCode == KeyEvent.KEYCODE_BACK) {
            if (recording) {
                stopRecording();
            }
            finish();
            return true;
        }
        return super.onKeyDown(keyCode, event);
    }

    @Override
    public void onClick(View v) {
        if (!recording) {
            startRecording();
            Log.w(LOG_TAG, "Start Button Pushed");
            recordButton.setText("Stop");
        } else {
            stopRecording();
            Log.w(LOG_TAG, "Stop Button Pushed");
            recordButton.setText("Start");
        }
    }

    // ---------------------------------------------
    // audio thread, gets and encodes audio data
    // ---------------------------------------------
    class AudioRecordRunnable implements Runnable {

        @SuppressWarnings("deprecation")
        @Override
        public void run() {
            // Set the thread priority
            android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

            // Audio
            int bufferSize;
            short[] audioData;
            int bufferReadResult;

            bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz, AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT);
            audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz, AudioFormat.CHANNEL_CONFIGURATION_MONO,
                    AudioFormat.ENCODING_PCM_16BIT, bufferSize);

            audioData = new short[bufferSize];

            Log.d(LOG_TAG, "audioRecord.startRecording()");
            audioRecord.startRecording();

            // Audio Capture/Encoding Loop
            while (runAudioThread) {
                if (audioStatus) {
                    // Read from audioRecord
                    bufferReadResult = audioRecord.read(audioData, 0, audioData.length);
                    if (bufferReadResult > 0) {
                        // Log.v(LOG_TAG,"audioRecord bufferReadResult: " +
                        // bufferReadResult);
                        try {
                            // Write to FFmpegFrameRecorder
                            Buffer[] buffer = { ShortBuffer.wrap(audioData, 0, bufferReadResult) };
                            recorder.record(buffer);
                        } catch (Exception e) {
                            Log.v(LOG_TAG, e.getMessage());
                            e.printStackTrace();
                        }
                    }
                }
            }

            Log.v(LOG_TAG, "AudioThread Finished");

            /* Capture/Encoding finished, release recorder */
            if (audioRecord != null) {
                audioRecord.stop();
                audioRecord.release();
                audioRecord = null;
                Log.v(LOG_TAG, "audioRecord released");
            }
        }
    }
    class CameraView extends SurfaceView implements SurfaceHolder.Callback, PreviewCallback {

        private boolean previewRunning = false;

        private SurfaceHolder holder;
        private Camera camera;

        private byte[] previewBuffer;

        long videoTimestamp = 0;

        Bitmap bitmap;
        Canvas canvas;

        public CameraView(Context _context) {
            super(_context);

            holder = this.getHolder();
            holder.addCallback(this);
            holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
        }

        @Override
        public void surfaceCreated(SurfaceHolder holder) {
            camera = Camera.open();

            try {
                camera.setPreviewDisplay(holder);
                camera.setPreviewCallback(this);

                Camera.Parameters currentParams = camera.getParameters();
                Log.v(LOG_TAG, "Preview Framerate: " + currentParams.getPreviewFrameRate());
                Log.v(LOG_TAG, "Preview imageWidth: " + currentParams.getPreviewSize().width + " imageHeight: "
                        + currentParams.getPreviewSize().height);

                // Use these values
                imageWidth = currentParams.getPreviewSize().width;
                imageHeight = currentParams.getPreviewSize().height;
                frameRate = currentParams.getPreviewFrameRate();

                bitmap = Bitmap.createBitmap(imageWidth, imageHeight, Bitmap.Config.ALPHA_8);


                camera.startPreview();
                previewRunning = true;
            } catch (IOException e) {
                Log.v(LOG_TAG, e.getMessage());
                e.printStackTrace();
            }
        }

        public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
            Log.v(LOG_TAG, "Surface Changed: width " + width + " height: " + height);


            // Get the current parameters
            Camera.Parameters currentParams = camera.getParameters();
            Log.v(LOG_TAG, "Preview Framerate: " + currentParams.getPreviewFrameRate());
            Log.v(LOG_TAG, "Preview imageWidth: " + currentParams.getPreviewSize().width + " imageHeight: " + currentParams.getPreviewSize().height);

            // Use these values
            imageWidth = currentParams.getPreviewSize().width;
            imageHeight = currentParams.getPreviewSize().height;
            frameRate = currentParams.getPreviewFrameRate();

            // Create the yuvIplimage if needed
            yuvIplimage = IplImage.create(imageWidth, imageHeight, opencv_core.IPL_DEPTH_8U, 2);
        }

        @Override
        public void surfaceDestroyed(SurfaceHolder holder) {
            try {
                camera.setPreviewCallback(null);

                previewRunning = false;
                camera.release();

            } catch (RuntimeException e) {
                Log.v(LOG_TAG, e.getMessage());
                e.printStackTrace();
            }
        }

        @Override
        public void onPreviewFrame(byte[] data, Camera camera) {

            if (yuvIplimage != null && recording) {
                videoTimestamp = 1000 * (System.currentTimeMillis() - startTime);

                yuvIplimage.getByteBuffer().put(data);
                try {
                    // Get the correct time
                    recorder.setTimestamp(videoTimestamp);

                    // Record the image into FFmpegFrameRecorder
                    recorder.record(yuvIplimage);

                } catch (Exception e) {
                    Log.v(LOG_TAG, e.getMessage());
                    e.printStackTrace();
                }
            }
        }
    }
}
@saudet saudet added the question label Aug 9, 2014
@saudet
Member

saudet commented Aug 9, 2014

When you want to "stop audio recording", instead of actually stopping, simply keep recording, but record silence. That's the easiest way I see to accomplish that.
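In terms of the capture loop in the code above, that suggestion could look like the following sketch (the `maybeSilence` helper and class name are illustrative, not part of the library):

```java
import java.util.Arrays;

public class SilenceSketch {
    // Illustrative helper: when audio is toggled "off", overwrite the
    // captured samples with zeros instead of skipping the write. The
    // recorder then keeps receiving audio frames at a steady rate, so
    // the audio clock that drives A/V sync never stops.
    static short[] maybeSilence(short[] audioData, boolean audioStatus) {
        if (!audioStatus) {
            Arrays.fill(audioData, (short) 0);
        }
        return audioData;
    }

    public static void main(String[] args) {
        short[] samples = {100, -200, 300};
        // Audio enabled: samples pass through untouched
        System.out.println(Arrays.toString(maybeSilence(samples.clone(), true)));
        // Audio "muted": the recorder still gets a full buffer, just of zeros
        System.out.println(Arrays.toString(maybeSilence(samples.clone(), false)));
    }
}
```

In the `AudioRecordRunnable` loop, this would run after every `audioRecord.read(...)`, with `recorder.record(...)` always called, instead of gating the whole read/record block on `audioStatus`.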

@saudet saudet closed this as completed Aug 9, 2014
@vaibhavbparikh
Author

Hi Samuel,

Thanks for the help.

I have already tried that scenario.

In that case we get an array of zeros for audioData, and bufferReadResult is also 0, so those bytes are never recorded.

We also tried removing the if (bufferReadResult > 0) check from the audio thread, but to no avail.

It streams well at the start with audioStatus true, and also after we change audioStatus to false (no audio).

But when I change audioStatus back to true, it enters the if (bufferReadResult > 0) condition and records the audio with the FFmpegFrameRecorder object, but it does not stream well.

The video has a lot of delay and distortion, and there is no audio at all.

Please try this scenario.

Thanks in advance.

@saudet
Member

saudet commented Aug 9, 2014

That sounds like a problem with the Android API, does it not?

@vaibhavbparikh
Author

I don't think so. When we send muted data, the recorder accepts those values, but when we send real audio data again it does not behave properly. So there must be some mechanism that chokes on the real audio data at that point and misbehaves.

Two things I could not understand:

  1. Why does sending audio data again affect the video frames as well?
  2. I am not getting any exceptions or errors from the library either.

Any hint or help would be much appreciated.

@saudet
Member

saudet commented Aug 9, 2014

Like I said, do not stop sending audio data, and that's it: it will work. The audio data acts as a timer, so it is not recommended to stop it. Please try it.

@saudet
Member

saudet commented Aug 10, 2014

Here's another alternative to manually recording silence: set the volume to 0 through the Android API and, of course, keep recording.

@vaibhavbparikh
Author

Hey Samuel, thanks for your inputs. Setting the volume to 0 puts the device speaker on silent mode, not the device microphone, so it keeps recording whatever comes from the mic!

By the way, I found a solution to this!

In FFmpegFrameRecorder's record(AVFrame frame) we are not sending audio, so I wrapped the write in a condition as follows:

        /* write the compressed frame in the media file */
        synchronized (oc) {
            if (isAudio) {
                if (interleaved && video_st != null) {
                    if ((ret = av_interleaved_write_frame(oc, audio_pkt)) < 0) {
                        throw new Exception("av_interleaved_write_frame() error " + ret + " while writing interleaved audio frame.");
                    }
                } else {
                    if ((ret = av_write_frame(oc, audio_pkt)) < 0) {
                        throw new Exception("av_write_frame() error " + ret + " while writing audio frame.");
                    }
                }
            }
        }

Hope that helps someone facing the same issue.

@vaibhavbparikh
Author

In the same case, when I initially (before starting streaming) set isAudio to false, it creates an issue.

Hope I will find a solution to this as well. See if you can reproduce the same.

Thanks

@saudet
Member

saudet commented Aug 11, 2014

Can you find out if this solution is officially supported by the FLV format? If it is, then we'll add this change to FFmpegFrameRecorder, thanks! (I don't think it is though, so you should simply record samples full of zeros as previously explained: It's a simple solution, and it works for sure with any format. So please stop trying to think about this too much and do it the simple way, thank you.)

@IgeNiaI

IgeNiaI commented Nov 17, 2014

Here is my solution for this:

audioData = new short[bufferSize];
...
if (!recordingSound) {
    for (int i = 0; i < audioData.length; i++) {
        audioData[i] = 0;
    }
}

Works pretty well and doesn't seem to be too slow.

@PRB16

PRB16 commented Mar 17, 2016

@vaibhavbparikh Did you get a solution for this issue? Can you please post it?

@rayalarajee

Hi,

Did anybody find a solution? Please respond; I have been stuck on this for a week.
Thanks in advance

@balbelias

balbelias commented Dec 21, 2016

Solved it using the Android API.

    public void toggleAudio(boolean enable) {
        AudioManager audioManager = (AudioManager) this.getSystemService(Context.AUDIO_SERVICE);
        // Only change the mute state when it differs from the desired one
        // (the desired mute state is !enable)
        if (audioManager.isMicrophoneMute() == enable) {
            audioManager.setMicrophoneMute(!enable);
        }
    }

Muting the microphone makes AudioRecord produce zeroed buffers.
