Trouble saving a video file with WebRTC in Android


Problem description

I am developing a WebRTC-based video chat app. The video call is currently working, but I want to record the remote video stream using the VideoFileRenderer. There are many implementations of the interface, for example: https://chromium.googlesource.com/external/webrtc/+/master/sdk/android/api/org/webrtc/VideoFileRenderer.java (this is the implementation I am using). It saves the video to a file with no problem, but I can only play the file on desktop, and only with a player that has the right codec, because the file is .y4m, not .mp4. When I try to play it with a VideoView it says it can't play the video, and even the video player that comes with Android can't play it. I can only play it with MXPlayer, VLC, or any other application on desktop that has the codecs.

To simplify the question: how can I play video.y4m in the native Android VideoView?

I will simplify it even more: assume I don't know the format of the recorded file. Here is the code I am using to record it:

When recording starts:

remoteVideoFileRenderer = new VideoFileRenderer(
        fileToRecordTo.getAbsolutePath(),
        640,
        480,
        rootEglBase.getEglBaseContext());
remoteVideoTrack.addSink(remoteVideoFileRenderer);

When recording finishes:

remoteVideoFileRenderer.release();

Now the question again: I have a "fileToRecordTo", and this video file can be played in GOM (Windows), VLC (Windows, Mac and Android), and MXPlayer (Android), but I can neither play it with the player that comes embedded with Android (if it worked, I would have used that player in my app) nor with the Android native VideoView.

Any help is appreciated.

Solution

Video-only recording

I had a similar case in my project. At first I tried WebRTC's default VideoFileRenderer, but the file size was far too big because no compression is applied. Then I found this repository, which really helped in my case: https://github.com/cloudwebrtc/flutter-webrtc
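
For a sense of scale: the default renderer writes raw I420 frames (about 1.5 bytes per pixel), so a 640x480 stream at 30 fps comes to roughly 640 x 480 x 1.5 x 30 ≈ 13.8 MB per second, or about 830 MB per minute, while the H.264 encoder configured below at 6 Mbit/s produces on the order of 0.75 MB per second.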

Here is a step-by-step guide. I've also made some adjustments.

Add this class to your project. It has lots of options to configure the final video format.

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.media.MediaMuxer;
import android.os.Handler;
import android.os.HandlerThread;
import android.util.Log;
import android.view.Surface;

import org.webrtc.EglBase;
import org.webrtc.GlRectDrawer;
import org.webrtc.VideoFrame;
import org.webrtc.VideoFrameDrawer;
import org.webrtc.VideoSink;

import java.io.IOException;
import java.nio.ByteBuffer;

class FileEncoder implements VideoSink {
    private static final String TAG = "FileRenderer";
    private final HandlerThread renderThread;
    private final Handler renderThreadHandler;
    private int outputFileWidth = -1;
    private int outputFileHeight = -1;
    private ByteBuffer[] encoderOutputBuffers;
    private EglBase eglBase;
    private EglBase.Context sharedContext;
    private VideoFrameDrawer frameDrawer;
    private static final String MIME_TYPE = "video/avc";    // H.264 Advanced Video Coding
    private static final int FRAME_RATE = 30;               // 30fps
    private static final int IFRAME_INTERVAL = 5;           // 5 seconds between I-frames
    private MediaMuxer mediaMuxer;
    private MediaCodec encoder;
    private MediaCodec.BufferInfo bufferInfo;
    private int trackIndex = -1;
    private boolean isRunning = true;
    private GlRectDrawer drawer;
    private Surface surface;

    FileEncoder(String outputFile, final EglBase.Context sharedContext) throws IOException {
        renderThread = new HandlerThread(TAG + "RenderThread");
        renderThread.start();
        renderThreadHandler = new Handler(renderThread.getLooper());
        bufferInfo = new MediaCodec.BufferInfo();
        this.sharedContext = sharedContext;

        mediaMuxer = new MediaMuxer(outputFile,
                MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
    }

    private void initVideoEncoder() {
        MediaFormat format = MediaFormat.createVideoFormat(MIME_TYPE, 1280, 720);

        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 6000000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, FRAME_RATE);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, IFRAME_INTERVAL);

        try {
            encoder = MediaCodec.createEncoderByType(MIME_TYPE);
            encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            renderThreadHandler.post(() -> {
                eglBase = EglBase.create(sharedContext, EglBase.CONFIG_RECORDABLE);
                surface = encoder.createInputSurface();
                eglBase.createSurface(surface);
                eglBase.makeCurrent();
                drawer = new GlRectDrawer();
            });
        } catch (Exception e) {
            Log.wtf(TAG, e);
        }
    }

    @Override
    public void onFrame(VideoFrame frame) {
        frame.retain();
        if (outputFileWidth == -1) {
            outputFileWidth = frame.getRotatedWidth();
            outputFileHeight = frame.getRotatedHeight();
            initVideoEncoder();
        }
        renderThreadHandler.post(() -> renderFrameOnRenderThread(frame));
    }

    private void renderFrameOnRenderThread(VideoFrame frame) {
        if (frameDrawer == null) {
            frameDrawer = new VideoFrameDrawer();
        }
        frameDrawer.drawFrame(frame, drawer, null, 0, 0, outputFileWidth, outputFileHeight);
        frame.release();
        drainEncoder();
        eglBase.swapBuffers();
    }

    /**
     * Release all resources. All already posted frames will be rendered first.
     */
    void release() {
        isRunning = false;
        renderThreadHandler.post(() -> {
            if (encoder != null) {
                encoder.stop();
                encoder.release();
            }
            eglBase.release();
            mediaMuxer.stop();
            mediaMuxer.release();
            renderThread.quit();
        });
    }

    private boolean encoderStarted = false;
    private volatile boolean muxerStarted = false;
    private long videoFrameStart = 0;

    private void drainEncoder() {
        if (!encoderStarted) {
            encoder.start();
            encoderOutputBuffers = encoder.getOutputBuffers();
            encoderStarted = true;
            return;
        }
        while (true) {
            int encoderStatus = encoder.dequeueOutputBuffer(bufferInfo, 10000);
            if (encoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
                break;
            } else if (encoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                // not expected for an encoder
                encoderOutputBuffers = encoder.getOutputBuffers();
                Log.e(TAG, "encoder output buffers changed");
            } else if (encoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                // not expected for an encoder
                MediaFormat newFormat = encoder.getOutputFormat();

                Log.e(TAG, "encoder output format changed: " + newFormat);
                trackIndex = mediaMuxer.addTrack(newFormat);
                if (!muxerStarted) {
                    mediaMuxer.start();
                    muxerStarted = true;
                }
                if (!muxerStarted)
                    break;
            } else if (encoderStatus < 0) {
                Log.e(TAG, "unexpected result fr om encoder.dequeueOutputBuffer: " + encoderStatus);
            } else { // encoderStatus >= 0
                try {
                    ByteBuffer encodedData = encoderOutputBuffers[encoderStatus];
                    if (encodedData == null) {
                        Log.e(TAG, "encoderOutputBuffer " + encoderStatus + " was null");
                        break;
                    }
                    // It's usually necessary to adjust the ByteBuffer values to match BufferInfo.
                    encodedData.position(bufferInfo.offset);
                    encodedData.limit(bufferInfo.offset + bufferInfo.size);
                    if (videoFrameStart == 0 && bufferInfo.presentationTimeUs != 0) {
                        videoFrameStart = bufferInfo.presentationTimeUs;
                    }
                    bufferInfo.presentationTimeUs -= videoFrameStart;
                    if (muxerStarted)
                        mediaMuxer.writeSampleData(trackIndex, encodedData, bufferInfo);
                    isRunning = isRunning && (bufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) == 0;
                    encoder.releaseOutputBuffer(encoderStatus, false);
                    if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                        break;
                    }
                } catch (Exception e) {
                    Log.wtf(TAG, e);
                    break;
                }
            }
        }
    }

}
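
One thing worth double-checking in this class: initVideoEncoder() configures the encoder at a hard-coded 1280x720, while renderFrameOnRenderThread() uses outputFileWidth/outputFileHeight (taken from the first frame) as the GL viewport, so the two sizes can disagree. A minimal tweak, assuming your device's H.264 encoder accepts the incoming frame dimensions (most expect even width and height):

private void initVideoEncoder() {
    // Use the size of the first received frame instead of the hard-coded 1280x720,
    // so the encoder's input Surface matches the viewport used in drawFrame().
    MediaFormat format = MediaFormat.createVideoFormat(MIME_TYPE, outputFileWidth, outputFileHeight);
    // ... the rest of the method stays the same
}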

Now, in your Activity/Fragment class:

Declare a variable of the above class:

FileEncoder recording;

When you receive the stream you want to record (remote or local), you can initialize the recording:

recording = new FileEncoder("path/to/video", rootEglBase.getEglBaseContext());
remoteVideoTrack.addSink(recording);
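
The FileEncoder constructor is declared with throws IOException (creating the MediaMuxer can fail), so you need to handle that; a minimal sketch, assuming the recording goes into the app's external files directory (the file name is only illustrative):

File fileToRecordTo = new File(getExternalFilesDir(null), "remote_call.mp4"); // illustrative name
try {
    recording = new FileEncoder(fileToRecordTo.getAbsolutePath(), rootEglBase.getEglBaseContext());
    remoteVideoTrack.addSink(recording);
} catch (IOException e) {
    Log.e("Recording", "Could not start recording", e);
}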

When the call session ends, you need to stop and release the recording.

remoteVideoTrack.removeSink(recording);
recording.release();
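
Note that release() only posts the cleanup to the render thread and returns immediately, so the .mp4 may not be finalized yet when the next statement runs. If you want to merge or upload the file right away, here is a sketch of an overload with a completion callback (the overload and onFinished are my additions, not part of the original class):

void release(Runnable onFinished) {
    isRunning = false;
    renderThreadHandler.post(() -> {
        if (encoder != null) {
            encoder.stop();
            encoder.release();
        }
        eglBase.release();
        mediaMuxer.stop();
        mediaMuxer.release();
        renderThread.quit();
        if (onFinished != null) {
            onFinished.run(); // the output file is fully written at this point
        }
    });
}

You would then call something like recording.release(() -> runOnUiThread(this::mergeAudioAndVideo)), where mergeAudioAndVideo is your own method wrapping the mp4parser snippet below.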

This is enough to record the video but without audio.

Video & Audio recording

To record the local peer's audio you need to use this class: https://webrtc.googlesource.com/src/+/master/examples/androidapp/src/org/appspot/apprtc/RecordedAudioToFileController.java. But first you need to set up an AudioDeviceModule object:

AudioDeviceModule adm = createJavaAudioDevice();
peerConnectionFactory = PeerConnectionFactory.builder()
    .setOptions(options)
    .setAudioDeviceModule(adm)
    .setVideoEncoderFactory(defaultVideoEncoderFactory)
    .setVideoDecoderFactory(defaultVideoDecoderFactory)
    .createPeerConnectionFactory();
adm.release();

private AudioDeviceModule createJavaAudioDevice() {
    // Implement AudioRecordErrorCallback
    // Implement AudioTrackErrorCallback
    return JavaAudioDeviceModule.builder(this)
        .setSamplesReadyCallback(audioRecorder)
        // The default audio source is VOICE_COMMUNICATION, which is good for VoIP sessions.
        // You can change it to the audio source you want.
        .setAudioSource(MediaRecorder.AudioSource.VOICE_COMMUNICATION)
        .setAudioRecordErrorCallback(audioRecordErrorCallback)
        .setAudioTrackErrorCallback(audioTrackErrorCallback)
        .createAudioDeviceModule();
}
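
The builder above passes an audioRecorder callback that isn't shown. Here is a minimal wiring sketch, assuming you copy RecordedAudioToFileController from the apprtc example into your project (check its constructor and start()/stop() signatures against the version you copy):

ExecutorService saveAudioExecutor = Executors.newSingleThreadExecutor();
RecordedAudioToFileController audioRecorder = new RecordedAudioToFileController(saveAudioExecutor);

// audioRecorder is the object handed to setSamplesReadyCallback(...) in createJavaAudioDevice().
audioRecorder.start();   // begin writing microphone samples when the call starts
// ...
audioRecorder.stop();    // stop writing when the call ends

Keep in mind that this controller writes raw PCM samples to a file, so before the merge step below the audio generally has to be encoded into something mp4parser can read (for example AAC in an .m4a container).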

Merge audio and video

Add this dependency

implementation 'com.googlecode.mp4parser:isoparser:1.1.22'

Then add this snippet to your code when the call finishes. Make sure the video and audio recordings have been stopped and released properly.

try {
    Movie video = MovieCreator.build("path/to/recorded/video");
    Movie audio = MovieCreator.build("path/to/recorded/audio");
    Track audioTrack = audio.getTracks().get(0);
    video.addTrack(audioTrack);
    Container out = new DefaultMp4Builder().build(video);
    FileChannel fc = new FileOutputStream(new File("path/to/final/output")).getChannel();
    out.writeContainer(fc);
    fc.close();
} catch (IOException e) {
    e.printStackTrace();
}
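
The same merge can be written as a small helper with try-with-resources, so the channel is closed even if building the movie fails; the paths are placeholders and the method name is only for illustration:

private void mergeAudioIntoVideo(String videoPath, String audioPath, String outputPath) {
    try (FileChannel fc = new FileOutputStream(new File(outputPath)).getChannel()) {
        Movie video = MovieCreator.build(videoPath);
        Movie audio = MovieCreator.build(audioPath);
        video.addTrack(audio.getTracks().get(0)); // take the recorded audio track
        Container out = new DefaultMp4Builder().build(video);
        out.writeContainer(fc);
    } catch (IOException e) {
        Log.e("Merge", "Failed to merge audio and video", e);
    }
}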

I know this isn't the best solution for recording audio and video in an Android WebRTC video call. If someone knows how to extract audio using WebRTC, please add a comment.
