Android WebRTC: record video from the stream coming from the other peer


Problem description

I am developing a WebRTC video-call Android app, and it is working fine. I need to record the video of the other peer (remoteVideoStream) and my own stream (localVideoStream) and convert it to a saveable format such as mp4. I have searched for this, but I could not figure out how to do it.

I have read about VideoFileRenderer and tried adding it to my code to save the video, but I could not make use of it either: it has no method called, for example, record() or save(), although it does have a method called release() which is used to finish saving the video. Here is the class, in case anyone has any idea:

// Decompiled from the WebRTC SDK; the class lives in the org.webrtc package,
// so the SDK classes it uses (EglBase, VideoFrame, YuvHelper, ...) resolve directly.
package org.webrtc;

import android.os.Handler;
import android.os.HandlerThread;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.charset.Charset;
import java.util.ArrayList;
import java.util.concurrent.CountDownLatch;
import org.webrtc.VideoFrame.Buffer;
import org.webrtc.VideoFrame.I420Buffer;
import org.webrtc.VideoRenderer.Callbacks;
import org.webrtc.VideoRenderer.I420Frame;

@JNINamespace("webrtc::jni")
public class VideoFileRenderer implements Callbacks, VideoSink {
    private static final String TAG = "VideoFileRenderer";
    private final HandlerThread renderThread;
    private final Handler renderThreadHandler;
    private final FileOutputStream videoOutFile;
    private final String outputFileName;
    private final int outputFileWidth;
    private final int outputFileHeight;
    private final int outputFrameSize;
    private final ByteBuffer outputFrameBuffer;
    private EglBase eglBase;
    private YuvConverter yuvConverter;
    // Every rendered frame is kept in memory until release() writes them out.
    private ArrayList<ByteBuffer> rawFrames = new ArrayList<>();

    public VideoFileRenderer(String outputFile, int outputFileWidth, int outputFileHeight,
            final EglBase.Context sharedContext) throws IOException {
        // The I420 layout below assumes even dimensions.
        if (outputFileWidth % 2 != 0 || outputFileHeight % 2 != 0) {
            throw new IllegalArgumentException("Does not support uneven width or height");
        }
        this.outputFileName = outputFile;
        this.outputFileWidth = outputFileWidth;
        this.outputFileHeight = outputFileHeight;
        // One I420 frame: full-size Y plane plus quarter-size U and V planes.
        this.outputFrameSize = outputFileWidth * outputFileHeight * 3 / 2;
        this.outputFrameBuffer = ByteBuffer.allocateDirect(this.outputFrameSize);
        this.videoOutFile = new FileOutputStream(outputFile);
        // The output is a raw YUV4MPEG2 (.y4m) stream at a nominal 30 fps, not an mp4.
        this.videoOutFile.write(("YUV4MPEG2 C420 W" + outputFileWidth + " H" + outputFileHeight
                + " Ip F30:1 A1:1\n").getBytes(Charset.forName("US-ASCII")));
        this.renderThread = new HandlerThread("VideoFileRenderer");
        this.renderThread.start();
        this.renderThreadHandler = new Handler(this.renderThread.getLooper());
        // Set up an off-screen EGL context on the render thread for YUV conversion.
        ThreadUtils.invokeAtFrontUninterruptibly(this.renderThreadHandler, () -> {
            eglBase = EglBase.create(sharedContext, EglBase.CONFIG_PIXEL_BUFFER);
            eglBase.createDummyPbufferSurface();
            eglBase.makeCurrent();
            yuvConverter = new YuvConverter();
        });
    }

    // Legacy VideoRenderer.Callbacks entry point: convert and forward to onFrame().
    public void renderFrame(I420Frame i420Frame) {
        VideoFrame frame = i420Frame.toVideoFrame();
        this.onFrame(frame);
        frame.release();
    }

    // VideoSink entry point: hand the frame off to the render thread.
    public void onFrame(VideoFrame frame) {
        frame.retain();
        this.renderThreadHandler.post(() -> this.renderFrameOnRenderThread(frame));
    }

    private void renderFrameOnRenderThread(VideoFrame frame) {
        Buffer buffer = frame.getBuffer();
        int rotation = frame.getRotation();
        // If the frame is rotated by 90/270 degrees, swap the target dimensions so
        // the rotation step below still yields outputFileWidth x outputFileHeight.
        int targetWidth = rotation % 180 == 0 ? this.outputFileWidth : this.outputFileHeight;
        int targetHeight = rotation % 180 == 0 ? this.outputFileHeight : this.outputFileWidth;
        // Center-crop the frame to the aspect ratio of the output file.
        float frameAspectRatio = (float) buffer.getWidth() / (float) buffer.getHeight();
        float fileAspectRatio = (float) targetWidth / (float) targetHeight;
        int cropWidth = buffer.getWidth();
        int cropHeight = buffer.getHeight();
        if (fileAspectRatio > frameAspectRatio) {
            cropHeight = (int) ((float) cropHeight * (frameAspectRatio / fileAspectRatio));
        } else {
            cropWidth = (int) ((float) cropWidth * (fileAspectRatio / frameAspectRatio));
        }

        int cropX = (buffer.getWidth() - cropWidth) / 2;
        int cropY = (buffer.getHeight() - cropHeight) / 2;
        Buffer scaledBuffer = buffer.cropAndScale(cropX, cropY, cropWidth, cropHeight,
                targetWidth, targetHeight);
        frame.release();
        // Convert to I420, rotate to upright, and buffer the raw bytes in memory.
        I420Buffer i420 = scaledBuffer.toI420();
        scaledBuffer.release();
        ByteBuffer byteBuffer = JniCommon.nativeAllocateByteBuffer(this.outputFrameSize);
        YuvHelper.I420Rotate(i420.getDataY(), i420.getStrideY(), i420.getDataU(), i420.getStrideU(),
                i420.getDataV(), i420.getStrideV(), byteBuffer, i420.getWidth(), i420.getHeight(),
                rotation);
        i420.release();
        byteBuffer.rewind();
        this.rawFrames.add(byteBuffer);
    }

    public void release() {
        // Tear down the EGL context on the render thread and wait for it to finish.
        CountDownLatch cleanupBarrier = new CountDownLatch(1);
        this.renderThreadHandler.post(() -> {
            this.yuvConverter.release();
            this.eglBase.release();
            this.renderThread.quit();
            cleanupBarrier.countDown();
        });
        ThreadUtils.awaitUninterruptibly(cleanupBarrier);

        // Only now are the buffered frames flushed to the .y4m file.
        try {
            for (ByteBuffer buffer : this.rawFrames) {
                this.videoOutFile.write("FRAME\n".getBytes(Charset.forName("US-ASCII")));
                byte[] data = new byte[this.outputFrameSize];
                buffer.get(data);
                this.videoOutFile.write(data);
                JniCommon.nativeFreeByteBuffer(buffer);
            }

            this.videoOutFile.close();
            Logging.d(TAG, "Video written to disk as " + this.outputFileName + ". Number frames are "
                    + this.rawFrames.size() + " and the dimension of the frames are "
                    + this.outputFileWidth + "x" + this.outputFileHeight + ".");
        } catch (IOException e) {
            Logging.e(TAG, "Error writing video to disk", e);
        }
    }
}

I can't find any method in it that helps.
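For reference, wiring the class up looks roughly like this (a sketch with placeholder names: remoteVideoTrack and rootEglBase stand in for whatever the app already holds, and the constructor's IOException still needs handling). Note that, as the header line in the constructor shows, the output is a raw .y4m file rather than mp4:

// Sketch (placeholder names): attaching VideoFileRenderer to the remote track.
VideoFileRenderer renderer = new VideoFileRenderer(
        "/sdcard/remote.y4m",                 // the class writes raw Y4M, not mp4
        640, 480,                             // output size; must be even
        rootEglBase.getEglBaseContext());     // EGL context already used by the app
remoteVideoTrack.addSink(renderer);           // frames start flowing into onFrame()

// ... later, when the call ends ...
remoteVideoTrack.removeSink(renderer);        // stop receiving frames
renderer.release();                           // flushes the buffered frames to disk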

Recommended answer

The VideoFileRenderer class just demonstrates how you can access the decoded raw video frames of the remote/local peer. It does not record a valid video file.
You should manually implement the logic of encoding and muxing the raw video frames into a container, such as mp4.
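The frame-access side is small enough to sketch here. In apprtc, raw frames are obtained through a VideoSink indirection; below is a simplified version of the ProxyVideoSink pattern from apprtc/CallActivity.java (assuming the usual org.webrtc imports), which forwards frames to a swappable target and is a convenient place to tee frames into a recorder:

// Simplified from apprtc/CallActivity.java; assumes org.webrtc.* imports.
// Register it with videoTrack.addSink(proxy), then point setTarget() at a
// renderer and/or recorder to receive the raw frames.
private static class ProxyVideoSink implements VideoSink {
    private VideoSink target;

    @Override
    public synchronized void onFrame(VideoFrame frame) {
        if (target == null) {
            Logging.d("ProxyVideoSink", "Dropping frame in proxy because target is not set.");
            return;
        }
        target.onFrame(frame);
    }

    public synchronized void setTarget(VideoSink target) {
        this.target = target;
    }
}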

The main flow is as follows:

  • Switch to the latest webrtc version (v1.0.25331 as of this writing)
  • Create a video container; for example, see the MediaMuxer class from the Android SDK
  • Implement the VideoSink interface to obtain raw frames from the desired video source; for example, see the ProxyVideoSink class in apprtc/CallActivity.java (sketched above)
  • Encode every frame using MediaCodec and write it to the video container
  • Finalize the muxer (a sketch of the encoder/muxer side follows this list)
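
Putting the last three steps together, here is a minimal sketch of the encoder/muxer side. It is illustrative rather than authoritative: the class name Mp4Recorder and the bitrate/frame-rate values are made up for the example, and it assumes an H.264 encoder in surface-input mode, where frames arriving at the VideoSink are drawn onto getInputSurface() with an EGL context. drainEncoder() moves encoded samples into the MediaMuxer; note that the muxer can only be started once the encoder has reported its actual output format:

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.media.MediaMuxer;
import android.view.Surface;
import java.io.IOException;
import java.nio.ByteBuffer;

// Illustrative sketch: H.264 encoded into an mp4 container via MediaMuxer.
public class Mp4Recorder {
    private final MediaCodec encoder;
    private final MediaMuxer muxer;
    private final Surface inputSurface;
    private final MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
    private int trackIndex = -1;
    private boolean muxerStarted;

    public Mp4Recorder(String outputPath, int width, int height) throws IOException {
        // Example encoder settings; tune bitrate/fps for your use case.
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
        encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        // Frames from the VideoSink are rendered onto this surface with EGL.
        inputSurface = encoder.createInputSurface();
        encoder.start();
        muxer = new MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
    }

    public Surface getInputSurface() {
        return inputSurface;
    }

    // Pull any pending encoded output from the codec and feed it to the muxer.
    public void drainEncoder(boolean endOfStream) {
        if (endOfStream) {
            encoder.signalEndOfInputStream();
        }
        while (true) {
            int index = encoder.dequeueOutputBuffer(bufferInfo, 10_000);
            if (index == MediaCodec.INFO_TRY_AGAIN_LATER) {
                if (!endOfStream) {
                    break; // no output yet; drain again after the next frame
                }
                // at EOS, keep spinning until BUFFER_FLAG_END_OF_STREAM arrives
            } else if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                // The muxer must be started with the format the encoder reports here.
                trackIndex = muxer.addTrack(encoder.getOutputFormat());
                muxer.start();
                muxerStarted = true;
            } else if (index >= 0) {
                ByteBuffer encoded = encoder.getOutputBuffer(index);
                if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                    bufferInfo.size = 0; // SPS/PPS already delivered via the format
                }
                if (bufferInfo.size > 0 && muxerStarted) {
                    muxer.writeSampleData(trackIndex, encoded, bufferInfo);
                }
                encoder.releaseOutputBuffer(index, false);
                if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    break;
                }
            }
        }
    }

    // Signal end of stream, drain the rest, and finalize the mp4.
    public void release() {
        drainEncoder(true);
        encoder.stop();
        encoder.release();
        if (muxerStarted) {
            muxer.stop();
        }
        muxer.release();
    }
}

Calling drainEncoder(false) after each rendered frame keeps the codec's output buffers from filling up; release() then signals end of stream and finalizes the container.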

