Custom video source for WebRTC on Android

Problem description

I would like to use a custom video source to live-stream video via the WebRTC Android implementation. If I understand correctly, the existing implementation only supports the front- and back-facing cameras on Android phones. The following classes are relevant in this scenario:

  • Camera1Enumerator.java
  • VideoCapturer.java
  • PeerConnectionFactory
  • VideoSource.java
  • VideoTrack.java

Currently, to use the front-facing camera on an Android phone, I perform the following steps:

// Enumerate cameras via the Camera1 API (false = do not capture to texture).
CameraEnumerator enumerator = new Camera1Enumerator(false);
VideoCapturer videoCapturer = enumerator.createCapturer(deviceName, null);
// Create a VideoSource (isScreencast = false) and hand its observer to the capturer.
VideoSource videoSource = peerConnectionFactory.createVideoSource(false);
videoCapturer.initialize(surfaceTextureHelper, this.getApplicationContext(), videoSource.getCapturerObserver());
VideoTrack localVideoTrack = peerConnectionFactory.createVideoTrack(VideoTrackID, videoSource);
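
For completeness: after initialize(), the capturer does not deliver frames until startCapture() is called. A minimal sketch of that step (the 1280x720 / 30 fps format is an assumption, not from the question):

// Begin delivering frames to the VideoSource; resolution and frame rate are illustrative.
videoCapturer.startCapture(1280, 720, 30);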

My scenario

I have a callback handler that receives the video buffer as a byte array from a custom video source:

public void onReceive(byte[] videoBuffer, int size) {}

How would I be able to send this byte array buffer? I'm not sure about the solution, but I think I would have to implement a custom VideoCapturer?

This question might be relevant, though I'm not using the libjingle library, only the native WebRTC Android package.

Similar questions/articles

  • A similar question for the iOS platform, although unfortunately the answers did not help in my case.
  • A similar question for the native C++ platform.
  • An article about a native implementation.

Recommended answer

There are two possible solutions to this problem:

  1. Implement a custom VideoCapturer and create a VideoFrame from the byte[] stream data in the onReceive handler. There actually exists a very good example, FileVideoCapturer, which implements VideoCapturer. (A skeleton of this approach is sketched after the example below.)
  2. Simply construct a VideoFrame from an NV21Buffer, which is created from our byte array stream data. Then we only need to use our previously created VideoSource to capture this frame. Example:

public void onReceive(byte[] videoBuffer, int size, int width, int height) {
    // Capture timestamp in nanoseconds, as expected by VideoFrame.
    long timestampNS = TimeUnit.MILLISECONDS.toNanos(SystemClock.elapsedRealtime());
    // Wrap the NV21 byte array; no release callback is needed here.
    NV21Buffer buffer = new NV21Buffer(videoBuffer, width, height, null);

    VideoFrame videoFrame = new VideoFrame(buffer, 0 /* rotation */, timestampNS);
    // Feed the frame to the VideoSource created earlier.
    videoSource.getCapturerObserver().onFrameCaptured(videoFrame);

    videoFrame.release();
}
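
For option 1, a minimal sketch of what such a custom VideoCapturer could look like, assuming NV21 data pushed from an external source. The class name ByteArrayVideoCapturer and the extended onReceive signature are illustrative, not part of the WebRTC API; VideoCapturer and CapturerObserver are the real org.webrtc interfaces:

import android.content.Context;
import android.os.SystemClock;
import java.util.concurrent.TimeUnit;
import org.webrtc.CapturerObserver;
import org.webrtc.NV21Buffer;
import org.webrtc.SurfaceTextureHelper;
import org.webrtc.VideoCapturer;
import org.webrtc.VideoFrame;

// Illustrative class: forwards externally produced NV21 buffers into WebRTC.
public class ByteArrayVideoCapturer implements VideoCapturer {
    private CapturerObserver capturerObserver;

    @Override
    public void initialize(SurfaceTextureHelper surfaceTextureHelper,
                           Context applicationContext, CapturerObserver capturerObserver) {
        this.capturerObserver = capturerObserver;
    }

    @Override
    public void startCapture(int width, int height, int framerate) {
        // Signal that capture started; wire up the external source here
        // so that onReceive() begins to be called.
        capturerObserver.onCapturerStarted(true);
    }

    // Corresponds to the onReceive handler from the question (extra
    // width/height parameters are an assumption about the source).
    public void onReceive(byte[] videoBuffer, int size, int width, int height) {
        long timestampNs = TimeUnit.MILLISECONDS.toNanos(SystemClock.elapsedRealtime());
        NV21Buffer buffer = new NV21Buffer(videoBuffer, width, height, null);
        VideoFrame frame = new VideoFrame(buffer, 0 /* rotation */, timestampNs);
        capturerObserver.onFrameCaptured(frame);
        frame.release();
    }

    @Override
    public void stopCapture() throws InterruptedException {
        capturerObserver.onCapturerStopped();
    }

    @Override
    public void changeCaptureFormat(int width, int height, int framerate) {
        // No-op: the external source controls its own format.
    }

    @Override
    public void dispose() {}

    @Override
    public boolean isScreencast() {
        return false;
    }
}

An instance would then take the place of the camera capturer from the question: pass videoSource.getCapturerObserver() to initialize(), call startCapture(), and forward each incoming buffer to onReceive().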
