Local Video Renderer in Android WebRTC


Problem Description

I am using this library: https://bintray.com/google/webrtc/google-webrtc


What I want to achieve (at least, at the beginning of my project) is to render video locally. I am using this tutorial (which is the only one around on the Internet): https://vivekc.xyz/getting-started-with-webrtc-for-android-daab1e268ff4. Unfortunately, the last line of code is not up-to-date anymore. The constructor needs a callback which I have no idea how to implement:

localVideoTrack.addRenderer(new VideoRenderer(i420Frame -> { // no idea what to put here }));


My code is exactly the same as in the posted tutorial. This is the very first step in getting familiar with WebRTC technology on Android, and I cannot figure it out. My camera is capturing video, because I can see it in my log:

I/org.webrtc.Logging: CameraStatistics: Camera fps: 28.


The main issue is that I have no idea how to pass it to my SurfaceViewRenderer through a callback. Has anyone run into this problem? I'd really appreciate any help or suggestions.


Here is the official example app, which is the only other source, but it does things differently than the tutorial and is much more complicated: https://webrtc.googlesource.com/src/+/master/examples/androidapp/src/org/appspot/apprtc

Answer


You are right, the API no longer matches the one in the tutorial, but it's close.


The VideoTrack has an addRenderer(VideoRenderer renderer) method, which requires you to create a VideoRenderer with the SurfaceViewRenderer as a parameter. That is no longer possible, so instead you should use the VideoTrack's addSink(VideoSink sink) method. The SurfaceViewRenderer object implements the VideoSink onFrame(VideoFrame frame) method, which makes this work:

VideoTrack videoTrack = utility.createVideoTrack();
videoTrack.addSink(this.localSurfaceViewRenderer);


I used the same official example app as reference to get to this conclusion, and it works fine for me.
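For context, here is a rough sketch of how the surrounding local-rendering setup typically looks with this API. Identifiers such as rootEglBase, localView, R.id.local_view, and localVideoTrack are assumptions for illustration, not code from the tutorial or the example app:

// Sketch only: assumes an Activity using the google-webrtc artifact,
// with a SurfaceViewRenderer declared in the layout as local_view.
EglBase rootEglBase = EglBase.create();

SurfaceViewRenderer localView = findViewById(R.id.local_view);
localView.init(rootEglBase.getEglBaseContext(), null);
localView.setMirror(true); // front camera usually looks better mirrored

// SurfaceViewRenderer implements VideoSink, so it can be attached directly:
localVideoTrack.addSink(localView);

// On teardown, detach and release to avoid leaking EGL resources:
// localVideoTrack.removeSink(localView);
// localView.release();

The key point is that no custom VideoSink implementation is needed for plain local preview; SurfaceViewRenderer already provides onFrame(VideoFrame frame).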

