Is there a way to reduce latency using getUserMedia?

Question
While trying to reduce the video latency of a WebRTC communication, I measured the delay between video capture and video display. To avoid measuring the latency introduced by WebRTC itself, I used only getUserMedia and an HTML video element that displayed the stream.
I did this by displaying a timestamp on every frame (using requestAnimationFrame), recording my screen with a USB camera, and taking screenshots in which both the video display and the displayed timestamp were visible. On average, I measured a delay of ~150 ms. This must be an overestimate (due to the time between requestAnimationFrame calls), but the minimum I measured was around 120 ms, which is still a lot.
Now, is there a way to reduce the delay between video capture and video display?

Notes:
- I tried another video player (Windows' built-in player) and the results were very close (average delay ~145 ms).
- I tried another video device (my laptop's webcam, pointed at a mirror) and the results were less close but still high (average delay ~120 ms).
Answer

In general this is something you can only fix in the browser itself.
The requestVideoFrameCallback API gathers numbers such as captureTime and renderTime. https://web.dev/requestvideofFramecallback-rvfc/ has a pretty good description, and https://webrtc.github.io/samples/src/content/peerconnection/per-frame-callback/ visualizes them.
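A minimal sketch of reading that per-frame metadata (the helper names are illustrative; captureTime and expectedDisplayTime are real fields of the callback's metadata object, both reported on the page's performance.now() clock for a local getUserMedia stream):

```javascript
// Estimate capture-to-display latency from requestVideoFrameCallback
// metadata, with no external camera needed.
function latencyFromMetadata(metadata) {
  // captureTime: when the frame was captured by the local camera.
  // expectedDisplayTime: when the compositor expects to present it.
  return metadata.expectedDisplayTime - metadata.captureTime;
}

// Re-register on every frame: requestVideoFrameCallback is one-shot,
// unlike requestAnimationFrame it fires per *video* frame presented.
function watchLatency(video) {
  video.requestVideoFrameCallback(function cb(now, metadata) {
    if (metadata.captureTime !== undefined) {
      console.log(
        'capture-to-display: ' +
        latencyFromMetadata(metadata).toFixed(1) + ' ms'
      );
    }
    video.requestVideoFrameCallback(cb);
  });
}
```

Usage would be `watchLatency(videoElement)` after attaching the getUserMedia stream; note the API is not supported in all browsers, so feature-detect `'requestVideoFrameCallback' in HTMLVideoElement.prototype` first.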