Mixed Reality WebRTC - Screen capturing with GraphicsCapturePicker


Problem description


Setup
Hey,
I'm trying to capture my screen and send/communicate the stream via MR-WebRTC. Communication between two PCs, or between a PC and a HoloLens, already worked with webcams for me, so I thought the next step could be streaming my screen. So I took the UWP application I already had, which worked with my webcam, and tried to make things work:

  • The UWP app is based on the example UWP app from MR-WebRTC.
  • For capturing I'm following Microsoft's instructions on screen capture via GraphicsCapturePicker.

So now I'm in the following situation:

  1. I get a frame from the screen capture, but its type is Direct3D11CaptureFrame. You can see it below in the code snippet.
  2. MR-WebRTC takes a frame of type I420AVideoFrame (also in a code snippet).

How do I "connect" them?

  • I420AVideoFrame wants a frame in the I420A format (YUV 4:2:0).
  • When configuring the frame pool I can set the DirectXPixelFormat, but it has no YUV 4:2:0 option.
  • I found this post on SO saying that it is possible.
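For context on what such a conversion would involve: I420A is a planar format (a full-resolution Y plane, quarter-resolution U and V planes, plus alpha), so each BGRA pixel has to be mapped to YUV. The sketch below is my own illustration of the per-pixel math using a common integer BT.601 limited-range approximation; it is not from the original question or from any MR-WebRTC API.

```csharp
using System;

// Sketch (my addition): integer BT.601 approximation converting one
// BGRA pixel to limited-range YUV. A full I420 conversion would apply
// this per pixel and average U/V over each 2x2 block for the chroma planes.
static class Bgra32ToI420
{
    public static (byte Y, byte U, byte V) ToYuv(byte b, byte g, byte r)
    {
        int y = ((66 * r + 129 * g + 25 * b + 128) >> 8) + 16;
        int u = ((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128;
        int v = ((112 * r - 94 * g - 18 * b + 128) >> 8) + 128;
        return ((byte)Math.Clamp(y, 0, 255),
                (byte)Math.Clamp(u, 0, 255),
                (byte)Math.Clamp(v, 0, 255));
    }
}
```

As the accepted answer below shows, this conversion can be avoided entirely by handing MR-WebRTC an ARGB32 frame instead.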

Code snippet of the frame capture from Direct3D:

_framePool = Direct3D11CaptureFramePool.Create(
                _canvasDevice,                             // D3D device
                DirectXPixelFormat.B8G8R8A8UIntNormalized, // Pixel format
                3,                                         // Number of frames
                _item.Size);                               // Size of the buffers

_session = _framePool.CreateCaptureSession(_item);
_session.StartCapture();
_framePool.FrameArrived += (s, a) =>
{
    using (var frame = _framePool.TryGetNextFrame())
    {
        // Here I would take the Frame and call the MR-WebRTC method LocalI420AFrameReady  
    }
};

Code snippet of the frame from WebRTC:

// This is the way with the webcam; so LocalI420 was subscribed to
// the event I420AVideoFrameReady and got the frame from there
_webcamSource = await DeviceVideoTrackSource.CreateAsync();
_webcamSource.I420AVideoFrameReady += LocalI420AFrameReady;

// enqueueing the newly captured video frames into the bridge,
// which will later deliver them when the Media Foundation
// playback pipeline requests them.
private void LocalI420AFrameReady(I420AVideoFrame frame)
    {
        lock (_localVideoLock)
        {
            if (!_localVideoPlaying)
            {
                _localVideoPlaying = true;

                // Capture the resolution into local variable useable from the lambda below
                uint width = frame.width;
                uint height = frame.height;

                // Defer UI-related work to the main UI thread
                RunOnMainThread(() =>
                {
                    // Bridge the local video track with the local media player UI
                    int framerate = 30; // assumed, for lack of an actual value
                    _localVideoSource = CreateI420VideoStreamSource(
                        width, height, framerate);
                    var localVideoPlayer = new MediaPlayer();
                    localVideoPlayer.Source = MediaSource.CreateFromMediaStreamSource(
                        _localVideoSource);
                    localVideoPlayerElement.SetMediaPlayer(localVideoPlayer);
                    localVideoPlayer.Play();
                });
            }
        }
        // Enqueue the incoming frame into the video bridge; the media player will
        // later dequeue it as soon as it's ready.
        _localVideoBridge.HandleIncomingVideoFrame(frame);
    }

Recommended answer

I found a solution for my problem by creating an issue on the GitHub repo. The answer was provided by KarthikRichie:

  1. You have to use an ExternalVideoTrackSource.
  2. You can convert the Direct3D11CaptureFrame to an Argb32VideoFrame.

// Setting up external video track source
_screenshareSource = ExternalVideoTrackSource.CreateFromArgb32Callback(FrameCallback);

struct WebRTCFrameData
{
    public IntPtr Data;
    public uint Height;
    public uint Width;
    public int Stride;
}

public void FrameCallback(in FrameRequest frameRequest)
{
    try
    {
        if (FramePool != null)
        {
            using (Direct3D11CaptureFrame _currentFrame = FramePool.TryGetNextFrame())
            {
                if (_currentFrame != null)
                {
                    // Note: blocking with .Result can deadlock if this
                    // callback ever runs on the UI thread; consider an
                    // async pipeline instead.
                    WebRTCFrameData webRTCFrameData = ProcessBitmap(_currentFrame.Surface).Result;
                    frameRequest.CompleteRequest(new Argb32VideoFrame()
                    {
                        data = webRTCFrameData.Data,
                        height = webRTCFrameData.Height,
                        width = webRTCFrameData.Width,
                        stride = webRTCFrameData.Stride
                    });
                }
            }
        }
    }
    catch (Exception)
    {
        // The frame pool may already be disposed during teardown;
        // swallow the error and skip this frame.
    }
}

private async Task<WebRTCFrameData> ProcessBitmap(IDirect3DSurface surface)
{
    using (SoftwareBitmap softwareBitmap = await SoftwareBitmap.CreateCopyFromSurfaceAsync(
        surface, Windows.Graphics.Imaging.BitmapAlphaMode.Straight))
    {
        // Copy the BGRA pixels into a managed buffer.
        byte[] imageBytes = new byte[4 * softwareBitmap.PixelWidth * softwareBitmap.PixelHeight];
        softwareBitmap.CopyToBuffer(imageBytes.AsBuffer());

        WebRTCFrameData argb32VideoFrame = new WebRTCFrameData();
        argb32VideoFrame.Data = GetByteIntPtr(imageBytes);
        argb32VideoFrame.Height = (uint)softwareBitmap.PixelHeight;
        argb32VideoFrame.Width = (uint)softwareBitmap.PixelWidth;

        // Read the stride of the (single) BGRA plane.
        using (BitmapBuffer buffer = softwareBitmap.LockBuffer(BitmapBufferAccessMode.Read))
        {
            int planeCount = buffer.GetPlaneCount();
            var plane = buffer.GetPlaneDescription(planeCount - 1);
            argb32VideoFrame.Stride = plane.Stride;
        }

        return argb32VideoFrame;
    }
}

private IntPtr GetByteIntPtr(byte[] byteArr)
{
    // Caution: despite the method's name, the array is NOT pinned here;
    // the GC may move it while native code holds the pointer.
    return System.Runtime.InteropServices.Marshal.UnsafeAddrOfPinnedArrayElement(byteArr, 0);
}
