UWP - Streaming WebCam over Socket to MediaElement - Broken Picture?

Question

Background

The code I've written records video clips from a webcam, writes them to a memory stream, then transmits the data across a Socket connection where it's re-assembled into video and played back on a Media Element.

The ultimate goal is to create a baby monitor system, with the server/camera running on a Windows IOT Raspberry Pi, and a UWP app that my girlfriend and I can view on our mobile phones, or the laptop. As well as viewing the camera from another part of the house, we'll also be able to log in when one of us is away from home and in time I'll wire up a PIR motion sensor and alerting system also, but first things first.

The system as a whole works fairly well; there's a 5 second delay in the video, which is acceptable to me (for now), and using a MediaPlaybackList the video is streamed at a fairly constant rate with seamless (as seamless as this can get for now) transition between videos. The MediaPlaybackList removes items as they've been played, keeping the memory footprint relatively constant.

The Issue

When the video plays back on the client end, it gets frequent but random sections of broken picture. There's no pattern to it, not one that I can find anyway, and the only way I can describe it is that part of the picture is split in half horizontally and the two halves are swapped around, the right side of the picture being displayed on the left, and vice versa. It's like a flicker, as in the broken bit is only displayed for a fraction of a second, because another one appears a second or so later somewhere else on the picture.

Here's an example:

Now, here are a couple of interesting points..

1) Before I started using a MediaPlaybackList to queue up streams of data, I was using a method of extracting each video from the incoming socket stream, saving it to the local disk as a StorageFile, then queueing up these StorageFiles, playing them in order and deleting them afterwards (I still have a version of this code in source control which I can dig out, but I don't like the idea of creating and destroying StorageFiles as it seems horrendously inefficient). However, using this method did not result in the broken pictures that I'm now seeing... this leads me to believe that the video itself is fine, and that perhaps it's an issue with the way it's being put back together and streamed to the Media Element?

2) My girlfriend's cat knocked the webcam (a Microsoft Lifecam HD-3000) onto its side without me realising, and I didn't realise until I ran the server and noticed the picture was at a 90 degree angle.. the interesting (and puzzling) thing about this was that the picture delivered to the client didn't break up as I've been describing above. The only difference here that I can see is that the picture then came through as 480 x 640 (from the camera sitting on its side), rather than the standard 640 x 480. What this means, I'm not sure...

Thoughts on the problem

  • Something to do with the size/dimensions of the video (it played fine on its side, so could it be related to that)?
  • Something to do with the bitrate?
  • Something to do with the way the bytes are re-assembled on the client end?
  • Something to do with the encoding of the stream?

Source

Here are a few snippets of code that I think are probably relevant; the full solution source can be found on GitHub, here: Video Socket Server.

Server

while (true)
{
    try
    {
        //record a 5 second video to stream
        Debug.WriteLine($"Recording started");
        var memoryStream = new InMemoryRandomAccessStream();
        await _mediaCap.StartRecordToStreamAsync(MediaEncodingProfile.CreateMp4(VideoEncodingQuality.Vga), memoryStream);
        await Task.Delay(TimeSpan.FromSeconds(5));
        await _mediaCap.StopRecordAsync();
        Debug.WriteLine($"Recording finished, {memoryStream.Size} bytes");

        //create a CurrentVideo object to hold stream data and give it a unique id
        //which the client app can use to ensure they only request each video once
        memoryStream.Seek(0);
        CurrentVideo.Id = Guid.NewGuid();
        CurrentVideo.Data = new byte[memoryStream.Size];

        //read the stream data into the CurrentVideo  
        await memoryStream.ReadAsync(CurrentVideo.Data.AsBuffer(), (uint)memoryStream.Size, InputStreamOptions.None);
        Debug.WriteLine($"Bytes written to stream");

        //signal to waiting connections that there's a new video
        _signal.Set();
        _signal.Reset();
    }
    catch (Exception ex)
    {
        Debug.WriteLine($"StartRecording -> {ex.Message}");
        break;
    }
}

Connection

//use the guid to either get the current video, or wait for the 
//next new one that's added by the server
Guid guid = Guid.Empty;
Guid.TryParse(command, out guid);
byte[] data = await _server.GetCurrentVideoDataAsync(guid);
if (data != null)
    await _socket.OutputStream.WriteAsync(data.AsBuffer());
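
For context, here is one plausible shape for GetCurrentVideoDataAsync, inferred from the surrounding snippets (the client strips a 16-byte Guid prefix, and the recording loop sets _signal when a new clip is ready). This is a sketch only, not code copied from the repo; the real implementation is in the linked source:

```csharp
//sketch only: inferred from the other snippets, not copied from the repo
//(requires System.Linq for Concat)
public async Task<byte[]> GetCurrentVideoDataAsync(Guid lastSeenId)
{
    //if the caller has already received the current clip, block until
    //the recording loop signals that a new one is available
    if (lastSeenId == CurrentVideo.Id)
        await Task.Run(() => _signal.WaitOne());

    //prefix the video bytes with the clip's Guid so the client can
    //strip off the first 16 bytes and avoid requesting a clip twice
    return CurrentVideo.Id.ToByteArray()
        .Concat(CurrentVideo.Data)
        .ToArray();
}
```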

Client App

byte[] inbuffer = new byte[10000000];

//block on the input stream until we've received the full packet,
//but use the Partial option so that we don't have to fill the entire buffer before we continue.
//this is important, because the idea is to set the buffer big enough to handle any packet we'll receive,
//meaning we'll never fill the entire buffer... and we don't want to block here indefinitely
var result = await socket.InputStream.ReadAsync(inbuffer.AsBuffer(), inbuffer.AsBuffer().Capacity, InputStreamOptions.Partial);

//strip off the Guid, leaving just the video data
byte[] guid = result.ToArray().Take(16).ToArray();
byte[] data = result.ToArray().Skip(16).ToArray();
_guid = new Guid(guid);

//wrap the data in a stream, create a MediaSource from it,
//then use that to create a MediaPlackbackItem which gets added 
//to the back of the playlist...
var stream = new MemoryStream(data);
var source = MediaSource.CreateFromStream(stream.AsRandomAccessStream(), "video/mp4");
var item = new MediaPlaybackItem(source);
_playlist.Items.Add(item);
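
The "removes items as they've been played" behaviour described in the background can be wired up with the MediaPlaybackList.CurrentItemChanged event; this handler is a sketch rather than code from the repo:

```csharp
//sketch: drop each MediaPlaybackItem once playback has moved past it,
//keeping the playlist's memory footprint roughly constant
_playlist.CurrentItemChanged += (sender, args) =>
{
    if (args.OldItem != null)
        sender.Items.Remove(args.OldItem);
};
```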

Answer

I'm looking to do something similar (stream video/audio from a UWP app on a Raspberry Pi), but I have been using the simple-communications (Simple Real Time Communication) sample from the Windows 10 SDK, which after a bit of tweaking I have been able to get working reliably (there are thread sync issues with the sample code). However, the SDK sample uses a proprietary protocol using media extensions, and it isn't easy to redirect the stream over the internet, which is my use-case, so I had a look at your code and got it working (with the same bugs).

A couple of comments on your approach:

1) The RPi can't process video on Win10 very well, as it can't use the hardware video encoders, so it does everything in software. This will cause glitches, and I see CPU utilisation increase significantly to over 50%, which means at least one of the CPU cores is working close to max, possibly the one handling the video compression to MP4. However, I ran up the SDK sample and got glitch-free viewing at about 70% CPU, so your problem is likely elsewhere.

2) 5 seconds of latency is significant. I get less than 100 ms latency with the real-time sample; however, when I adjusted your streaming timer down to 1 second, the break-up was significant and unworkable. Have you thought about changing the design so it streams during capture? I'm not sure the InMemoryRandomAccessStream will let you do that, though. Another alternative is to capture the preview stream and write a custom media sink to buffer (harder to do as it's not managed code, and likely not able to compress as easily), like the Simple Communication sample does.

3) MP4 is a container, not a compression format, and it isn't built for streaming, as the whole file has to be downloaded before playback starts unless the moov metadata record is placed at the beginning of the file. I'm not sure how UWP handles this; likely your approach of closing off the stream before sending is required to ensure the other end can play it properly.
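
To illustrate the moov point: an MP4 can only start playing as it arrives if the moov box precedes the mdat box ("fast start"). A rough diagnostic sketch along these lines (not from the repo; note a naive byte scan can in principle false-positive on payload bytes that happen to spell a box name) could check which box comes first in a received buffer:

```csharp
using System.Text;

static class Mp4Check
{
    //naive scan for a four-character box code in the buffer
    static int IndexOf(byte[] buffer, string fourCc)
    {
        byte[] needle = Encoding.ASCII.GetBytes(fourCc);
        for (int i = 0; i <= buffer.Length - needle.Length; i++)
        {
            int j = 0;
            while (j < needle.Length && buffer[i + j] == needle[j]) j++;
            if (j == needle.Length) return i;
        }
        return -1;
    }

    //true if 'moov' comes before 'mdat', i.e. the clip could in principle
    //start playing before it has fully downloaded
    public static bool IsFastStart(byte[] mp4)
    {
        int moov = IndexOf(mp4, "moov");
        int mdat = IndexOf(mp4, "mdat");
        return moov >= 0 && (mdat < 0 || moov < mdat);
    }
}
```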

So not a complete answer but hopefully the above helps.
