stream webcam using ffmpeg and live555


Problem description





I am new to live555.

I want to stream my webcam from a Windows 7 (64-bit) machine behind a home LAN, using ffmpeg as the encoder, to a live555 server running on a Debian 64-bit Linux machine in a data center over the WAN. I want to send an H.264 RTP/UDP stream from ffmpeg, and the "testOnDemandRTSPServer" should send out RTSP streams to clients that connect to it.

I am using the following ffmpeg command which sends UDP data to port 1234, IP address AA.BB.CC.DD

.\ffmpeg.exe -f dshow -i video="Webcam C170":audio="Microphone (3- Webcam C170)" -an -vcodec libx264 -f mpegts udp://AA.BB.CC.DD:1234
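Note that `-an` in the command above disables audio entirely, so the `audio=` part of the dshow input is never used. A minimal video-only sketch of the same capture (the device name and destination address are taken from the question; the extra x264 options are assumptions aimed at reducing encoder latency, not part of the original command):

```shell
:: Capture video only from the DirectShow webcam and send an MPEG-TS over UDP.
:: "Webcam C170" and AA.BB.CC.DD:1234 come from the question above.
.\ffmpeg.exe -f dshow -i video="Webcam C170" ^
  -vcodec libx264 -preset ultrafast -tune zerolatency ^
  -f mpegts udp://AA.BB.CC.DD:1234
```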

On the Linux server I am running the testOnDemandRTSPServer on port 5555, which expects raw UDP data from AA.BB.CC.DD:1234. I try to open the RTSP stream in VLC using rtsp://AA.BB.CC.DD:5555/mpeg2TransportStreamFromUDPSourceTest
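Before debugging the RTSP side, it is worth confirming that the transport stream actually reaches the server at all. A quick sketch, assuming ffprobe/ffplay is available on the Debian box (stop testOnDemandRTSPServer first so the UDP port is free):

```shell
# On the Debian server: bind to UDP port 1234 and probe whatever arrives.
# The empty host in "udp://@:1234" tells ffprobe to listen locally
# rather than connect out.
ffprobe -i udp://@:1234

# Alternatively, render the incoming stream directly (requires a display):
# ffplay udp://@:1234
```

If nothing arrives here, the problem is upstream of live555 (NAT, firewall, or the ffmpeg command), not the RTSP server.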

But I get nothing in VLC. What am I doing wrong? How can I fix it?

Solution

From what I remember, writing a DeviceSource class is non-trivial, and the problem you're describing is definitely something that's discussed quite frequently on the live555 mailing list - you should get yourself approved for the list as soon as possible if you want to do anything related to RTSP development.

The problem you seem to be having is related to the fact that some video formats are written with streaming in mind: the RTSP server can easily stream certain formats because they contain "sync bytes" and other markers which it can use to determine where frame boundaries lie. The simplest solution would be to get your hands on the SDK for the camera and use that to request data from it directly. There are many different libraries and toolkits that let you access data from the camera - one of which is the DirectX SDK. Once you have the camera data, you need to encode it into a streamable format: you might be able to grab the raw camera frames using DirectX, then convert them to MP4/H.264 frame data using ffmpeg (libavcodec, libavformat).
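If you stay with the ffmpeg command line instead of a camera SDK, one thing worth checking first is that the DirectShow device names match exactly, since a mismatched name makes the dshow input fail. A sketch using ffmpeg's documented dshow listing options (the device name is the one from the question):

```shell
:: On the Windows machine: list the exact DirectShow device names
:: that ffmpeg can see.
.\ffmpeg.exe -list_devices true -f dshow -i dummy

:: Then list the formats/resolutions a specific device offers:
.\ffmpeg.exe -list_options true -f dshow -i video="Webcam C170"
```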

Once you have your encoded frame data, you feed it into your DeviceSource class, and it will take care of streaming the data for you. I wish I had code on hand, but I was bound by an NDA not to remove code from the premises; the general algorithm is, however, documented on the live555 website, so I can explain it here.

I hope you have a bit more luck with this. If you get stuck, remember to add code to your question. Right now, the only thing stopping your original plan from working (streaming a file to VLC) is the file format you chose to stream.

