AR Drone 2 and ffserver + ffmpeg streaming

Problem Description

I want to be able to restream the video feed of the AR Drone 2 from a Debian Server to Flash.

I am aware that the AR Drone uses the codec p264. I'm totally green when it comes to video codecs, so I don't know what would be suitable for the goal I want to achieve.

I have been able to stream the video feed from the AR Drone but with very high latency and extremely low quality, compared to when I directly connect to the AR Drone using ffplay.
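
For reference, "directly connecting" here is nothing more exotic than pointing ffplay at the drone's video port, using the same URL as the feed command further down:

ffplay http://192.168.1.1:5555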

I currently use the .swf example in the standard ffserver.conf:

<Stream test.swf>
Feed feed1.ffm
Format swf
VideoFrameRate 30
VideoIntraOnly
NoAudio
</Stream>

And the settings for the .ffm Feed are as follows:

<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 17K
ACL allow 127.0.0.1
NoAudio
</Feed>

The command I use for giving input to the ffserver feed:

ffmpeg -i http://192.168.1.1:5555 http://localhost:8090/feed1.ffm

How can I achieve lower latency and higher quality, since the stream is currently unwatchable?

Solution

Unfortunately, ffserver simply will not get the job done the way you want. You have hit the same wall as everyone else on the internet. The best I can get is about a 3-second delay, which gradually increases to about 5-10 seconds as the stream runs for a few hours.
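
For what it's worth, the usual knobs on the ffmpeg side are the input probing and buffering options; a sketch of the same feed command with those options (the values are illustrative, not a verified fix):

ffmpeg -fflags nobuffer -flags low_delay -probesize 32 -analyzeduration 0 -i http://192.168.1.1:5555 http://localhost:8090/feed1.ffm

These only change how ffmpeg reads the input; they do nothing about the buffering inside ffserver itself.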

The stream isn't decoding with ffmpeg for me either. I do not know why. It works with ffplay, which just confuses me more!
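
A quick way to separate the two cases is to ask ffmpeg to decode the stream without producing any output, purely as a diagnostic sketch:

ffmpeg -v verbose -i http://192.168.1.1:5555 -f null -

If that errors out while ffplay plays the same URL happily, the difference lies in how the two tools probe and buffer the stream rather than in the decoder itself, since both use the same libraries.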

I am looking into Py-Media to see if I can just write my own code for a similar project. I want to stream the AR Drone video and manipulate the images in the stream.
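
Not Py-Media specifically, but one common pattern is to let ffmpeg do the decoding and pipe raw frames to your own program; roughly like this, where process_frames stands in for a hypothetical program of your own:

ffmpeg -i http://192.168.1.1:5555 -f rawvideo -pix_fmt bgr24 - | ./process_frames

Each frame then arrives on stdin as width x height x 3 bytes of BGR data, so the reader just slices the stream into frames.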

P.S. Look into GStreamer; I saw others discussing how it yields different results.
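
I have not tried this against the drone's feed (its PaVE framing may need stripping first), but a generic GStreamer receive pipeline for an H.264 stream over TCP looks roughly like:

gst-launch-1.0 tcpclientsrc host=192.168.1.1 port=5555 ! h264parse ! avdec_h264 ! videoconvert ! autovideosink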
