RTSP streaming on Android client using FFMpeg


Question

I am working on a hobby project the goal for which is to develop an Android application capable of streaming live feeds captured through web cams in a LAN setting using FFMpeg as the underlying engine. So far, I did the following -

A. Compiling and generating FFMpeg-related libraries for the following releases -

FFMpeg version: 2.0
NDK version: r8e & r9
Android Platform version: android-16 & android-18
Toolchain version: 4.6 & 4.8
Platform built on: Fedora 18 (x86_64)

B. Creating the files Android.mk & Application.mk in the appropriate paths.
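For context, the two build files for a setup like this typically look something along these lines. This is a sketch, not the asker's actual files: the module names, paths, and ABI choices below are illustrative assumptions.

```makefile
# Android.mk (jni/Android.mk)
LOCAL_PATH := $(call my-dir)

# Prebuilt FFmpeg shared library produced by the cross-compile step
include $(CLEAR_VARS)
LOCAL_MODULE := libffmpeg
LOCAL_SRC_FILES := prebuilt/libffmpeg.so
include $(PREBUILT_SHARED_LIBRARY)

# Thin JNI wrapper that the Java layer loads via System.loadLibrary()
include $(CLEAR_VARS)
LOCAL_MODULE := player-jni
LOCAL_SRC_FILES := player-jni.c
LOCAL_C_INCLUDES := $(LOCAL_PATH)/include
LOCAL_SHARED_LIBRARIES := libffmpeg
LOCAL_LDLIBS := -llog
include $(BUILD_SHARED_LIBRARY)
```

```makefile
# Application.mk (jni/Application.mk)
APP_ABI := armeabi-v7a
APP_PLATFORM := android-16
```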

However, when it came to writing the native code for accessing the appropriate functionality of FFMpeg from the application layer using Java, I got stuck on the following questions -

a) Which of FFMpeg's features do I need to make available from the native to the app layer for streaming real-time feeds?
b) In order to compile FFMpeg for Android, I followed this link. Are the compilation options sufficient for handling *.sdp streams, or do I need to modify them?
c) Do I need to make use of live555?

I am totally new to FFMpeg and Android application development, and this is going to be my first serious project for the Android platform. I have been searching for relevant tutorials dealing with RTSP streaming using FFMpeg for a while now without much success. Moreover, I tried the latest development build of VLC player and found it to be great for streaming real-time feeds. However, it's a complex beast, and my project's goal is quite limited in nature: mostly learning, in a short time span.

Could you suggest some pointers (e.g. links, documents, or sample code) on how I can write the native code for utilizing the FFMpeg library and subsequently use that functionality from the app layer for streaming real-time feeds? Moreover, I would really appreciate it if you could let me know the kind of background knowledge necessary for this project from a functional standpoint (in a language-agnostic sense).

Answer

I was in a similar situation some time ago (I wanted to stream an mp3 from an RTMP server) and it was extremely frustrating. However, I managed to scrape together some code that actually did what it was supposed to. Some pointers:


  • You don't want to expose ffmpeg's API to your Java code. Instead, consider creating helper functions like openRTSPStream(String url) and keep the ffmpeg stuff in your C/C++ code. I say this because ffmpeg makes heavy use of pointers and dynamic memory allocation that would make it a pain to try and use it from Java.
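A helper like that could be sketched as the following JNI entry point. This is an assumed layout, not code from the answer: the package and class names (`com.example.streamer.Player`) are hypothetical, and it targets the FFmpeg 2.0-era API mentioned in the question (hence `av_register_all()`), so it needs the NDK and the cross-compiled FFmpeg headers/libraries to build.

```c
#include <jni.h>
#include <libavformat/avformat.h>

static AVFormatContext *fmt_ctx = NULL;

/* Hypothetical JNI entry point; in Java this would be declared as
 * `public native int openRTSPStream(String url);` in com.example.streamer.Player. */
JNIEXPORT jint JNICALL
Java_com_example_streamer_Player_openRTSPStream(JNIEnv *env, jobject thiz,
                                                jstring jurl)
{
    const char *url = (*env)->GetStringUTFChars(env, jurl, NULL);
    int ret;

    av_register_all();        /* register demuxers/decoders (FFmpeg 2.0 API) */
    avformat_network_init();  /* required before opening network streams */

    ret = avformat_open_input(&fmt_ctx, url, NULL, NULL);
    if (ret == 0)
        ret = avformat_find_stream_info(fmt_ctx, NULL);

    (*env)->ReleaseStringUTFChars(env, jurl, url);
    return ret;  /* 0 on success, negative AVERROR code on failure */
}
```

The Java side only ever sees plain `int`/`String`/`byte[]` values this way; all the pointer-heavy FFmpeg state stays on the C side.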

  • The script you used to compile the library uses the flag --disable-everything, which also means that it probably disables RTSP support. I'd recommend that you either remove that flag or run the configure script with --list-protocols, --list-demuxers, --list-muxers, --list-encoders, and --list-decoders (or something along those lines) to get an idea of what you need to enable. You need to keep in mind the format and encoding of the video and the audio, and what you will be decoding them to.
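Concretely, that inspect-then-enable workflow might look like the sketch below. The specific components re-enabled here are illustrative guesses for a typical H.264/AAC RTSP camera; match them to what your cameras actually emit.

```shell
# First, see what this FFmpeg source tree can support at all
./configure --list-protocols
./configure --list-demuxers
./configure --list-decoders

# Then selectively re-enable what RTSP playback needs on top of
# --disable-everything (append your existing cross-compile flags)
./configure --disable-everything \
    --enable-protocol=rtp --enable-protocol=udp --enable-protocol=tcp \
    --enable-demuxer=rtsp --enable-demuxer=sdp \
    --enable-decoder=h264 --enable-decoder=aac
```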

  • While you are reading packets from the stream, your native code can send buffers to your Java code through a callback function, which would in turn display the buffers as video/audio.

Here is another SO post that might interest you: Record RTSP stream with FFmpeg libavformat

Let me know if you need some sample code or further clarification.
