Alternative to ffmpeg for iOS
Question
I've been trying to implement ffmpeg in my iOS app for several weeks. I can now play several AVI files, but other formats such as FLV, WMA, and MP4 play slowly.
I have spent a lot of time with ffmpeg and OpenGL and haven't found a solution.
I'm looking for alternatives for playing these files on an iOS device.
Does anyone know of other libraries or frameworks I could use to play these files? It doesn't matter if they require a commercial license.
Thanks a lot,
Edited:
Init Shader:
shader = [[GLShader alloc] initWithFileName:@"render" attributes:[NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:0], @"position",
[NSNumber numberWithInt:1], @"texCoords", nil]
uniforms:[NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithInt:0], @"sampler0",
[NSNumber numberWithInt:0], @"viewProjectionMatrix",nil]];
render.fsv:
uniform sampler2D sampler0;
varying highp vec2 _texcoord;
void main()
{ gl_FragColor = texture2D(sampler0, _texcoord);}
render.vsf:
attribute vec4 position;
attribute vec2 texCoords;
varying vec4 colorVarying;
varying vec2 _texcoord;
uniform mat4 viewProjectionMatrix;
void main()
{ _texcoord = texCoords;
gl_Position = viewProjectionMatrix * position;}
How I can implement your solution in this code?
I've experienced similar problems. There were two bottlenecks:
- decoding
- conversion from YUV to RGB
I solved the second problem by doing the conversion in a shader. It now runs very fast (I can render 6 videos simultaneously at 30 fps on an iPad 2).
Here is the relevant part of the fragment shader (I've renamed the samplers here, since GLSL won't let a sampler uniform and a local float share the same name):
uniform sampler2D y_plane;
uniform sampler2D u_plane;
uniform sampler2D v_plane;
...
highp float y = texture2D(y_plane, vec2(nx, ny)).r;
highp float u = texture2D(u_plane, vec2(nx, ny)).r - 0.5;
highp float v = texture2D(v_plane, vec2(nx, ny)).r - 0.5;
highp float r = y + 1.13983 * v;
highp float g = y - 0.39465 * u - 0.58060 * v;
highp float b = y + 2.03211 * u;
gl_FragColor = vec4(r, g, b, 1.0);
NOTE: you have to store the Y, U, and V planes in 3 separate textures.
nx and ny are the normalized texture coordinates (ranging from 0 to 1 across the texture).
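If you want to sanity-check the shader's output on the CPU, here is a minimal C sketch of the same conversion with the same coefficients. The function name and the 8-bit input convention are my own assumptions for illustration, not part of the original shader:

```c
/* Hypothetical helper: converts one pixel from 8-bit Y'UV plane values to
 * normalized RGB, using the same coefficients as the fragment shader above.
 * y8, u8, v8 are raw plane bytes (0..255); r, g, b come out in roughly 0..1
 * (unclamped, exactly as the shader computes them before output). */
static void yuv_to_rgb(unsigned char y8, unsigned char u8, unsigned char v8,
                       float *r, float *g, float *b)
{
    float y = y8 / 255.0f;          /* matches texture2D(...).r        */
    float u = u8 / 255.0f - 0.5f;   /* shader subtracts 0.5 from U, V  */
    float v = v8 / 255.0f - 0.5f;

    *r = y + 1.13983f * v;
    *g = y - 0.39465f * u - 0.58060f * v;
    *b = y + 2.03211f * u;
}
```

A quick check: a pixel with U = V = 128 (neutral chroma) should come out nearly gray, with r ≈ g ≈ b ≈ y.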