Android processing a video, YCrCb frames to video


Problem description

I want to process a video that I record on an Android phone by changing the color of the pixels. I know how to do the processing that I want, but I don't know the best way of putting everything together.

So far the things that I can do are:


  • Get individual frames from the video as YCrCb (YUV)

  • Modify the YCrCb bytes however I want

  • Convert the YCrCb to an RGB array

  • Convert the RGB array to a Bitmap (a sketch of these last two steps is below)
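
For reference, a minimal sketch of those last two steps, assuming the frame uses the NV21 (YCrCb 4:2:0, V before U) layout that Android camera preview frames typically use; the class and method names here are illustrative, not from any particular library:

    import android.graphics.Bitmap;

    public class Nv21Converter {

        // Converts an NV21 byte array (Y plane followed by interleaved V/U
        // samples) into an ARGB_8888 Bitmap.
        public static Bitmap nv21ToBitmap(byte[] nv21, int width, int height) {
            int[] argb = new int[width * height];
            int frameSize = width * height;

            for (int j = 0; j < height; j++) {
                for (int i = 0; i < width; i++) {
                    int y = (nv21[j * width + i] & 0xFF) - 16;
                    if (y < 0) y = 0;
                    // Chroma is subsampled 2x2; V comes before U in NV21.
                    int uvIndex = frameSize + (j >> 1) * width + (i & ~1);
                    int v = (nv21[uvIndex] & 0xFF) - 128;
                    int u = (nv21[uvIndex + 1] & 0xFF) - 128;

                    // Standard BT.601 YCrCb -> RGB conversion, clamped to [0, 255].
                    int r = (int) (1.164f * y + 1.596f * v);
                    int g = (int) (1.164f * y - 0.813f * v - 0.391f * u);
                    int b = (int) (1.164f * y + 2.018f * u);
                    r = Math.max(0, Math.min(255, r));
                    g = Math.max(0, Math.min(255, g));
                    b = Math.max(0, Math.min(255, b));

                    argb[j * width + i] = 0xFF000000 | (r << 16) | (g << 8) | b;
                }
            }
            return Bitmap.createBitmap(argb, width, height, Bitmap.Config.ARGB_8888);
        }
    }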

How can I put everything together and save it back as a video that I can play back on Android? Can I go straight from the YCrCb information? Or do I need to convert to RGB or some other format first?

I don't need to do this in real time; the aim would be to record a video, save it, then process it and save the processed video. However, I do have speed in mind. Are there libraries that I can use to speed up the process? I know Android provides use of OpenGL, and I've read about ffmpeg, but I'm unsure of the best path to take.

If someone could point me in the right direction I would appreciate it! =D

Thanks! =D

Accepted answer

The frames that you get from the camera are uncompressed. This is typically not the way video is stored, since uncompressed frames are too big to be read and displayed in real time by a player. Instead, you want to encode your YUV or RGB frames, so that the frames are much smaller and the I/O load during playback is small.

The encoding format that you use must be one that your device can decode in hardware, as decoding on the CPU is also prohibitively expensive. These days you should be fine if you use H.264 for the video and AAC for the audio, in an MP4 container. Unfortunately, if you include H.264/AAC encoders (even open-source ones) in your app and plan to make money from it, then you may get a visit from the MPEG LA, who may want to collect from you, since both H.264 and AAC require licensing. I believe this is avoided if you use a codec that is already licensed and installed on the device, like the one used by the camera app to record. I do not know if the Android platform allows you to access the camera codecs.
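
One option, if the platform version supports it, is the MediaCodec API (with MediaMuxer for MP4 output) that later Android releases provide for exactly this purpose. The following is only a rough sketch, assuming API 21+ names, a hardware H.264 encoder that accepts semi-planar YUV input, and with end-of-stream signalling and error handling omitted:

    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;
    import android.media.MediaMuxer;
    import java.nio.ByteBuffer;

    // Rough sketch: feeds raw YUV420 semi-planar frames to the device's
    // hardware H.264 encoder and writes the result into an MP4 file.
    public class FrameEncoder {
        public static void encode(byte[][] yuvFrames, int width, int height,
                                  String outputPath) throws Exception {
            MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
            format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                    MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
            format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
            format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
            format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

            MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
            encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            encoder.start();

            MediaMuxer muxer = new MediaMuxer(outputPath,
                    MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            int track = -1;
            boolean muxing = false;
            long ptsUs = 0;

            for (byte[] frame : yuvFrames) {
                int in = encoder.dequeueInputBuffer(10_000);
                if (in >= 0) {
                    ByteBuffer buf = encoder.getInputBuffer(in);
                    buf.clear();
                    buf.put(frame);
                    encoder.queueInputBuffer(in, 0, frame.length, ptsUs, 0);
                    ptsUs += 1_000_000L / 30;            // 30 fps timestamps
                }
                // Drain whatever compressed output is ready.
                int out = encoder.dequeueOutputBuffer(info, 10_000);
                while (out >= 0) {
                    if (!muxing) {                        // first output: start the muxer
                        track = muxer.addTrack(encoder.getOutputFormat());
                        muxer.start();
                        muxing = true;
                    }
                    if ((info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) == 0
                            && info.size > 0) {
                        muxer.writeSampleData(track, encoder.getOutputBuffer(out), info);
                    }
                    encoder.releaseOutputBuffer(out, false);
                    out = encoder.dequeueOutputBuffer(info, 0);
                }
            }
            encoder.stop();
            encoder.release();
            if (muxing) muxer.stop();
            muxer.release();
        }
    }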

As far as libraries go, I think libavcodec and libavformat from the ffmpeg project are the most complete. With these libs you can encode H.264/AAC/MP4 files, assuming the licensing problem doesn't affect you.
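
As an illustration only (the resolution, frame rate, and file names below are placeholders), the ffmpeg command-line front end to those libraries can turn raw NV21 frames dumped to a file into an H.264/MP4 video with something like:

    ffmpeg -f rawvideo -pix_fmt nv21 -s 640x480 -r 30 -i frames.yuv -c:v libx264 -pix_fmt yuv420p out.mp4

Using libavcodec/libavformat directly from your app does the same thing programmatically.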

