Capture camera frames in background (Android)

Question

My problem is this: I want a background service, that will obtain frames from the camera in real-time, so that I can analyze them. I've seen a lot of similar topics here that supposedly address this issue, but none of them has really worked in my case.

My first attempt was to create an Activity that started a Service. Inside the service I created a SurfaceView, got its holder, and implemented a callback on it in which I prepared the camera and everything. Then, in the onPreviewFrame method of a PreviewCallback, I could start a new thread to analyze the incoming data.

That worked well enough while the service was in the foreground, but as soon as I opened another application (with the service still running in the background), the preview was gone, so I couldn't get frames from it.

Searching on the internet, I found out I could perhaps solve this with SurfaceTexture. So I created an Activity which would start my service, like this:

public class SurfaceTextureActivity extends Activity {

    public static TextureView mTextureView;

    public static Vibrator mVibrator;
    public static GLSurfaceView mGLView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        mGLView = new GLSurfaceView(this);
        mTextureView = new TextureView(this);
        setContentView(mTextureView);

        try {
            Intent intent = new Intent(SurfaceTextureActivity.this, RecorderService.class);
            intent.putExtra(RecorderService.INTENT_VIDEO_PATH, "/folder-path/");
            startService(intent);
            Log.i("ABC", "Start Service " + this.toString() + " + " + mTextureView.toString()
                    + " + " + getWindowManager().toString());
        } catch (Exception e) {
            Log.i("ABC", "Exc SurfaceTextureActivity: " + e.getMessage());
        }
    }

}

And then I made the RecorderService implement SurfaceTextureListener, so that I could open the camera and do the other preparations, and then perhaps capture the frames. My RecorderService currently looks like this:

public class RecorderService extends Service implements TextureView.SurfaceTextureListener, SurfaceTexture.OnFrameAvailableListener {

    private Camera mCamera = null;
    private TextureView mTextureView;
    private SurfaceTexture mSurfaceTexture;
    private float[] mTransformMatrix;

    private static IMotionDetection detector = null;
    public static Vibrator mVibrator;

    @Override
    public void onCreate() {
        try {
            mTextureView = SurfaceTextureActivity.mTextureView;
            mTextureView.setSurfaceTextureListener(this);

            Log.i("ABC", "onCreate");

//          startForeground(START_STICKY, new Notification()); - doesn't work

        } catch (Exception e) {
            Log.i("ABC", "onCreate exception " + e.getMessage());
            e.printStackTrace();
        }
    }

    @Override
    public void onFrameAvailable(SurfaceTexture surfaceTexture) {
        //How do I obtain frames?!
//      SurfaceTextureActivity.mGLView.queueEvent(new Runnable() {
//
//          @Override
//          public void run() {
//              mSurfaceTexture.updateTexImage();
//
//          }
//      });
    }

    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        mSurfaceTexture = surface;
        mSurfaceTexture.setOnFrameAvailableListener(this);
        mVibrator = (Vibrator) this.getSystemService(VIBRATOR_SERVICE);

        detector = new RgbMotionDetection();

        // Find the front-facing camera; falls back to id 0 if none is found.
        int cameraId = 0;
        Camera.CameraInfo info = new Camera.CameraInfo();
        for (cameraId = 0; cameraId < Camera.getNumberOfCameras(); cameraId++) {
            Camera.getCameraInfo(cameraId, info);
            if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT)
                break;
        }

        mCamera = Camera.open(cameraId);
        Matrix transform = new Matrix();

        Camera.Size previewSize = mCamera.getParameters().getPreviewSize();
        int rotation = ((WindowManager) getSystemService(Context.WINDOW_SERVICE))
                .getDefaultDisplay().getRotation();
        Log.i("ABC", "onSurfaceTextureAvailable(): CameraOrientation(" + cameraId + ")"
                + info.orientation + " " + previewSize.width + "x" + previewSize.height
                + " Rotation=" + rotation);

        try {
            // Match the preview orientation and mirroring to the display rotation.
            switch (rotation) {
                case Surface.ROTATION_0:
                    mCamera.setDisplayOrientation(90);
                    mTextureView.setLayoutParams(new FrameLayout.LayoutParams(
                            previewSize.height, previewSize.width, Gravity.CENTER));
                    transform.setScale(-1, 1, previewSize.height / 2, 0);
                    break;

                case Surface.ROTATION_90:
                    mCamera.setDisplayOrientation(0);
                    mTextureView.setLayoutParams(new FrameLayout.LayoutParams(
                            previewSize.width, previewSize.height, Gravity.CENTER));
                    transform.setScale(-1, 1, previewSize.width / 2, 0);
                    break;

                case Surface.ROTATION_180:
                    mCamera.setDisplayOrientation(270);
                    mTextureView.setLayoutParams(new FrameLayout.LayoutParams(
                            previewSize.height, previewSize.width, Gravity.CENTER));
                    transform.setScale(-1, 1, previewSize.height / 2, 0);
                    break;

                case Surface.ROTATION_270:
                    mCamera.setDisplayOrientation(180);
                    mTextureView.setLayoutParams(new FrameLayout.LayoutParams(
                            previewSize.width, previewSize.height, Gravity.CENTER));
                    transform.setScale(-1, 1, previewSize.width / 2, 0);
                    break;
            }

            mCamera.setPreviewTexture(mSurfaceTexture);

            Log.i("ABC", "onSurfaceTextureAvailable(): Transform: " + transform.toString());

            mCamera.startPreview();
//          mTextureView.setVisibility(0);

            mCamera.setPreviewCallback(new PreviewCallback() {

                @Override
                public void onPreviewFrame(byte[] data, Camera camera) {
                    if (data == null) return;
                    Camera.Size size = mCamera.getParameters().getPreviewSize();
                    if (size == null) return;

                    // This is where I start my thread that analyzes images
                    DetectionThread thread = new DetectionThread(data, size.width, size.height);
                    thread.start();
                }
            });
        } catch (Exception t) {
            Log.i("ABC", "onSurfaceTextureAvailable Exception: " + t.getMessage());
        }
    }

    // (Remaining TextureView.SurfaceTextureListener methods omitted.)
}

However, just as in the other case, my analyzing thread is only started from inside onSurfaceTextureAvailable, which fires only while the texture exists; so when I open another application, the frame capturing stops.

Some ideas have suggested that it's possible, but I just don't know how. One idea was that I could implement SurfaceTexture.OnFrameAvailableListener and then, once a new frame is available, queue a runnable on the render thread (GLSurfaceView.queueEvent(..)) that calls SurfaceTexture.updateTexImage(). That's what I've tried (it's commented out in my code), but it doesn't work; the application crashes if I do it.
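For reference, the pattern described above would look roughly like the sketch below. A likely reason for the crash: per the SurfaceTexture documentation, updateTexImage() may only be called on a thread that owns the GL context the texture is attached to, and a SurfaceTexture obtained from a TextureView is already attached to that view's rendering context, so calling updateTexImage() from the GLSurfaceView's render thread fails unless the texture is first detached and re-attached there. The detach/attach calls shown as comments are from android.graphics.SurfaceTexture; whether they apply cleanly to a TextureView-owned texture is an assumption here, not something confirmed in the question.

```java
@Override
public void onFrameAvailable(SurfaceTexture surfaceTexture) {
    // Run on the GLSurfaceView's render thread, which owns a GL context.
    SurfaceTextureActivity.mGLView.queueEvent(new Runnable() {
        @Override
        public void run() {
            // One-time setup would be needed before the first call
            // (guarded by a flag in real code; texName is a GL texture id):
            // mSurfaceTexture.detachFromGLContext();
            // mSurfaceTexture.attachToGLContext(texName);

            mSurfaceTexture.updateTexImage(); // latch the newest camera frame
        }
    });
}
```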

What else can I possibly do? I know that this can work somehow, because I've seen it used in apps like SpyCameraOS (yes, I know it's open-source and I've looked at the code, but I couldn't make a working solution), and I feel like I'm missing just a small piece somewhere, but I have no idea what I'm doing wrong. I've been at this for the past 3 days, and no success.

Any help would be greatly appreciated.

Answer

Summarizing the comments: direct the output of the Camera to a SurfaceTexture that isn't tied to a View object. A TextureView will be destroyed when the activity is paused, freeing its SurfaceTexture, but if you create a separate SurfaceTexture (or detach the one from the TextureView) then it won't be affected by changes in Activity state. The texture can be rendered to an off-screen Surface, from which pixels can be read.

Various examples can be found in Grafika.
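A minimal sketch of the view-independent approach with the old android.hardware.Camera API, under these assumptions: the class name BackgroundCaptureService is illustrative, the GL texture name passed to the SurfaceTexture constructor (10 here) is arbitrary since nothing is ever rendered, and frames are pulled as NV21 byte arrays via setPreviewCallback rather than read back from GL.

```java
public class BackgroundCaptureService extends Service {

    private Camera mCamera;
    private SurfaceTexture mDummyTexture;

    @Override
    public void onCreate() {
        super.onCreate();
        try {
            // A SurfaceTexture not owned by any View: it survives Activity
            // lifecycle changes, so the preview keeps running in background.
            mDummyTexture = new SurfaceTexture(10);
            mCamera = Camera.open();
            mCamera.setPreviewTexture(mDummyTexture);
            mCamera.setPreviewCallback(new Camera.PreviewCallback() {
                @Override
                public void onPreviewFrame(byte[] data, Camera camera) {
                    // NV21 preview frames arrive here with no visible preview;
                    // hand them to the analysis thread.
                }
            });
            mCamera.startPreview();
        } catch (IOException e) {
            Log.e("ABC", "Could not start background capture", e);
        }
    }

    @Override
    public void onDestroy() {
        if (mCamera != null) {
            mCamera.setPreviewCallback(null);
            mCamera.stopPreview();
            mCamera.release();
        }
        super.onDestroy();
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }
}
```

Note that newer Android versions restrict camera access from background processes, so on current devices this would additionally need to run as a foreground service with the appropriate permissions.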
