Android camera2 output to ImageReader format YUV_420_888 still slow

Problem description

I'm trying to get the Android camera2 API running in a background service and then process each frame in the ImageReader.OnImageAvailableListener callback. I already use the suggested raw format YUV_420_888 to get the maximum fps, yet I only get around 7 fps at 640x480. That is even slower than what I got with the old Camera interface (I wanted to upgrade to Camera2 for higher fps) or with OpenCV's JavaCameraView (which I can't use, because I need the processing to run in a background service).

Below is my service class. What am I missing?

My phone is a Redmi Note 3 running Android 5.0.2.

import android.app.Service;
import android.content.Context;
import android.content.Intent;
import android.graphics.Bitmap;
import android.graphics.ImageFormat;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.cam2.CameraCharacteristics;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.media.Image;
import android.media.ImageReader;
import android.os.IBinder;
import android.os.ResultReceiver;
import android.os.SystemClock;
import android.support.annotation.NonNull;
import android.util.Log;
import android.util.Size;

import org.opencv.android.Utils;
import org.opencv.core.Mat;
import org.opencv.core.Point;
import org.opencv.core.Scalar;
import org.opencv.imgproc.Imgproc;

import java.nio.ByteBuffer;
import java.util.Arrays;
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;

import static org.opencv.core.CvType.CV_8UC1;

public class Camera2ServiceYUV extends Service {
    protected static final String TAG = "VideoProcessing";
    protected static final int CAMERACHOICE = CameraCharacteristics.LENS_FACING_BACK;
    protected CameraDevice cameraDevice;
    protected CameraCaptureSession captureSession;
    protected ImageReader imageReader;

    // A semaphore to prevent the app from exiting before closing the camera.
    private Semaphore mCameraOpenCloseLock = new Semaphore(1);


    public static final String RESULT_RECEIVER = "resultReceiver";
    private static final int JPEG_COMPRESSION = 90;

    public static final int RESULT_OK = 0;
    public static final int RESULT_DEVICE_NO_CAMERA = 1;
    public static final int RESULT_GET_CAMERA_FAILED = 2;
    public static final int RESULT_ALREADY_RUNNING = 3;
    public static final int RESULT_NOT_RUNNING = 4;

    private static final String START_SERVICE_COMMAND = "startServiceCommands";
    private static final int COMMAND_NONE = -1;
    private static final int COMMAND_START = 0;
    private static final int COMMAND_STOP = 1;

    private boolean mRunning = false;
    public Camera2ServiceYUV() {
    }

    public static void startToStart(Context context, ResultReceiver resultReceiver) {
        Intent intent = new Intent(context, Camera2ServiceYUV.class);
        intent.putExtra(START_SERVICE_COMMAND, COMMAND_START);
        intent.putExtra(RESULT_RECEIVER, resultReceiver);
        context.startService(intent);
    }

    public static void startToStop(Context context, ResultReceiver resultReceiver) {
        Intent intent = new Intent(context, Camera2ServiceYUV.class);
        intent.putExtra(START_SERVICE_COMMAND, COMMAND_STOP);
        intent.putExtra(RESULT_RECEIVER, resultReceiver);
        context.startService(intent);
    }

    // SERVICE INTERFACE
    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        if (intent == null) {
            // A START_STICKY restart redelivers a null Intent.
            return START_STICKY;
        }
        switch (intent.getIntExtra(START_SERVICE_COMMAND, COMMAND_NONE)) {
            case COMMAND_START:
                startCamera(intent);
                break;
            case COMMAND_STOP:
                stopCamera(intent);
                break;
            default:
                throw new UnsupportedOperationException("Cannot start the camera service with an illegal command.");
        }

        return START_STICKY;
    }

    @Override
    public void onDestroy() {
        // Guard against a restart where the session was never created.
        if (captureSession != null) {
            try {
                captureSession.abortCaptures();
            } catch (CameraAccessException e) {
                Log.e(TAG, e.getMessage());
            }
            captureSession.close();
            captureSession = null;
        }
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }


    // CAMERA2 INTERFACE
    /**
     * 1. The Android CameraManager class manages all the camera devices available
     * on the device. Each camera device has a range of properties and settings
     * that describe it, obtainable through its CameraCharacteristics.
     */
    public void startCamera(Intent intent) {

        final ResultReceiver resultReceiver = intent.getParcelableExtra(RESULT_RECEIVER);

        if (mRunning) {
            resultReceiver.send(RESULT_ALREADY_RUNNING, null);
            return;
        }
        mRunning = true;

        CameraManager manager = (CameraManager) getSystemService(CAMERA_SERVICE);
        try {
            if (!mCameraOpenCloseLock.tryAcquire(2500, TimeUnit.MILLISECONDS)) {
                throw new RuntimeException("Time out waiting to lock camera opening.");
            }
            String pickedCamera = getCamera(manager);
            Log.e(TAG,"AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA " + pickedCamera);
            manager.openCamera(pickedCamera, cameraStateCallback, null);
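            // With a null Handler the state callback is posted to this thread's
            // Looper (the main thread), so onOpened() cannot run until
            // onStartCommand() returns, by which point the imageReader created
            // below already exists.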
            CameraCharacteristics characteristics = manager.getCameraCharacteristics(pickedCamera);
            Size[] jpegSizes = null;
            if (characteristics != null) {
                jpegSizes = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP).getOutputSizes(ImageFormat.YUV_420_888);
            }
            int width = 640;
            int height = 480;
//            if (jpegSizes != null && 0 < jpegSizes.length) {
//                width = jpegSizes[jpegSizes.length -1].getWidth();
//                height = jpegSizes[jpegSizes.length - 1].getHeight();
//            }
//            for(Size s : jpegSizes)
//            {
//                Log.e(TAG,"Size = " + s.toString());
//            }


            // DEBUG
            StreamConfigurationMap map = characteristics.get(
                    CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            if (map == null) {
                resultReceiver.send(RESULT_GET_CAMERA_FAILED, null);
                return;
            }
            Log.e(TAG,"Width = " + width + ", Height = " + height);
            Log.e(TAG,"output stall duration = " + map.getOutputStallDuration(ImageFormat.YUV_420_888, new Size(width,height)) );
            Log.e(TAG,"Min output stall duration = " + map.getOutputMinFrameDuration(ImageFormat.YUV_420_888, new Size(width,height)) );

//            Size[] sizeList = map.getInputSizes(ImageFormat.YUV_420_888);
//            for(Size s : sizeList)
//            {
//                Log.e(TAG,"Size = " + s.toString());
//            }

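            // A null Handler here likewise delivers onImageAvailable on the
            // main thread's Looper.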
            imageReader = ImageReader.newInstance(width, height, ImageFormat.YUV_420_888, 2 /* images buffered */);
            imageReader.setOnImageAvailableListener(onImageAvailableListener, null);
            Log.i(TAG, "imageReader created");
        } catch (CameraAccessException e) {
            Log.e(TAG, e.getMessage());
            resultReceiver.send(RESULT_DEVICE_NO_CAMERA, null);
            return;
        } catch (InterruptedException e) {
            resultReceiver.send(RESULT_GET_CAMERA_FAILED, null);
            throw new RuntimeException("Interrupted while trying to lock camera opening.", e);
        } catch (SecurityException se) {
            resultReceiver.send(RESULT_GET_CAMERA_FAILED, null);
            throw new RuntimeException("Security permission exception while trying to open the camera.", se);
        }

        resultReceiver.send(RESULT_OK, null);
    }

    // We can pick the camera being used, i.e. rear camera in this case.
    private String getCamera(CameraManager manager) {
        try {
            for (String cameraId : manager.getCameraIdList()) {
                CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
                Integer facing = characteristics.get(CameraCharacteristics.LENS_FACING);
                if (facing != null && facing == CAMERACHOICE) {
                    return cameraId;
                }
            }
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
        return null;
    }


    /**
     * 1.1 Callbacks when the camera changes its state - opened, disconnected, or error.
     */
    protected CameraDevice.StateCallback cameraStateCallback = new CameraDevice.StateCallback() {
        @Override
        public void onOpened(@NonNull CameraDevice camera) {
            Log.i(TAG, "CameraDevice.StateCallback onOpened");
            mCameraOpenCloseLock.release();
            cameraDevice = camera;
            createCaptureSession();
        }

        @Override
        public void onDisconnected(@NonNull CameraDevice camera) {
            Log.w(TAG, "CameraDevice.StateCallback onDisconnected");
            mCameraOpenCloseLock.release();
            camera.close();
            cameraDevice = null;
        }

        @Override
        public void onError(@NonNull CameraDevice camera, int error) {
            Log.e(TAG, "CameraDevice.StateCallback onError " + error);
            mCameraOpenCloseLock.release();
            camera.close();
            cameraDevice = null;
        }
    };


    /**
     * 2. To capture or stream images from a camera device, the application must
     * first create a CameraCaptureSession. The session needs a surface to output
     * what has been captured; here we use an ImageReader so we can access the
     * frame data directly.
     */
    public void createCaptureSession() {
        try {
            cameraDevice.createCaptureSession(Arrays.asList(imageReader.getSurface()), sessionStateCallback, null);
        } catch (CameraAccessException e) {
            Log.e(TAG, e.getMessage());
        }
    }

    protected CameraCaptureSession.StateCallback sessionStateCallback = new CameraCaptureSession.StateCallback() {
        @Override
        public void onConfigured(@NonNull CameraCaptureSession session) {
            Log.i(TAG, "CameraCaptureSession.StateCallback onConfigured");

            // The camera is already closed
            if (null == cameraDevice) {
                return;
            }

            // When the captureSession is ready, we start to grab the frame.
            Camera2ServiceYUV.this.captureSession = session;

            try {
                session.setRepeatingRequest(createCaptureRequest(), null, null);
            } catch (CameraAccessException e) {
                Log.e(TAG, e.getMessage());
            }
        }

        @Override
        public void onConfigureFailed(@NonNull CameraCaptureSession session) {
            Log.e(TAG, "CameraCaptureSession.StateCallback onConfigureFailed");
        }
    };

    /**
     * 3. The application then needs to construct a CaptureRequest, which defines all the capture parameters
     *    needed by a camera device to capture a single image.
     */
    private CaptureRequest createCaptureRequest() {
        try {
            /**
             * Check other templates for further details.
             * TEMPLATE_MANUAL = 6
             * TEMPLATE_PREVIEW = 1
             * TEMPLATE_RECORD = 3
             * TEMPLATE_STILL_CAPTURE = 2
             * TEMPLATE_VIDEO_SNAPSHOT = 4
             * TEMPLATE_ZERO_SHUTTER_LAG = 5
             *
             * TODO: can set camera features like auto focus, auto flash here
             * captureRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE,CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
             */
            CaptureRequest.Builder captureRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
//            captureRequestBuilder.set(CaptureRequest.EDGE_MODE,
//                    CaptureRequest.EDGE_MODE_OFF);
//            captureRequestBuilder.set(
//                    CaptureRequest.LENS_OPTICAL_STABILIZATION_MODE,
//                    CaptureRequest.LENS_OPTICAL_STABILIZATION_MODE_ON);
//            captureRequestBuilder.set(
//                    CaptureRequest.COLOR_CORRECTION_ABERRATION_MODE,
//                    CaptureRequest.COLOR_CORRECTION_ABERRATION_MODE_OFF);
//            captureRequestBuilder.set(CaptureRequest.NOISE_REDUCTION_MODE,
//                    CaptureRequest.NOISE_REDUCTION_MODE_OFF);
//            captureRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER,
//                    CaptureRequest.CONTROL_AF_TRIGGER_CANCEL);
//
//            captureRequestBuilder.set(CaptureRequest.CONTROL_AE_LOCK, true);
//            captureRequestBuilder.set(CaptureRequest.CONTROL_AWB_LOCK, true);

            captureRequestBuilder.addTarget(imageReader.getSurface());
            return captureRequestBuilder.build();
        } catch (CameraAccessException e) {
            Log.e(TAG, e.getMessage());
            return null;
        }
    }


    /**
     * ImageReader provides a surface for the camera to output captured frames to.
     * When an image becomes available, processImage() can be called to handle it.
     */
    private long frameTime = 0;
    private ImageReader.OnImageAvailableListener onImageAvailableListener = new ImageReader.OnImageAvailableListener() {
        @Override
        public void onImageAvailable(ImageReader reader) {
            Log.i(TAG, "called ImageReader.OnImageAvailable");
            Image img = reader.acquireLatestImage();
            if (img != null) {
                if( frameTime != 0 )
                {
                    Log.e(TAG, "fps = " + (float)(1000.0 / (float)(SystemClock.elapsedRealtime() - frameTime)) + " fps");
                }
                frameTime = SystemClock.elapsedRealtime();
                img.close();
            }
        }
    };

    private void processImage(Image image) {
        // imageToMat() packs the frame as planar I420 in a single-channel Mat,
        // so convert it to RGBA before drawing on it and rendering to a Bitmap.
        Mat yuvImage = imageToMat(image);
        Mat outputImage = new Mat();
        Imgproc.cvtColor(yuvImage, outputImage, Imgproc.COLOR_YUV2RGBA_I420);
        Bitmap bmp = Bitmap.createBitmap(outputImage.cols(), outputImage.rows(), Bitmap.Config.ARGB_8888);
        Point mid = new Point(0, 0);
        Point inEnd = new Point(outputImage.cols(), outputImage.rows());
        Imgproc.line(outputImage, mid, inEnd, new Scalar(255, 0, 0), 2, Imgproc.LINE_AA, 0);
        Utils.matToBitmap(outputImage, bmp);

        Intent broadcast = new Intent();
        broadcast.setAction("your_load_photo_action");
        broadcast.putExtra("BitmapImage", bmp);
        sendBroadcast(broadcast);
    }

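    // Copies the three YUV_420_888 planes into one contiguous buffer in planar
    // I420 order (full-resolution Y, then quarter-resolution U, then V) and
    // wraps it in a single-channel Mat with height * 3/2 rows.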
    private Mat imageToMat(Image image) {
        ByteBuffer buffer;
        int rowStride;
        int pixelStride;
        int width = image.getWidth();
        int height = image.getHeight();
        int offset = 0;

        Image.Plane[] planes = image.getPlanes();
        byte[] data = new byte[image.getWidth() * image.getHeight() * ImageFormat.getBitsPerPixel(ImageFormat.YUV_420_888) / 8];
        byte[] rowData = new byte[planes[0].getRowStride()];

        for (int i = 0; i < planes.length; i++) {
            buffer = planes[i].getBuffer();
            rowStride = planes[i].getRowStride();
            pixelStride = planes[i].getPixelStride();
            int w = (i == 0) ? width : width / 2;
            int h = (i == 0) ? height : height / 2;
            for (int row = 0; row < h; row++) {
                int bytesPerPixel = ImageFormat.getBitsPerPixel(ImageFormat.YUV_420_888) / 8;
                if (pixelStride == bytesPerPixel) {
                    int length = w * bytesPerPixel;
                    buffer.get(data, offset, length);

                    // Advance buffer the remainder of the row stride, unless on the last row.
                    // Otherwise, this will throw an IllegalArgumentException because the buffer
                    // doesn't include the last padding.
                    if (h - row != 1) {
                        buffer.position(buffer.position() + rowStride - length);
                    }
                    offset += length;
                } else {

                    // On the last row only read the width of the image minus the pixel stride
                    // plus one. Otherwise, this will throw a BufferUnderflowException because the
                    // buffer doesn't include the last padding.
                    if (h - row == 1) {
                        buffer.get(rowData, 0, width - pixelStride + 1);
                    } else {
                        buffer.get(rowData, 0, rowStride);
                    }

                    for (int col = 0; col < w; col++) {
                        data[offset++] = rowData[col * pixelStride];
                    }
                }
            }
        }

        // Finally, create the Mat.
        Mat mat = new Mat(height + height / 2, width, CV_8UC1);
        mat.put(0, 0, data);

        return mat;
    }


    private void stopCamera(Intent intent) {
        ResultReceiver resultReceiver = intent.getParcelableExtra(RESULT_RECEIVER);

        if (!mRunning) {
            resultReceiver.send(RESULT_NOT_RUNNING, null);
            return;
        }

        closeCamera();

        resultReceiver.send(RESULT_OK, null);

        mRunning = false;
        Log.d(TAG, "Service is finished.");
    }

    /**
     * Closes the current {@link CameraDevice}.
     */
    private void closeCamera() {
        try {
            mCameraOpenCloseLock.acquire();
            if (null != captureSession) {
                captureSession.close();
                captureSession = null;
            }
            if (null != cameraDevice) {
                cameraDevice.close();
                cameraDevice = null;
            }
            if (null != imageReader) {
                imageReader.close();
                imageReader = null;
            }
        } catch (InterruptedException e) {
            throw new RuntimeException("Interrupted while trying to lock camera closing.", e);
        } finally {
            mCameraOpenCloseLock.release();
        }
    }
}

Answer

I bumped into this problem recently when I tried to upgrade my AR app from the camera1 to the camera2 API. I used a mid-range device for testing (a Meizu S6 with an Exynos 7872 CPU and a Mali-G71 GPU), and what I wanted was a steady 30 fps AR experience. But during the migration I found that it is quite tricky to get a decent preview frame rate out of the Camera2 API.

I used TEMPLATE_PREVIEW:

mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);

Then I put two surfaces: one for the preview, a SurfaceTexture at 1280x720, and an ImageReader, also at 1280x720, for image processing.

mImageReader = ImageReader.newInstance(
    mVideoSize.getWidth(),
    mVideoSize.getHeight(),
    ImageFormat.YUV_420_888,
    2);

List<Surface> surfaces = new ArrayList<>();
Surface previewSurface = new Surface(mSurfaceTexture);
surfaces.add(previewSurface);
mPreviewBuilder.addTarget(previewSurface);

Surface frameCaptureSurface = mImageReader.getSurface();
surfaces.add(frameCaptureSurface);
mPreviewBuilder.addTarget(frameCaptureSurface);

mPreviewBuilder.set(CaptureRequest.CONTROL_AF_MODE,
                    CameraMetadata.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
mPreviewSession.setRepeatingRequest(mPreviewBuilder.build(), captureCallback, mBackgroundHandler);

Everything works as expected: my TextureView gets updated and the frame callback gets called too. Except... the frame rate is about 10 fps, and I haven't even done any image processing yet.

I experimented with many Camera2 API settings, including SENSOR_FRAME_DURATION and different ImageFormat and size combinations, but none of them improved the frame rate. However, if I simply remove the ImageReader from the output surfaces, the preview easily reaches 30 fps!
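
For reference, the 30 fps baseline means configuring the session with the preview surface only. A minimal sketch, reusing the fields from the snippet above (mSessionStateCallback is a placeholder name, not from my actual code):

List<Surface> surfaces = new ArrayList<>();
Surface previewSurface = new Surface(mSurfaceTexture);
surfaces.add(previewSurface);
mPreviewBuilder.addTarget(previewSurface);
// Preview is the only output target here; no ImageReader surface is attached.
mCameraDevice.createCaptureSession(surfaces, mSessionStateCallback, mBackgroundHandler);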

So I guess the problem is that adding an ImageReader as a Camera2 output surface drastically decreases the preview frame rate, at least in my case. So what is the solution?

My solution is glReadPixels.

I know glReadPixels is one of those evil calls, because it copies bytes from the GPU back to main memory and forces OpenGL to flush its pending draw commands, so for performance's sake we'd better avoid it. But surprisingly, glReadPixels is actually pretty fast and gives a much better frame rate than ImageReader's YUV_420_888 output.
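
A minimal sketch of the readback step, using android.opengl.GLES20 and java.nio buffers; readWidth, readHeight and pixelBuf are illustrative names, and the camera frame is assumed to be already rendered into the currently bound framebuffer (for example from a SurfaceTexture-backed GL_TEXTURE_EXTERNAL_OES texture):

// Readback sketch: the frame must already be drawn into the bound framebuffer.
// readWidth/readHeight/pixelBuf are assumed names, not from my actual code.
ByteBuffer pixelBuf = ByteBuffer.allocateDirect(readWidth * readHeight * 4)
        .order(ByteOrder.nativeOrder());
GLES20.glReadPixels(0, 0, readWidth, readHeight,
        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, pixelBuf);
// pixelBuf now holds the RGBA frame in CPU memory, ready for processing.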

In addition, to reduce the memory overhead, I make another draw call into a smaller framebuffer, e.g. 360x640 instead of the preview's 720p, dedicated to feature detection.
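
A sketch of that smaller offscreen target might look like the following; the shader and the actual draw call are omitted, and the whole snippet illustrates the approach rather than reproducing my actual code:

// Create a small 360x640 offscreen framebuffer; drawing the camera texture
// into it before glReadPixels means the readback and the feature detector
// touch far fewer pixels than the 720p preview.
int[] fbo = new int[1];
int[] tex = new int[1];
GLES20.glGenFramebuffers(1, fbo, 0);
GLES20.glGenTextures(1, tex, 0);

GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, tex[0]);
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, 360, 640, 0,
        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo[0]);
GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
        GLES20.GL_TEXTURE_2D, tex[0], 0);
GLES20.glViewport(0, 0, 360, 640);
// ... draw the external OES camera texture here, then call glReadPixels ...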
