Camera2 video recording without preview on Android: mp4 output file not fully playable


Question

I am trying to record video from the back camera (the one that faces the face) on my Samsung Galaxy S6, which supports 1920x1080 at about 30 fps. I do not want to use any surface for previewing if I can avoid it, as the recording should just happen in the background.

I seem to have it working, but the output files are not correctly playable. On my Windows 10 PC, Windows Media Player shows the first frame and then plays only the audio; VLC does not show any of the frames. On my phone, the recorded file plays, but not fully: it holds the first frame for 5-8 seconds, then at the very end the time remaining drops to 0, the total time displayed changes, and only then do the actual video frames begin to play. On my Mac (10.9.5), QuickTime will not show the video (though it reports no errors), yet Google Picasa plays it perfectly. I wanted to try Picasa on my PC to see whether it worked there, but I could no longer download it, as Google Picasa has been sunset.


I tried installing a codec pack for Windows that I found, but that did not resolve anything. MediaInfo v0.7.85 reports this about the file:


General
Complete name               : C:\...\1465655479915.mp4
Format                      : MPEG-4
Format profile              : Base Media / Version 2
Codec ID                    : mp42 (isom/mp42)
File size                   : 32.2 MiB
Duration                    : 15s 744ms
Overall bit rate            : 17.1 Mbps
Encoded date                : UTC 2016-06-11 14:31:50
Tagged date                 : UTC 2016-06-11 14:31:50
com.android.version         : 6.0.1

Video
ID                          : 1
Format                      : AVC
Format/Info                 : Advanced Video Codec
Format profile              : High@L4
Format settings, CABAC      : Yes
Format settings, ReFrames   : 1 frame
Format settings, GOP        : M=1, N=30
Codec ID                    : avc1
Codec ID/Info               : Advanced Video Coding
Duration                    : 15s 627ms
Bit rate                    : 16.2 Mbps
Width                       : 1 920 pixels
Height                      : 1 080 pixels
Display aspect ratio        : 16:9
Frame rate mode             : Variable
Frame rate                  : 0.000 (0/1000) fps
Minimum frame rate          : 0.000 fps
Maximum frame rate          : 30.540 fps
Color space                 : YUV
Chroma subsampling          : 4:2:0
Bit depth                   : 8 bits
Scan type                   : Progressive
Stream size                 : 0.00 Byte (0%)
Source stream size          : 31.7 MiB (98%)
Title                       : VideoHandle
Language                    : English
Encoded date                : UTC 2016-06-11 14:31:50
Tagged date                 : UTC 2016-06-11 14:31:50
mdhd_Duration               : 15627

Audio
ID                          : 2
Format                      : AAC
Format/Info                 : Advanced Audio Codec
Format profile              : LC
Codec ID                    : 40
Duration                    : 15s 744ms
Bit rate mode               : Constant
Bit rate                    : 256 Kbps
Channel(s)                  : 2 channels
Channel positions           : Front: L R
Sampling rate               : 48.0 KHz
Frame rate                  : 46.875 fps (1024 spf)
Compression mode            : Lossy
Stream size                 : 492 KiB (1%)
Title                       : SoundHandle
Language                    : English
Encoded date                : UTC 2016-06-11 14:31:50
Tagged date                 : UTC 2016-06-11 14:31:50


The code that I am using to create this is:

package invisiblevideorecorder;

import android.content.Context;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CameraMetadata;
import android.hardware.camera2.CaptureRequest;
import android.media.CamcorderProfile;
import android.media.MediaRecorder;
import android.os.Environment;
import android.os.Handler;
import android.os.HandlerThread;
import android.util.Log;
import android.view.Surface;

import java.io.File;
import java.io.IOException;
import java.util.Arrays;

/**
 * @author Mark
 * @since 6/10/2016
 */
public class InvisibleVideoRecorder {
    private static final String TAG = "InvisibleVideoRecorder";
    private final CameraCaptureSessionStateCallback cameraCaptureSessionStateCallback = new CameraCaptureSessionStateCallback();
    private final CameraDeviceStateCallback cameraDeviceStateCallback = new CameraDeviceStateCallback();
    private MediaRecorder mediaRecorder;
    private CameraManager cameraManager;
    private Context context;

    private CameraDevice cameraDevice;

    private HandlerThread handlerThread;
    private Handler handler;

    public InvisibleVideoRecorder(Context context) {
        this.context = context;
        handlerThread = new HandlerThread("camera");
        handlerThread.start();
        handler = new Handler(handlerThread.getLooper());

        try {
            mediaRecorder = new MediaRecorder();

            mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
            mediaRecorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);

            final String filename = context.getExternalFilesDir(Environment.DIRECTORY_MOVIES).getAbsolutePath() + File.separator + System.currentTimeMillis() + ".mp4";
            mediaRecorder.setOutputFile(filename);
            Log.d(TAG, "start: " + filename);

            // by using the profile, I don't think I need to do any of these manually:
//            mediaRecorder.setVideoEncodingBitRate(16000000);
//            mediaRecorder.setVideoFrameRate(30);
//            mediaRecorder.setCaptureRate(30);
//            mediaRecorder.setVideoSize(1920, 1080);
//            mediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP);
//            mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);

//            Log.d(TAG, "start: 1 " + CamcorderProfile.hasProfile(CameraMetadata.LENS_FACING_BACK, CamcorderProfile.QUALITY_1080P));
            // true
//            Log.d(TAG, "start: 2 " + CamcorderProfile.hasProfile(CameraMetadata.LENS_FACING_BACK, CamcorderProfile.QUALITY_HIGH_SPEED_1080P));
            // false
//            Log.d(TAG, "start: 3 " + CamcorderProfile.hasProfile(CameraMetadata.LENS_FACING_BACK, CamcorderProfile.QUALITY_HIGH));
            // true

            CamcorderProfile profile = CamcorderProfile.get(CameraMetadata.LENS_FACING_BACK, CamcorderProfile.QUALITY_1080P);
            Log.d(TAG, "start: profile " + ToString.inspect(profile));
//          start: 0 android.media.CamcorderProfile@114016694 {
//                audioBitRate: 256000
//                audioChannels: 2
//                audioCodec: 3
//                audioSampleRate: 48000
//                duration: 30
//                fileFormat: 2
//                quality: 6
//                videoBitRate: 17000000
//                videoCodec: 2
//                videoFrameHeight: 1080
//                videoFrameRate: 30
//                videoFrameWidth: 1920
//            }
            mediaRecorder.setOrientationHint(0);
            mediaRecorder.setProfile(profile);
            mediaRecorder.prepare();
        } catch (IOException e) {
            Log.d(TAG, "start: exception" + e.getMessage());
        }

    }

    public void start() {
        Log.d(TAG, "start: ");

        cameraManager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        try {
            cameraManager.openCamera(String.valueOf(CameraMetadata.LENS_FACING_BACK), cameraDeviceStateCallback, handler);
        } catch (CameraAccessException | SecurityException e) {
            Log.d(TAG, "start: exception " + e.getMessage());
        }

    }

    public void stop() {
        Log.d(TAG, "stop: ");
        mediaRecorder.stop();
        mediaRecorder.reset();
        mediaRecorder.release();
        cameraDevice.close();
        handlerThread.quitSafely(); // without quitting the looper, join() would block forever
        try {
            handlerThread.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }

    private class CameraCaptureSessionStateCallback extends CameraCaptureSession.StateCallback {
        private final static String TAG = "CamCaptSessionStCb";

        @Override
        public void onActive(CameraCaptureSession session) {
            Log.d(TAG, "onActive: ");
            super.onActive(session);
        }

        @Override
        public void onClosed(CameraCaptureSession session) {
            Log.d(TAG, "onClosed: ");
            super.onClosed(session);
        }

        @Override
        public void onConfigured(CameraCaptureSession session) {
            Log.d(TAG, "onConfigured: ");
        }

        @Override
        public void onConfigureFailed(CameraCaptureSession session) {
            Log.d(TAG, "onConfigureFailed: ");
        }

        @Override
        public void onReady(CameraCaptureSession session) {
            Log.d(TAG, "onReady: ");
            super.onReady(session);
            try {
                CaptureRequest.Builder builder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
                builder.addTarget(mediaRecorder.getSurface());
                CaptureRequest request = builder.build();
                session.setRepeatingRequest(request, null, handler);
                mediaRecorder.start();
            } catch (CameraAccessException e) {
                Log.d(TAG, "onReady: " + e.getMessage());

            }
        }

        @Override
        public void onSurfacePrepared(CameraCaptureSession session, Surface surface) {
            Log.d(TAG, "onSurfacePrepared: ");
            super.onSurfacePrepared(session, surface);
        }
    }

    private class CameraDeviceStateCallback extends CameraDevice.StateCallback {
        private final static String TAG = "CamDeviceStateCb";

        @Override
        public void onClosed(CameraDevice camera) {
            Log.d(TAG, "onClosed: ");
            super.onClosed(camera);
        }

        @Override
        public void onDisconnected(CameraDevice camera) {
            Log.d(TAG, "onDisconnected: ");
        }

        @Override
        public void onError(CameraDevice camera, int error) {
            Log.d(TAG, "onError: ");
        }

        @Override
        public void onOpened(CameraDevice camera) {
            Log.d(TAG, "onOpened: ");
            cameraDevice = camera;
            try {
                camera.createCaptureSession(Arrays.asList(mediaRecorder.getSurface()), cameraCaptureSessionStateCallback, handler);
            } catch (CameraAccessException e) {
                Log.d(TAG, "onOpened: " + e.getMessage());
            }
        }
    }

}


I followed Android source (test and application) code, as well as a couple of examples I found on github, to get this figured out as the camera2 API is not very well documented yet.

Is there something obvious that I am doing incorrectly? Or am I just missing codecs on my Mac for QuickTime, and on my PC for Windows Media Player and VLC? I haven't tried playing the files on Linux, so I don't know what happens there. Oh, and if I upload the mp4 files to photos.google.com, they are also fully playable there.

Thanks! Mark

Answer


My team encountered a similar problem when we were developing a plugin based on the Camera2 API, but it only affected a Samsung Galaxy S7 (we also have an S6 for testing that didn't exhibit this behaviour).


The issue appeared to be caused by a bug in Samsung's camera firmware and was triggered when the device came out of Deep Sleep (the ultra-low power mode in Android 6.0 Marshmallow). After resuming from Deep Sleep, the first frame of any video captured and encoded using the Camera2 MediaRecorder has an extraordinarily long frame duration - sometimes as long as or longer than the total duration of the video itself.


Consequently, when playing back, the first frame is displayed for this long duration while audio continues to play. Once the first frame has finished displaying, the rest of the frames play back as normal.
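Before re-muxing a file, you can confirm it has this defect by comparing the first sample's stts delta against the track timescale. A minimal sketch of that arithmetic follows; `FirstFrameCheck` is a hypothetical helper (not part of the original answer), and the factor of 100 is an arbitrary assumption for "absurdly long":

```java
// Hypothetical helper: in an mp4, each sample's duration in seconds is its
// stts delta divided by the track's timescale, so an absurdly long first
// frame shows up as an outsized first delta.
public class FirstFrameCheck {
    /** Duration of one sample in seconds. */
    static double sampleDurationSeconds(long delta, long timescale) {
        return (double) delta / timescale;
    }

    /**
     * Flags a first-frame delta as suspicious when it lasts longer than 100
     * normal frame intervals (the factor of 100 is an arbitrary assumption).
     */
    static boolean isAnomalous(long firstDelta, long timescale, double expectedFps) {
        double normalFrame = 1.0 / expectedFps;
        return sampleDurationSeconds(firstDelta, timescale) > 100 * normalFrame;
    }

    public static void main(String[] args) {
        // A 90 kHz timescale at 30 fps gives normal deltas of about 3000.
        System.out.println(isAnomalous(3000, 90000, 30.0));    // healthy file -> false
        System.out.println(isAnomalous(1350000, 90000, 30.0)); // 15 s first frame -> true
    }
}
```

The same check is what the workaround below performs with mp4parser's `TimeToSampleBox`, just expressed here without the library dependency.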

We found other people with a similar problem discussing the issue on GitHub:


The issue is a deep sleep problem on some devices running Marshmallow. It appears to be CPU related as an S7 on Verizon doesn't have the issue, but an S7 on AT&T does have the issue. I've seen this on an S6 Verizon phone when it updated to Marshmallow.


In order to replicate, reboot a device while connected to USB. Run the sample. All should be ok. Then, disconnect the device, let it go into deep sleep (screen off, no movement for 5? minutes), and try again. The issue will appear once the device has gone into deep sleep.


We ended up using cybaker's proposed workaround; that is, when the video file is created, inspect the duration of the first frame of the video. If it appears to be incorrect, re-encode the video with sensible frame durations:

// Uses the mp4parser (isoparser) library; imports added for completeness:
import com.coremedia.iso.IsoFile;
import com.coremedia.iso.boxes.Container;
import com.coremedia.iso.boxes.TimeToSampleBox;
import com.coremedia.iso.boxes.TrackBox;
import com.googlecode.mp4parser.DataSource;
import com.googlecode.mp4parser.FileDataSourceImpl;
import com.googlecode.mp4parser.authoring.Movie;
import com.googlecode.mp4parser.authoring.Mp4TrackImpl;
import com.googlecode.mp4parser.authoring.builder.DefaultMp4Builder;

import java.io.RandomAccessFile;
import java.nio.channels.FileChannel;
import java.util.List;

DataSource channel = new FileDataSourceImpl(rawFile);
IsoFile isoFile = new IsoFile(channel);

List<TrackBox> trackBoxes = isoFile.getMovieBox().getBoxes(TrackBox.class);
boolean sampleError = false;
for (TrackBox trackBox : trackBoxes) {
    TimeToSampleBox.Entry firstEntry = trackBox.getMediaBox().getMediaInformationBox().getSampleTableBox().getTimeToSampleBox().getEntries().get(0);

    // Detect if the first sample is a problem and fix it in isoFile.
    // This is a hack. The audio deltas are 1024 for my files and the video deltas about 3000;
    // 10000 is a sufficient threshold since at 30 fps the normal delta is about 3000.
    if (firstEntry.getDelta() > 10000) {
        sampleError = true;
        firstEntry.setDelta(3000);
    }
}

if (sampleError) {
    Movie movie = new Movie();
    for (TrackBox trackBox : trackBoxes) {
        movie.addTrack(new Mp4TrackImpl(channel.toString() + "[" + trackBox.getTrackHeaderBox().getTrackId() + "]", trackBox));
    }
    movie.setMatrix(isoFile.getMovieBox().getMovieHeaderBox().getMatrix());
    Container out = new DefaultMp4Builder().build(movie);

    // Delete the original file first!
    FileChannel fc = new RandomAccessFile(rawFile, "rw").getChannel();
    out.writeContainer(fc);
    fc.close();
    Log.d(TAG, "Finished correcting raw video");
}
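The replacement delta of 3000 above is hard-coded for the author's files. A less brittle variant, sketched below as a hypothetical refinement (not part of cybaker's workaround), replaces a bad first delta with the median of the remaining deltas in the same track, assuming those are sane:

```java
import java.util.Arrays;

// Hypothetical refinement: derive the replacement delta from the track itself
// instead of hard-coding 3000.
public class DeltaFix {
    static long[] fixFirstDelta(long[] deltas, long threshold) {
        if (deltas.length < 2 || deltas[0] <= threshold) {
            return deltas.clone(); // nothing to fix
        }
        long[] rest = Arrays.copyOfRange(deltas, 1, deltas.length);
        Arrays.sort(rest);
        long[] fixed = deltas.clone();
        fixed[0] = rest[rest.length / 2]; // median of the remaining deltas
        return fixed;
    }

    public static void main(String[] args) {
        long[] deltas = {1350000, 3000, 3000, 3010, 2990};
        // First delta replaced by the median of the rest (3000).
        System.out.println(Arrays.toString(fixFirstDelta(deltas, 10000)));
    }
}
```

This works for both the audio track (deltas around 1024) and the video track (deltas around 3000) without per-track constants.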


Hope this points you in the right direction!
