Tap to record like in Vine using javacv


Question

I am trying to implement a tap-to-record feature like in Vine. A sample for handling recording (though not tap-to-record) provided with javacv is https://github.com/bytedeco/javacv/blob/master/samples/RecordActivity.java. I am trying to modify it so that in the onPreviewFrame method frames are added to the buffer only while the user has a finger placed on the screen. These frames are then combined into the final video in the stopRecording method.

The issue is that if I set the timestamp as shown in the code snippet below (in the stopRecording method):

if (t > recorder.getTimestamp()) 
{
    recorder.setTimestamp(t);
}

The behavior is as follows.

Case 1

If I tap the screen to record for 2 seconds, take my finger off the screen for 3 seconds, and then place my finger back on the screen to record for another 4 seconds, the resulting video behaves like this:

For the first 2 seconds the video shows the recorded content. For the next 3 seconds (the time the finger was off the screen), the video just shows the last frame recorded before the finger was lifted. Then the video shows the recorded content for the final 4 seconds. So there seems to be a problem with how recording is handled while the finger is off the screen.
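This matches what raw wall-clock timestamps would produce: the 3-second gap stays in the stream, and players render such a gap by holding the last frame before it. A toy illustration (my own, at one frame per second for brevity):

// Toy wall-clock timestamps in microseconds: 2 s recorded, 3 s paused,
// 4 s recorded, captured at one frame per second.
long[] ts = { 0, 1000000,                            // finger down (2 s)
              5000000, 6000000, 7000000, 8000000 };  // finger down again (4 s)
// Feeding these to recorder.setTimestamp() leaves the 2 s..5 s range empty,
// so playback freezes on the frame stamped 1000000 for those 3 seconds.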

Case 2

Next, I removed the code that sets the timestamp on the recorder (the snippet given above) from the stopRecording method.

Now the resulting video (for the same steps as in case 1) does not contain the middle 3 seconds when the finger was off the screen, which is what I want. But the video plays back at a faster rate. So it seems the timestamp does need to be set for the video to play at the normal rate.
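In other words, what seems to be needed is a timestamp that advances only while the finger is down. A minimal sketch of that idea (my own illustration; RecordingClock and its names are invented and do not appear in the code below):

import android.os.SystemClock;

// A clock that accumulates time only while recording is active, so paused
// wall-clock time never reaches the recorder.
class RecordingClock {
    private long lastTickMs = -1;  // wall time of the previous active frame
    private long recordedUs = 0;   // accumulated recording time, microseconds

    // Call once per preview frame; active is true while the finger is down.
    long onFrame(boolean active) {
        long nowMs = SystemClock.elapsedRealtime();
        if (active && lastTickMs >= 0) {
            recordedUs += 1000 * (nowMs - lastTickMs);
        }
        lastTickMs = active ? nowMs : -1;
        return recordedUs;         // use this as the frame's timestamp
    }
}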

The full code of my activity is given below. (Note that video recording is mainly handled in the onPreviewFrame and stopRecording methods.)

public class TouchToRecordActivity extends Activity implements OnClickListener, View.OnTouchListener {

private final static String CLASS_LABEL = "TouchToRecordActivity";
private final static String LOG_TAG = CLASS_LABEL;

private String ffmpeg_link = "/mnt/sdcard/stream.mp4";

long startTime = 0;
boolean recording = false;
boolean rec = false;

private FFmpegFrameRecorder recorder;

private boolean isPreviewOn = false;

private int sampleAudioRateInHz = 44100;
private int imageWidth = 640;
private int imageHeight = 480;
private int destWidth = 480;
private int frameRate = 30;

/* audio data getting thread */
private AudioRecord audioRecord;
private AudioRecordRunnable audioRecordRunnable;
private Thread audioThread;
volatile boolean runAudioThread = true;

/* video data getting thread */
private Camera cameraDevice;
private CameraView cameraView;

private Frame yuvImage = null;

/* layout setting */
private final int bg_screen_bx = 232;
private final int bg_screen_by = 128;
private final int bg_screen_width = 700;
private final int bg_screen_height = 500;
private final int bg_width = 1123;
private final int bg_height = 715;
private final int live_width = 640;
private final int live_height = 480;
private int screenWidth, screenHeight;
private Button btnRecorderControl;

/* The number of seconds in the continuous record loop (or 0 to disable loop). */
final int RECORD_LENGTH = 20;
Frame[] images;
long[] timestamps;
ShortBuffer[] samples;
int imagesIndex, samplesIndex;

long firstTime = 0;
long startPauseTime = 0;
long totalPauseTime = 0;
long pausedTime = 0;
long stopPauseTime = 0;
long totalTime = 0;

long totalRecordedTS = 0;

private TextView txtTimer;
private Handler mHandler = new Handler();

@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);

    setContentView(R.layout.touch_main);

    initLayout();
}

@Override
protected void onDestroy() {
    super.onDestroy();

    recording = false;

    if (cameraView != null) {
        cameraView.stopPreview();
    }

    if (cameraDevice != null) {
        cameraDevice.stopPreview();
        cameraDevice.release();
        cameraDevice = null;
    }
}


private void initLayout() {

    /* get size of screen */
    Display display = ((WindowManager) getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay();
    screenWidth = display.getWidth();
    screenHeight = display.getHeight();
    RelativeLayout.LayoutParams layoutParam = null;
    LayoutInflater myInflate = null;
    myInflate = (LayoutInflater) getSystemService(Context.LAYOUT_INFLATER_SERVICE);
    RelativeLayout topLayout = new RelativeLayout(this);
    setContentView(topLayout);
    LinearLayout preViewLayout = (LinearLayout) myInflate.inflate(R.layout.touch_main, null);
    layoutParam = new RelativeLayout.LayoutParams(screenWidth, screenHeight);
    topLayout.addView(preViewLayout, layoutParam);

    txtTimer = (TextView) preViewLayout.findViewById(R.id.txtTimer);

    /* add control button: start and stop */
    btnRecorderControl = (Button) findViewById(R.id.recorder_control);
    btnRecorderControl.setText("Start");
    btnRecorderControl.setOnClickListener(this);

    /* add camera view */
    int display_width_d = (int) (1.0 * bg_screen_width * screenWidth / bg_width);
    int display_height_d = (int) (1.0 * bg_screen_height * screenHeight / bg_height);
    int prev_rw, prev_rh;
    if (1.0 * display_width_d / display_height_d > 1.0 * live_width / live_height) {
        prev_rh = display_height_d;
        prev_rw = (int) (1.0 * display_height_d * live_width / live_height);
    } else {
        prev_rw = display_width_d;
        prev_rh = (int) (1.0 * display_width_d * live_height / live_width);
    }
    layoutParam = new RelativeLayout.LayoutParams(prev_rw, prev_rh);
    layoutParam.topMargin = (int) (1.0 * bg_screen_by * screenHeight / bg_height);
    layoutParam.leftMargin = (int) (1.0 * bg_screen_bx * screenWidth / bg_width);

    cameraDevice = Camera.open();
    Log.i(LOG_TAG, "camera open");
    cameraView = new CameraView(this, cameraDevice);
    topLayout.addView(cameraView, layoutParam);
    topLayout.setOnTouchListener(this);
    Log.i(LOG_TAG, "camera preview start: OK");
}

//---------------------------------------
// initialize ffmpeg_recorder
//---------------------------------------
private void initRecorder() {

    Log.w(LOG_TAG, "init recorder");

    if (RECORD_LENGTH > 0) {
        imagesIndex = 0;
        images = new Frame[RECORD_LENGTH * frameRate];
        timestamps = new long[images.length];
        for (int i = 0; i < images.length; i++) {
            images[i] = new Frame(destWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
            timestamps[i] = -1;
        }
    } else if (yuvImage == null) {
        yuvImage = new Frame(destWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
        Log.i(LOG_TAG, "create yuvImage");
    }
    Log.i(LOG_TAG, "ffmpeg_url: " + ffmpeg_link);
    recorder = new FFmpegFrameRecorder(ffmpeg_link, destWidth, imageHeight, 1);
    recorder.setFormat("mp4");
    recorder.setVideoCodecName("libx264");
    recorder.setSampleRate(sampleAudioRateInHz);
    // Set in the surface changed method
    recorder.setFrameRate(frameRate);

    Log.i(LOG_TAG, "recorder initialize success");

    audioRecordRunnable = new AudioRecordRunnable();
    audioThread = new Thread(audioRecordRunnable);
    runAudioThread = true;
}

public void startRecording() {

    initRecorder();

    mHandler.removeCallbacks(mUpdateTimeTask);
    mHandler.postDelayed(mUpdateTimeTask, 100);

    try {
        recorder.start();
        startTime = System.currentTimeMillis();
        recording = true;
        audioThread.start();

    } catch (FFmpegFrameRecorder.Exception e) {
        e.printStackTrace();
    }
}

public void stopRecording() {

    runAudioThread = false;
    try {
        audioThread.join();
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
    audioRecordRunnable = null;
    audioThread = null;

    if (recorder != null && recording) {
        if (RECORD_LENGTH > 0) {
            Log.v(LOG_TAG, "Writing frames");
            try {
                int firstIndex = imagesIndex % samples.length;
                int lastIndex = (imagesIndex - 1) % images.length;
                if (imagesIndex <= images.length) {
                    firstIndex = 0;
                    lastIndex = imagesIndex - 1;
                }
                if ((startTime = timestamps[lastIndex] - RECORD_LENGTH * 1000000L) < 0) {
                    startTime = 0;
                }
                if (lastIndex < firstIndex) {
                    lastIndex += images.length;
                }
                int videoCounter = 0;
                for (int i = firstIndex; i <= lastIndex; i++) {
                    if (timestamps[i] == -1) {
                        Log.v(LOG_TAG, "frame not recorded");
                    }
                    if (timestamps[i] != -1) {
                        long t = timestamps[i % timestamps.length] - startTime;
                        if (t >= 0) {

                            videoCounter++;

                            /*if (((i % images.length) != 0) && images[i % images.length] != images[(i % images.length) - 1]) {
                                if (t > recorder.getTimestamp()) {
                                    recorder.setTimestamp(t);
                                }*/
                                Log.v(LOG_TAG, "imageIndex=" + (i % images.length));
                                recorder.record(images[i % images.length]);
                        /*    }*/
                            Log.v(LOG_TAG, "videoCounter=" + videoCounter);
                        }
                    }
                }

                firstIndex = samplesIndex % samples.length;
                lastIndex = (samplesIndex - 1) % samples.length;
                if (samplesIndex <= samples.length) {
                    firstIndex = 0;
                    lastIndex = samplesIndex - 1;
                }
                if (lastIndex < firstIndex) {
                    lastIndex += samples.length;
                }
                for (int i = firstIndex; i <= lastIndex; i++) {
                    if (timestamps[i] != -1) {
                        recorder.recordSamples(samples[i % samples.length]);
                    }
                }
            } catch (FFmpegFrameRecorder.Exception e) {
                Log.v(LOG_TAG, e.getMessage());
                e.printStackTrace();
            }
        }

        recording = false;
        Log.v(LOG_TAG, "Finishing recording, calling stop and release on recorder");
        try {
            recorder.stop();
            recorder.release();
        } catch (FFmpegFrameRecorder.Exception e) {
            e.printStackTrace();
        }
        recorder = null;

    }
}

@Override
public boolean onKeyDown(int keyCode, KeyEvent event) {

    if (keyCode == KeyEvent.KEYCODE_BACK) {
        if (recording) {
            stopRecording();
        }

        finish();

        return true;
    }

    return super.onKeyDown(keyCode, event);
}

@Override
public boolean onTouch(View view, MotionEvent motionEvent) {
    switch (motionEvent.getAction()) {
        case MotionEvent.ACTION_DOWN:
            Log.v(LOG_TAG, "ACTION_DOWN" + recording);

            if (!recording) {
                startRecording();
            } else {
                stopPauseTime = System.currentTimeMillis();
                totalPauseTime = stopPauseTime - startPauseTime - ((long) (1.0 / (double) frameRate) * 1000);
                pausedTime += totalPauseTime;
            }
            rec = true;
            setTotalVideoTime();
            btnRecorderControl.setText(getResources().getString(R.string.stop));
            break;
        case MotionEvent.ACTION_MOVE:
            rec = true;
            setTotalVideoTime();
            break;
        case MotionEvent.ACTION_UP:
            Log.v(LOG_TAG, "ACTION_UP");
            rec = false;
            startPauseTime = System.currentTimeMillis();
            break;
    }
    return true;
}

private Runnable mUpdateTimeTask = new Runnable() {
    public void run() {
        if (recording) {
            setTotalVideoTime();
        }
        mHandler.postDelayed(this, 500);
    }
};

private synchronized void setTotalVideoTime() {
    totalTime = System.currentTimeMillis() - firstTime - pausedTime - ((long) (1.0 / (double) frameRate) * 1000);
    if (totalTime > 0)
        txtTimer.setText(getRecordingTimeFromMillis(totalTime));
}

private String getRecordingTimeFromMillis(long millis) {
    String strRecordingTime = null;
    int seconds = (int) (millis / 1000);
    int minutes = seconds / 60;
    int hours = minutes / 60;

    if (hours >= 0 && hours < 10)
        strRecordingTime = "0" + hours + ":";
    else
        strRecordingTime = hours + ":";

    if (hours > 0)
        minutes = minutes % 60;

    if (minutes >= 0 && minutes < 10)
        strRecordingTime += "0" + minutes + ":";
    else
        strRecordingTime += minutes + ":";

    seconds = seconds % 60;

    if (seconds >= 0 && seconds < 10)
        strRecordingTime += "0" + seconds;
    else
        strRecordingTime += seconds;

    return strRecordingTime;

}


//---------------------------------------------
// audio thread, gets and encodes audio data
//---------------------------------------------
class AudioRecordRunnable implements Runnable {

    @Override
    public void run() {
        android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

        // Audio
        int bufferSize;
        ShortBuffer audioData;
        int bufferReadResult;

        bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);

        if (RECORD_LENGTH > 0) {
            samplesIndex = 0;
            samples = new ShortBuffer[RECORD_LENGTH * sampleAudioRateInHz * 2 / bufferSize + 1];
            for (int i = 0; i < samples.length; i++) {
                samples[i] = ShortBuffer.allocate(bufferSize);
            }
        } else {
            audioData = ShortBuffer.allocate(bufferSize);
        }

        Log.d(LOG_TAG, "audioRecord.startRecording()");
        audioRecord.startRecording();

        /* ffmpeg_audio encoding loop */
        while (runAudioThread) {
            if (RECORD_LENGTH > 0) {
                audioData = samples[samplesIndex++ % samples.length];
                audioData.position(0).limit(0);
            }
            //Log.v(LOG_TAG,"recording? " + recording);
            bufferReadResult = audioRecord.read(audioData.array(), 0, audioData.capacity());
            audioData.limit(bufferReadResult);
            if (bufferReadResult > 0) {
                Log.v(LOG_TAG, "bufferReadResult: " + bufferReadResult);
                // If "recording" isn't true when this thread starts, it never gets set according to this if statement...!!!
                // Why?  Good question...
                if (recording && rec) {
                    Log.v(LOG_TAG, "Recording audio");
                    if (RECORD_LENGTH <= 0) try {
                        recorder.recordSamples(audioData);
                        //Log.v(LOG_TAG,"recording " + 1024*i + " to " + 1024*i+1024);
                    } catch (FFmpegFrameRecorder.Exception e) {
                        Log.v(LOG_TAG, e.getMessage());
                        e.printStackTrace();
                    }
                }
            }
        }
        Log.v(LOG_TAG, "AudioThread Finished, release audioRecord");

        /* encoding finish, release recorder */
        if (audioRecord != null) {
            audioRecord.stop();
            audioRecord.release();
            audioRecord = null;
            Log.v(LOG_TAG, "audioRecord released");
        }
    }
}

//---------------------------------------------
// camera thread, gets and encodes video data
//---------------------------------------------
class CameraView extends SurfaceView implements SurfaceHolder.Callback, PreviewCallback {

    private SurfaceHolder mHolder;
    private Camera mCamera;

    public CameraView(Context context, Camera camera) {
        super(context);
        Log.w("camera", "camera view");
        mCamera = camera;
        mHolder = getHolder();
        mHolder.addCallback(CameraView.this);
        mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
        mCamera.setPreviewCallback(CameraView.this);
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        try {
            stopPreview();
            mCamera.setPreviewDisplay(holder);
        } catch (IOException exception) {
            mCamera.release();
            mCamera = null;
        }
    }

    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        stopPreview();

        Camera.Parameters camParams = mCamera.getParameters();
        List<Camera.Size> sizes = camParams.getSupportedPreviewSizes();
        // Sort the list in ascending order
        Collections.sort(sizes, new Comparator<Camera.Size>() {

            public int compare(final Camera.Size a, final Camera.Size b) {
                return a.width * a.height - b.width * b.height;
            }
        });

        camParams.setPreviewSize(imageWidth, imageHeight);

        Log.v(LOG_TAG, "Setting imageWidth: " + imageWidth + " imageHeight: " + imageHeight + " frameRate: " + frameRate);

        camParams.setPreviewFrameRate(frameRate);
        Log.v(LOG_TAG, "Preview Framerate: " + camParams.getPreviewFrameRate());

        mCamera.setParameters(camParams);

        List<Camera.Size> videoSizes = mCamera.getParameters().getSupportedVideoSizes();

        // Set the holder (which might have changed) again
        try {
            mCamera.setPreviewDisplay(holder);
            mCamera.setPreviewCallback(CameraView.this);
            startPreview();
        } catch (Exception e) {
            Log.e(LOG_TAG, "Could not set preview display in surfaceChanged");
        }
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        try {
            mHolder.addCallback(null);
            mCamera.setPreviewCallback(null);
        } catch (RuntimeException e) {
            // The camera has probably just been released, ignore.
        }
    }

    public void startPreview() {
        if (!isPreviewOn && mCamera != null) {
            isPreviewOn = true;
            mCamera.startPreview();
        }
    }

    public void stopPreview() {
        if (isPreviewOn && mCamera != null) {
            isPreviewOn = false;
            mCamera.stopPreview();
        }
    }

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        if (audioRecord == null || audioRecord.getRecordingState() != AudioRecord.RECORDSTATE_RECORDING) {
            startTime = System.currentTimeMillis();
            return;
        }
        if (RECORD_LENGTH > 0) {
            int i = imagesIndex++ % images.length;
            Log.v(LOG_TAG, "recording:" + recording + "rec:" + rec);
            if (recording && rec) {
                yuvImage = images[i];
                timestamps[i] = 1000 * (System.currentTimeMillis() - startTime);
                totalRecordedTS++;
            } else {
                Log.v(LOG_TAG, "recording is paused");
                yuvImage = null;
                timestamps[i] = -1;
            }
        }

        /* get video data */
        if (yuvImage != null && recording && rec) {
            if (data.length != imageWidth * imageHeight) {
                Camera.Size sz = camera.getParameters().getPreviewSize();
                imageWidth = sz.width;
                imageHeight = sz.height;
                destWidth = imageHeight;
                Log.v(LOG_TAG, "data length:" + data.length);
            }

            ByteBuffer bb = (ByteBuffer) yuvImage.image[0].position(0); // resets the buffer
            int start = 2 * ((imageWidth - destWidth) / 4); // this must be even
            for (int row = 0; row < imageHeight * 3 / 2; row++) {
                bb.put(data, start, destWidth);
                start += imageWidth;
            }

        }
    }
}

@Override
public void onClick(View v) {
    if (!recording) {
        startRecording();
        Log.w(LOG_TAG, "Start Button Pushed");
        btnRecorderControl.setText("Stop");
    } else {
        // This will trigger the audio recording loop to stop and then set isRecorderStart = false;
        stopRecording();
        Log.w(LOG_TAG, "Stop Button Pushed");
        btnRecorderControl.setText("Start");
    }
}}

Changes made as per Alex Cohn's suggestions

Suggestion 1 - Estimate the average frame rate

    public void stopRecording() {

   ..............................

                            if (((i % images.length) != 0) && images[i % images.length] != images[(i % images.length) - 1]) {
                                if (t > recorder.getTimestamp()) {
                                    t += 1000000 / frameRate;
                                    recorder.setTimestamp(t);
                                }

                                recorder.record(images[i % images.length]);
                            }
             ..........................................


}

The change made was adding t += 1000000 / frameRate;. However, this caused the video to freeze (as in case 1 described above) during the portions when the finger was off the screen, presumably because t is still derived from the stored wall-clock timestamps, which retain the pause gap.

Suggestion 2 - Modify onPreviewFrame()

long[] timestampsForRecorder;
private void initRecorder() {

    Log.w(LOG_TAG, "init recorder");

    if (RECORD_LENGTH > 0) {
       .......................................................
        timestampsForRecorder = new long[images.length];
        for (int i = 0; i < images.length; i++) {
            images[i] = new Frame(destWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
            timestamps[i] = -1;
            timestampsForRecorder[i] = -1;
        }
    } else if (yuvImage == null) {
        yuvImage = new Frame(destWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
        Log.i(LOG_TAG, "create yuvImage");
    }
    ...................................................
}

@Override
public void onPreviewFrame(byte[] data, Camera camera) {
        if (audioRecord == null || audioRecord.getRecordingState() != AudioRecord.RECORDSTATE_RECORDING) {
            startTime = SystemClock.elapsedRealtime();
            return;
        }
        if (RECORD_LENGTH > 0) {
            int i = imagesIndex++ % images.length;
            Log.v(LOG_TAG, "recording:" + recording + "rec:" + rec);
            if (recording && rec) {
                yuvImage = images[i];
                long thisFrameTime = SystemClock.elapsedRealtime();
                timestamps[i] = thisFrameTime;
                long lastFrameTime = timestamps[(int) (imagesIndex == 0 ? startTime : ((imagesIndex-1) % images.length))];
                Log.v(LOG_TAG, "lastFrameTime:" + lastFrameTime+",stopPauseTime:" + stopPauseTime);
                if (lastFrameTime > stopPauseTime) {
                    timestampsForRecorder[i] = 1000 * (thisFrameTime - Math.max(stopPauseTime, lastFrameTime));
                }
            }
        }

       .....................................................
    }

public void stopRecording() {

    .......................................................

    if (recorder != null && recording) {
        if (RECORD_LENGTH > 0) {
            Log.v(LOG_TAG, "Writing frames");
            try {
                int firstIndex = imagesIndex % samples.length;
                int lastIndex = (imagesIndex - 1) % images.length;
                if (imagesIndex <= images.length) {
                    firstIndex = 0;
                    lastIndex = imagesIndex - 1;
                }
                if ((startTime = timestampsForRecorder[lastIndex] - RECORD_LENGTH * 1000000L) < 0) {
                    startTime = 0;
                }
                if (lastIndex < firstIndex) {
                    lastIndex += images.length;
                }
                for (int i = firstIndex; i <= lastIndex; i++) {

                    if (timestampsForRecorder[i] != -1) {
                        long t = timestampsForRecorder[i % timestampsForRecorder.length] - startTime;
                        if (t >= 0) {

                            if (((i % images.length) != 0) && images[i % images.length] != images[(i % images.length) - 1]) {
                                if (t > recorder.getTimestamp()) {
                                    recorder.setTimestamp(t);
                                }
                                Log.v(LOG_TAG, "imageIndex=" + (i % images.length));
                                recorder.record(images[i % images.length]);
                            }
                        }
                    }
                }
                .............................................
            } catch (FFmpegFrameRecorder.Exception e) {
               .................................
            }
        }

        ...........................................

    }
}

The video recorded using this approach had the issue described in case 2 above, i.e. it played back at a faster rate.
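One detail that may explain this (my own reading, not part of the answer below): setTimestamp() expects a cumulative stream time, but timestampsForRecorder[i] as computed above holds the delta since the previous recorded frame, so the t > recorder.getTimestamp() check rarely passes and the frames fall back to the default frame-rate spacing. Accumulating the deltas in the write loop of stopRecording() (inside its existing try block, reusing the activity's fields) would restore the captured pacing:

long cumulativeUs = 0;
for (int i = firstIndex; i <= lastIndex; i++) {
    long deltaUs = timestampsForRecorder[i % timestampsForRecorder.length];
    if (deltaUs == -1) {
        continue;                          // slot was never written (paused)
    }
    cumulativeUs += deltaUs;               // convert deltas to stream time
    if (cumulativeUs > recorder.getTimestamp()) {
        recorder.setTimestamp(cumulativeUs);
    }
    recorder.record(images[i % images.length]);
}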

Answer

The easy (but imprecise) solution would be to estimate the average frame rate, and use t += 1000000/average_fps; recorder.setTimestamp(t); instead of looking at the actual timestamps.
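Applied to the write loop in stopRecording() (inside its existing try block, reusing the activity's fields), the idea might look like the following sketch; it assumes frameRate is reasonably close to the real capture rate:

long t = 0;
for (int i = firstIndex; i <= lastIndex; i++) {
    if (timestamps[i % timestamps.length] == -1) {
        continue;                      // slot was paused; skip it entirely
    }
    t += 1000000 / frameRate;          // fixed spacing, no wall-clock gaps
    if (t > recorder.getTimestamp()) {
        recorder.setTimestamp(t);
    }
    recorder.record(images[i % images.length]);
}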

To be more accurate, you can change onPreviewFrame() as follows:

long thisFrameTime = SystemClock.elapsedRealtime();
timestamps[i] = thisFrameTime;
long lastFrameTime = timestamps[(int) (imagesIndex < 2 ? startTime : (imagesIndex - 2) % images.length)];
if (lastFrameTime > stopPauseTime) {
    timestampsForRecorder[i] = 1000 * (thisFrameTime - Math.max(stopPauseTime, lastFrameTime));
}

You can feed the second array, timestampsForRecorder, directly to the recorder.

Note that it's safer to use SystemClock.elapsedRealtime() everywhere:

This clock is guaranteed to be monotonic, and continues to tick even when the CPU is in power saving modes, so it is the recommended basis for general purpose interval timing.
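Concretely, that means using android.os.SystemClock wherever the activity currently measures intervals with System.currentTimeMillis(), e.g. in startRecording(), onTouch() and onPreviewFrame():

// e.g. in startRecording():
startTime = SystemClock.elapsedRealtime();   // was System.currentTimeMillis()
// ...and wherever an interval is computed:
long elapsedUs = 1000 * (SystemClock.elapsedRealtime() - startTime);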
