iOS: Synchronizing frames from camera and motion data

Problem description

I'm trying to capture frames from the camera together with the associated motion data. For synchronization I use timestamps. Both the video and the motion data are written to a file and processed afterwards; during that processing I can calculate a motion-to-frame offset for every video.

It turns out that motion data and video data with the same timestamp are offset from each other by a varying amount, anywhere from 0.2 s up to 0.3 s. The offset is constant within a single video but varies from video to video. If it were the same offset every time I could simply subtract a calibrated value, but it is not.

Is there a good way to synchronize the timestamps? Maybe I'm not recording them correctly? Is there a better way to bring them into the same frame of reference?

CoreMotion returns timestamps relative to system uptime, so I add an offset to get Unix time:

// Difference between wall-clock time and system uptime; CoreMotion
// timestamps are measured on the uptime clock.
uptimeOffset = [[NSDate date] timeIntervalSince1970] -
                   [NSProcessInfo processInfo].systemUptime;

CMDeviceMotionHandler blk =
    ^(CMDeviceMotion * _Nullable motion, NSError * _Nullable error){
        if (!error) {
            // Shift the uptime-based timestamp onto the Unix epoch.
            motionTimestamp = motion.timestamp + uptimeOffset;
            ...
        }
    };

[motionManager startDeviceMotionUpdatesUsingReferenceFrame:CMAttitudeReferenceFrameXTrueNorthZVertical
                                                   toQueue:[NSOperationQueue currentQueue]
                                               withHandler:blk];

To get frame timestamps with high precision I'm using AVCaptureVideoDataOutputSampleBufferDelegate. These are also shifted to Unix time:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CMTime frameTime = CMSampleBufferGetOutputPresentationTimeStamp(sampleBuffer);

    if (firstFrame)
    {
        // Remember the first frame's presentation time and the wall-clock
        // time at which recording started.
        firstFrameTime = CMTimeMake(frameTime.value, frameTime.timescale);
        startOfRecording = [[NSDate date] timeIntervalSince1970];
        firstFrame = NO; // latch the reference only once
    }

    // Time elapsed since the first frame, shifted onto the Unix epoch.
    CMTime presentationTime = CMTimeSubtract(frameTime, firstFrameTime);
    Float64 seconds = CMTimeGetSeconds(presentationTime);

    frameTimestamp = seconds + startOfRecording;
    ...
}

Answer

The best solution I was able to find for this problem was to run a feature tracker over the recorded video, pick one of the strong features, plot the speed of its movement along, say, the X axis, and then correlate that plot with the accelerometer Y data.
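
Assuming you already have per-frame positions of one tracked feature (from any tracker you run separately, e.g. OpenCV's pyramidal Lucas-Kanade; the names below are illustrative, not from the original post), a minimal sketch of turning those positions into a speed-along-X signal:

// Hypothetical helper: positions[i] is the tracked feature's x coordinate
// in frame i, timestamps[i] is the frameTimestamp recorded for frame i.
// speeds must have room for count - 1 entries.
static void featureSpeedAlongX(const double *positions,
                               const double *timestamps,
                               int count,
                               double *speeds)
{
    for (int i = 1; i < count; i++) {
        // Finite-difference estimate of horizontal speed (pixels/second).
        double dt = timestamps[i] - timestamps[i - 1];
        speeds[i - 1] = (dt > 0) ? (positions[i] - positions[i - 1]) / dt : 0;
    }
}

This signal and the recorded accelerometer Y samples, resampled to a common rate, are what get correlated below.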

When there are two similar plots that are offset from each other along the abscissa, a technique called cross-correlation can find that offset.
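
A minimal sketch of that idea, assuming both signals have been resampled to a common rate and trimmed to the same length n (function and variable names are illustrative): brute-force normalized cross-correlation over a bounded lag window.

#include <math.h>
#include <float.h>

// Returns the lag (in samples) of y relative to x that maximizes the
// mean-removed, normalized cross-correlation, searching [-maxLag, maxLag].
static int bestLagSamples(const double *x, const double *y, int n, int maxLag)
{
    // Means, so the correlation is computed on mean-removed signals.
    double mx = 0, my = 0;
    for (int i = 0; i < n; i++) { mx += x[i]; my += y[i]; }
    mx /= n;
    my /= n;

    double best = -DBL_MAX;
    int bestLag = 0;
    for (int lag = -maxLag; lag <= maxLag; lag++) {
        double num = 0, sx = 0, sy = 0;
        for (int i = 0; i < n; i++) {
            int j = i + lag;
            if (j < 0 || j >= n) continue; // skip non-overlapping samples
            double a = x[i] - mx, b = y[j] - my;
            num += a * b;
            sx += a * a;
            sy += b * b;
        }
        double denom = sqrt(sx * sy);
        double r = (denom > 0) ? num / denom : 0;
        if (r > best) { best = r; bestLag = lag; }
    }
    return bestLag;
}

Dividing the returned lag by the common sample rate gives the per-video offset in seconds, e.g. double offsetSeconds = bestLagSamples(featureSpeed, accelY, n, maxLag) / sampleRate; (with sampleRate a double), which can then be subtracted when pairing motion samples with frames.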

This approach has an obvious drawback: it is slow, because it requires some video processing.
