Swift Get Video Frame using AVFoundation


Problem description

This is my code: I want to read my video and get its frame data onto a SceneKit SCNSphere.

    //NSString videoPath = NSBundle.mainBundle().l
    var videoURL = NSBundle.mainBundle().URLForResource("video", withExtension: "mp4")

    var assetOptions = [AVURLAssetPreferPreciseDurationAndTimingKey: 1]
    videoAsset = AVURLAsset(URL: videoURL, options: assetOptions)
    var error: NSError?

    var videoAssetReader = AVAssetReader(asset: videoAsset, error: &error)

    //var duration = CMTimeGetSeconds(videoAsset.duration)
    //println(videoAsset.duration)
    //println(duration)

    if error != nil
    {
        println(error)
    }

    var tracksArray = videoAsset?.tracksWithMediaType(AVMediaTypeVideo)
    var audio = videoAsset?.tracksWithMediaType(AVMediaTypeAudio)

    println(tracksArray)
    var videotrack = tracksArray?[0] as AVAssetTrack

    //println(videotrack?.size)
    fps = videotrack.nominalFrameRate
    var videoSetting = [kCVPixelBufferPixelFormatTypeKey: kCVPixelFormatType_32RGBA]

    var videoTrackOutput = AVAssetReaderTrackOutput(track: videotrack as AVAssetTrack, outputSettings: videoSetting)

    // videoTrackOutput.outputSettings=

    if videoAssetReader.canAddOutput(videoTrackOutput)
    {
        println(videoTrackOutput)
        videoAssetReader.addOutput(videoTrackOutput)
        videoAssetReader.startReading()
    }

    var image = GeneralPreview()
    var uiImage = UIImage(CGImage: image)

    if videoAssetReader.status == AVAssetReaderStatus.Reading {

        // copyNextSampleBuffer() returns nil here
        var sampleBuffer = videoTrackOutput.copyNextSampleBuffer()
        println(sampleBuffer)

        //var imageData: NSData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer?)
        //var image: UIImage = UIImage(data: imageData)
    }

I just ported this from Objective-C. In my code I cannot get the last object, sampleBuffer; it comes back nil and I don't know why. I just want to know whether my approach is correct. Hoping for help!

Recommended answer

Try changing kCVPixelFormatType_32RGBA to kCVPixelFormatType_32BGRA. That's what did the trick for me.
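
For reference, here is a minimal sketch of the same pipeline in current Swift syntax with the BGRA output settings applied. The resource name "video.mp4" is carried over from the question; the readFirstFrame helper and the Core Image conversion to UIImage are illustrative assumptions, not part of the original answer.

    import AVFoundation
    import CoreImage
    import UIKit

    // Sketch: read the first decoded frame of a bundled video as a UIImage.
    func readFirstFrame() -> UIImage? {
        guard let url = Bundle.main.url(forResource: "video", withExtension: "mp4") else { return nil }
        let asset = AVURLAsset(url: url)
        guard let track = asset.tracks(withMediaType: .video).first,
              let reader = try? AVAssetReader(asset: asset) else { return nil }

        // kCVPixelFormatType_32BGRA is a format the track output can actually
        // decompress to; 32RGBA is not, which is why copyNextSampleBuffer()
        // returned nil in the question.
        let settings: [String: Any] = [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
        ]
        let output = AVAssetReaderTrackOutput(track: track, outputSettings: settings)
        guard reader.canAdd(output) else { return nil }
        reader.add(output)
        reader.startReading()

        // Pull the first sample buffer and wrap its pixel buffer via Core Image.
        guard let sampleBuffer = output.copyNextSampleBuffer(),
              let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        let context = CIContext()
        guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return nil }
        return UIImage(cgImage: cgImage)
    }

Calling copyNextSampleBuffer() in a loop while reader.status == .reading would yield the remaining frames, which can then be assigned to an SCNSphere's material contents.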

