AVCaptureSession vs. UIImagePickerController speeds
Question
At the moment I am using a custom overlay on a UIImagePickerController, calling takePicture() to capture images. However, it takes a good few seconds for the delegate method didFinishPickingMediaWithInfo to be called. I've heard about using AVCaptureSession for more control over the camera. Would this allow me to get faster picture-taking speeds (similar to that of Snapchat)? Or are there any other ways that I can speed this up?
Thanks
EDIT
I'm implementing my image capture as follows.
First I initialise a UIImagePickerController, using a UIView subclass called CustomCameraOverlayView for the overlay view, and present it modally. The delegate of the CustomCameraOverlayView is set to self (the UIImagePickerControllerDelegate) so that I can call takePicture from the CustomCameraOverlayView.
imagePickerController = UIImagePickerController()
if UIImagePickerController.isSourceTypeAvailable(.Camera) {
    imagePickerController.sourceType = UIImagePickerControllerSourceType.Camera
    imagePickerController.cameraDevice = UIImagePickerControllerCameraDevice.Rear
    imagePickerController.allowsEditing = true // was `editing`, which is a UIViewController property
    imagePickerController.delegate = self
    imagePickerController.showsCameraControls = false

    let customCameraOverlayView = CustomCameraOverlayView()
    customCameraOverlayView.delegate = self
    imagePickerController.cameraOverlayView = customCameraOverlayView
    imagePickerController.cameraOverlayView!.frame = self.view.frame

    // Scale and centre the 4:3 camera preview so it fills the whole screen.
    let screenBounds: CGSize = UIScreen.mainScreen().bounds.size
    let imageHeight = (screenBounds.width / 3) * 4
    let scale = screenBounds.height / imageHeight
    imagePickerController.cameraViewTransform = CGAffineTransformConcat(
        CGAffineTransformMakeScale(scale, scale),
        CGAffineTransformMakeTranslation(0, (screenBounds.height - imageHeight) / 2))

    self.presentViewController(imagePickerController, animated: true, completion: nil)
}
Then inside the CustomCameraOverlayView I have an action set up from the "take picture" button in IB. It sends a message back to the delegate to take the picture:
@IBAction func takePicture(sender: UIButton) {
    // Forward the tap to the view controller that owns the picker.
    delegate!.imagePickerController.takePicture()
}
A few seconds later the delegate method is called. What could be slowing my implementation down?
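For context, a minimal version of the corresponding delegate callback might look like the sketch below (the body is an assumption, not the asker's code; `imageView` is a hypothetical outlet). Heavy synchronous work here is a common cause of the delay:

```swift
func imagePickerController(picker: UIImagePickerController,
                           didFinishPickingMediaWithInfo info: [String : AnyObject]) {
    // Grab the captured image from the info dictionary.
    if let image = info[UIImagePickerControllerOriginalImage] as? UIImage {
        imageView.image = image // hypothetical outlet, for illustration only
    }
    // Any synchronous work done here (e.g. encoding the JPEG and writing it
    // to disk) blocks the main thread and makes the capture feel slow;
    // defer it to a background queue instead.
    picker.dismissViewControllerAnimated(true, completion: nil)
}
```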
Answer
There is no performance difference between the two (at least none that can be noticed with the naked eye). It's just a trade-off between complexity and control.
AVFoundation is complex and a bit difficult to implement, but it comes with an enormous amount of control over every bit of data. For any kind of file processing involved, AVFoundation is the way to go.
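To give a feel for what that looks like, here is a bare-bones still-image pipeline with AVFoundation, written in the same Swift 2-era style as the question's code. This is a sketch only: error handling, `canAddInput`/`canAddOutput` checks, and image orientation are omitted:

```swift
import AVFoundation
import UIKit

class CameraController: UIViewController {
    let session = AVCaptureSession()
    let stillImageOutput = AVCaptureStillImageOutput()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.sessionPreset = AVCaptureSessionPresetPhoto

        // Wire the default (rear) camera into the session.
        let device = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
        if let input = try? AVCaptureDeviceInput(device: device) {
            session.addInput(input)
        }
        session.addOutput(stillImageOutput)

        // Show the live preview behind any custom overlay.
        let previewLayer = AVCaptureVideoPreviewLayer(session: session)
        previewLayer.frame = view.bounds
        view.layer.addSublayer(previewLayer)
        session.startRunning()
    }

    func capture() {
        // The callback fires almost immediately, which is where the
        // Snapchat-like responsiveness comes from.
        if let connection = stillImageOutput.connectionWithMediaType(AVMediaTypeVideo) {
            stillImageOutput.captureStillImageAsynchronouslyFromConnection(connection) { buffer, _ in
                let data = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer)
                let image = UIImage(data: data)
                // ... hand `image` to the UI on the main queue
            }
        }
    }
}
```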
On the other hand, UIImagePickerController is easy to implement and useful if you only want to do primitive tasks, such as recording video and capturing images.
But if either of them seems to respond slowly, it is most probably because of your implementation; you would need to post that code to track it down. So performance isn't a good criterion for choosing between the two.