drawRect with CGBitmapContext is too slow


Question

I've got a basic drawing app in progress that lets me draw lines. I draw into an off-screen bitmap and then present the image in drawRect. It works, but it's far too slow, updating about half a second after you've drawn with your finger. I took the code from this tutorial and adapted it: http://www.youtube.com/watch?v=UfWeMIL-Nu8&feature=relmfu . As you can see in the comments, other people are also saying it's too slow, but the author hasn't responded.

So how can I speed it up? Or is there a better way to do it? Any pointers would be appreciated.

Here's the code from my DrawView.m.

-(id)initWithCoder:(NSCoder *)aDecoder {
    if ((self = [super initWithCoder:aDecoder])) {
        [self setUpBuffer];
    }
    return self;
}

-(void)setUpBuffer {
    CGContextRelease(offscreenBuffer);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    offscreenBuffer = CGBitmapContextCreate(NULL, self.bounds.size.width, self.bounds.size.height, 8, self.bounds.size.width * 4, colorSpace, kCGImageAlphaPremultipliedLast);
    CGColorSpaceRelease(colorSpace);

    CGContextTranslateCTM(offscreenBuffer, 0, self.bounds.size.height);
    CGContextScaleCTM(offscreenBuffer, 1.0, -1.0);
}

-(void)drawToBuffer:(CGPoint)coordA :(CGPoint)coordB :(UIColor *)penColor :(int)thickness {
    CGContextBeginPath(offscreenBuffer);
    CGContextMoveToPoint(offscreenBuffer, coordA.x, coordA.y);
    CGContextAddLineToPoint(offscreenBuffer, coordB.x, coordB.y);
    CGContextSetLineWidth(offscreenBuffer, thickness);
    CGContextSetLineCap(offscreenBuffer, kCGLineCapRound);
    CGContextSetStrokeColorWithColor(offscreenBuffer, [penColor CGColor]);
    CGContextStrokePath(offscreenBuffer);
}

- (void)drawRect:(CGRect)rect {
    CGImageRef cgImage = CGBitmapContextCreateImage(offscreenBuffer);
    UIImage *image = [[UIImage alloc] initWithCGImage:cgImage];
    CGImageRelease(cgImage);
    [image drawInRect:self.bounds];
}
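For context, the touch-handling code isn't shown in the question; presumably something like the following sketch drives drawToBuffer and triggers the redraw (the use of touchesMoved:withEvent:, the pen colour, and the thickness value here are assumptions, not from the original post):

// Sketch only: the question doesn't include its touch handling, so this is an
// assumption about how drawToBuffer is likely being driven.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint current = [touch locationInView:self];
    CGPoint previous = [touch previousLocationInView:self];

    [self drawToBuffer:previous :current :[UIColor blackColor] :3];

    // Redrawing the whole view on every move is the likely bottleneck;
    // the answer below suggests -setNeedsDisplayInRect: instead.
    [self setNeedsDisplay];
}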

It works perfectly on the simulator but not on the device; I imagine that has something to do with processor speed.

I'm using ARC.

Answer

I tried to fix your code, but since you only seem to have posted half of it I couldn't get it working (copy-pasting the code produces lots of errors before I can even start performance-tuning it).

However, there are some tips you can use to vastly improve performance.

The first, and probably most noticeable, is to call -setNeedsDisplayInRect: rather than -setNeedsDisplay. That way only the small rect that changed gets redrawn. For an iPad 3 with 1024*768*4 pixels, a full redraw is a lot of work; reducing that to roughly 20*20 or less per frame will massively improve performance. The dirty rect covering just the new line segment can be computed like this:

CGRect rect;
rect.origin.x    = MIN(coordA.x, coordB.x) - (thickness * 0.5);
rect.size.width  = (MAX(coordA.x, coordB.x) + (thickness * 0.5)) - rect.origin.x;
rect.origin.y    = MIN(coordA.y, coordB.y) - (thickness * 0.5);
rect.size.height = (MAX(coordA.y, coordB.y) + (thickness * 0.5)) - rect.origin.y;
[self setNeedsDisplayInRect:rect];

Another big improvement you could make is to draw only the CGPath for the current touch (which you do). However, you then also draw the saved/cached image in drawRect, so everything is redrawn every frame. A better approach is to make the drawing view transparent and put a UIImageView behind it; UIImageView is the most efficient way to display an image on iOS. The layering looks like this (a small setup sketch follows the diagram):

- DrawView (1 finger)
   -drawRect:
- BackgroundView (the image of the old touches)
   -self.image
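A minimal sketch of that setup, assuming a view controller owns both subviews; the names backgroundImageView and drawView are illustrative, not from the answer:

// Sketch of the suggested layering, e.g. in a view controller's viewDidLoad.
// Names are illustrative assumptions; only the structure matters.
UIImageView *backgroundImageView = [[UIImageView alloc] initWithFrame:self.view.bounds];

DrawView *drawView = [[DrawView alloc] initWithFrame:self.view.bounds];
drawView.opaque = NO;                        // transparent, so the image view shows through
drawView.backgroundColor = [UIColor clearColor];

[self.view addSubview:backgroundImageView];  // the old, committed strokes
[self.view addSubview:drawView];             // only the stroke currently being drawn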

The draw view itself then only ever draws the current touch, and only the part that changed each time. When the user lifts their finger you cache that stroke into a UIImage, draw it over the background/cache UIImageView's current image, and set imageView.image to the new image.

That final step, combining the images, involves drawing two full-screen images into an off-screen CGContext, so it will cause lag if done on the main thread. Instead it should be done on a background thread, with the result pushed back to the main thread (see the sketch after the outline below).

* touch starts *
- DrawView : draw current touch
* touch ends *
- 'background thread' : combine backgroundView.image and DrawView.drawRect
    * thread finished *
    send resulting UIImage to main queue and set backgroundView.image to it;
    Clear DrawView's current path that is now in the cache;
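A minimal sketch of that compositing step, assuming the layering above and a UIBezierPath holding the stroke that just finished; the method name commitCurrentStroke, the currentPath property, and the plain black stroke are assumptions for illustration, not part of the answer:

// Sketch: called when the touch ends, e.g. from the controller that owns both
// views. backgroundImageView, drawView and drawView.currentPath are
// illustrative names.
- (void)commitCurrentStroke {
    UIImage *cachedImage = self.backgroundImageView.image;      // nil on the first stroke is fine
    UIBezierPath *strokePath = [self.drawView.currentPath copy];
    CGSize size = self.drawView.bounds.size;

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        // Combine the old cache and the new stroke off the main thread.
        UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);
        [cachedImage drawInRect:CGRectMake(0.0, 0.0, size.width, size.height)];
        [[UIColor blackColor] setStroke];    // pen colour/width kept simple in this sketch
        [strokePath stroke];
        UIImage *combined = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

        dispatch_async(dispatch_get_main_queue(), ^{
            self.backgroundImageView.image = combined;    // update the cache view
            self.drawView.currentPath = nil;              // clear the live path, it's in the cache now
            [self.drawView setNeedsDisplay];
        });
    });
}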

All of this combined can make a very smooth 60 fps drawing app. However, views are not updated as often as we'd like, so the drawing looks jagged when the finger moves quickly. This can be improved by using UIBezierPath quad curves instead of straight CGPath line segments:

CGPoint lastPoint = [touch previousLocationInView:self];
CGPoint mid = midPoint(currentPoint, lastPoint);   // midpoint helper, see below
// path is the UIBezierPath for the current stroke
[path addQuadCurveToPoint:mid controlPoint:lastPoint];
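A slightly fuller sketch of that smoothing, assuming a UIBezierPath property named path for the current stroke and a midPoint helper (both names are illustrative, not defined in the answer):

// Midpoint helper used for the quad-curve smoothing.
static CGPoint midPoint(CGPoint a, CGPoint b) {
    return CGPointMake((a.x + b.x) * 0.5, (a.y + b.y) * 0.5);
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self];
    CGPoint lastPoint = [touch previousLocationInView:self];
    CGPoint mid = midPoint(currentPoint, lastPoint);

    // Curve to the midpoint, using the previous touch as the control point,
    // so consecutive segments join smoothly instead of as straight lines.
    [self.path addQuadCurveToPoint:mid controlPoint:lastPoint];

    // Invalidate only a small rect around the new segment (see the dirty-rect
    // tip above); padding by the line width keeps the round caps inside it.
    CGFloat pad = self.path.lineWidth;
    CGRect dirty = CGRectMake(MIN(mid.x, lastPoint.x) - pad,
                              MIN(mid.y, lastPoint.y) - pad,
                              fabs(mid.x - lastPoint.x) + 2.0 * pad,
                              fabs(mid.y - lastPoint.y) + 2.0 * pad);
    [self setNeedsDisplayInRect:dirty];
}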

