How to detect whether only the visible (non-transparent) area of a .png is touched in Xcode (Swift or Objective-C)
Problem description
I have imported a .png image into a UIImageView in Xcode, and what I want is for the image to be hidden when it is touched.
My problem is that the .png contains transparent areas, and the action also fires when I touch those transparent areas. I want the action to fire only when the visible part of the image is touched. Please tell me how to solve this.
I have created a custom UIButton subclass that behaves exactly as you describe; have a look: https://github.com/spagosx/iOS-Shaped-Button-Swift
It's written in Swift, but it's easily convertible to Objective-C.
The approach is to read the pixel data at the touch point and access its RGBA values; in this case we read A (alpha) and check whether it is above a threshold.
Looking at a bit of code:
func alphaFromPoint(point: CGPoint) -> CGFloat {
    // Render the single pixel under the touch point into a 1x1 RGBA bitmap.
    var pixel: [UInt8] = [0, 0, 0, 0]
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let bitmapInfo = CGImageAlphaInfo.premultipliedLast.rawValue
    guard let context = CGContext(data: &pixel,
                                  width: 1, height: 1,
                                  bitsPerComponent: 8, bytesPerRow: 4,
                                  space: colorSpace, bitmapInfo: bitmapInfo) else {
        return 0
    }
    // Shift the layer so the touched point lands at (0, 0) before rendering.
    context.translateBy(x: -point.x, y: -point.y)
    layer.render(in: context)
    // pixel[3] is the alpha component, in the range 0...255.
    return CGFloat(pixel[3])
}
You can then take the returned alpha value and compare it against whatever threshold you find acceptable:
override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
    // Treat the view as "hit" only where alpha is at least 100 (of 255).
    return alphaFromPoint(point: point) >= 100
}