OpenCv isn't returning grey value for the Apple Pencil location in Xcode
Question
I have an application that takes a grayscale image (loaded in OpenCVWrapper) and turns the brightness of the specific spot touched by the Apple Pencil into sound. Dark areas make no sound, medium areas make a medium pitch, and bright areas make a high pitch.
But for some reason the code works flawlessly in some areas of the image, while in others it's almost reversed.
This part is OpenCVWrapper.h, a file in my Xcode project:
#import <Foundation/Foundation.h>
NS_ASSUME_NONNULL_BEGIN
@interface OpenCVWrapper : NSObject
+ (UInt16) getGrayVal:(int)i :(int)j;
@end
NS_ASSUME_NONNULL_END
Then this file, saved as OpenCVWrapper.mm:
#import "OpenCVWrapper.h"
#import <opencv2/opencv.hpp>

@implementation OpenCVWrapper

+ (UInt16) getGrayVal:(int)i :(int)j {
    cv::Mat grayMat;
    NSString *path = [[NSBundle mainBundle] pathForResource:@"gausshighres" ofType:@"png"];
    const char *cpath = [path cStringUsingEncoding:NSUTF8StringEncoding];
    grayMat = cv::imread(cpath, cv::IMREAD_ANYDEPTH);
    return grayMat.at<UInt16>(i, j);
}

@end
which is called from my view controller on this line:
let grayVal = Double(OpenCVWrapper.getGrayVal(Int32(round(y)), Int32(round(x))))
where x and y are the pencil's position:
let x = pencilStrokeRecognizer.touchLocation.x
let y = pencilStrokeRecognizer.touchLocation.y
The issue is that the value I get for grayVal is straight-up wrong in some spots. It reports dark areas with a large grayVal, and bright areas come back random. It's wrong about 50% of the time and I don't know why.
That covers the full code, but I figured the problem may lie within this section.
Overall it returns no errors, but for some reason it doesn't seem to work.
Update: this is how I scale the image:
//Calculate point at which pencil is touching with respect to the input image
func pencilTouchHandler(touch: UITouch) {
//Get point at which user is touching
let currentPoint = touch.location(in: self.coordinateSpaceView)
//Execute if touch input is within the image
if currentPoint.x >= origin.x && currentPoint.y >= origin.y
&& currentPoint.x < (origin.x + size.width-1)
&& currentPoint.y < (origin.y + size.height-1){
//Track whether or not touch was within the boundaries of the image
touchInImage = true
//Calculate (x,y) coordinates with respect to input image
//Use ratio of original image dimensions
let xVal = (imageSize.width / size.width)*(currentPoint.x - origin.x)
let yVal = (imageSize.height / size.height)*(currentPoint.y - origin.y)
//Record Touch Location
touchLocation = CGPoint(x: xVal, y: yVal)
lastTouch = currentPoint
}
}
Then I did:
let x = pencilStrokeRecognizer.touchLocation.x
let y = pencilStrokeRecognizer.touchLocation.y
Answer
Are you sure the X and Y for the pencil match the X and Y for the image? I suspect not. The image might be displayed scaled on screen. The image might appear on screen upside down relative to how it is stored in memory. There might be an offset on one side caused by a window coordinate system that doesn't match the screen.
To debug this, try changing the image. Use an image with only four colors: a white square that takes up a quarter of the image, a black square that takes another quarter, a gray square for another quarter, and so on.
You should easily be able to see whether the image is upside down. You should be able to spot a scaling issue by finding that the transitions happen in the wrong places. Once you know what is going on, you can read the documentation to find out which transformations are applied, so you can properly map the pen x and y to the image x and y.