Reading the GPS data from the image returned by the camera in iOS iPhone

Problem description

I need to get the GPS coordinates of an image taken with the iOS device's camera. I do not care about the Camera Roll images, just the image taken with UIImagePickerControllerSourceTypeCamera.

I've read many Stack Overflow answers, like Get Exif data from UIImage - UIImagePickerController, which either assume you are using the AssetsLibrary framework, which doesn't seem to work on camera images, or use CoreLocation to get the latitude/longitude from the app itself, not from the image.

Using CoreLocation is not an option. It will not give me the coordinates at the moment the shutter button was pressed. (With the CoreLocation-based solutions, you need to record the coordinates either before or after you bring up the camera view, and of course if the device is moving the coordinates will be wrong. This method should work with a stationary device.)

I am targeting iOS 5 only, so I don't need to support older devices. This is also for a commercial product, so I cannot use http://code.google.com/p/iphone-exif/.

So, what are my options for reading the GPS data from the image returned by the camera in iOS5? All I can think of right now is to save the image to Camera Roll and then use the AssetsLibrary, but that seems hokey.
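
For what it's worth, here is a minimal sketch of that Camera Roll round trip, assuming ARC, the AssetsLibrary framework, and the picker delegate's info dictionary; note it can only return a location if the saved metadata contained GPS tags in the first place:

    #import <AssetsLibrary/AssetsLibrary.h>

    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    NSDictionary *metadata = [info objectForKey:UIImagePickerControllerMediaMetadata];

    // Write the shot into the Camera Roll, then read the asset back.
    [library writeImageToSavedPhotosAlbum:image.CGImage
                                 metadata:metadata
                          completionBlock:^(NSURL *assetURL, NSError *error) {
        [library assetForURL:assetURL resultBlock:^(ALAsset *asset) {
            // Nil unless the asset actually carries GPS tags.
            CLLocation *location = [asset valueForProperty:ALAssetPropertyLocation];
            NSLog(@"location: %@", location);
        } failureBlock:^(NSError *err) {
            NSLog(@"could not load asset: %@", err);
        }];
    }];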

Thanks!

Here's the code I wrote based on Caleb's answer.

    // Grab the image the picker handed back in its info dictionary.
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];

    // Re-encode as JPEG and read the properties back out with ImageIO.
    NSData *jpeg = UIImageJPEGRepresentation(image, 1.0);
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)jpeg, NULL);

    // __bridge_transfer hands ownership of the copied dictionary to ARC.
    NSDictionary *metadataNew = (__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
    CFRelease(source);

    NSLog(@"%@", metadataNew);

My console shows:

    2012-04-26 14:15:37:137 ferret[2060:1799] {
        ColorModel = RGB;
        Depth = 8;
        Orientation = 6;
        PixelHeight = 1936;
        PixelWidth = 2592;
        "{Exif}" =     {
            ColorSpace = 1;
            PixelXDimension = 2592;
            PixelYDimension = 1936;
        };
        "{JFIF}" =     {
            DensityUnit = 0;
            JFIFVersion =         (
                1,
                1
            );
            XDensity = 1;
            YDensity = 1;
        };
        "{TIFF}" =     {
            Orientation = 6;
        };
    }

No latitude/longitude.
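
For reference, had the GPS tags been present, they would appear in that dictionary under a "{GPS}" key, which ImageIO exposes as kCGImagePropertyGPSDictionary. A sketch of reading it (all of these calls return nil for the output above):

    #import <ImageIO/ImageIO.h>

    // nil here, since the picker's image carries no {GPS} block.
    NSDictionary *gps = [metadataNew objectForKey:(NSString *)kCGImagePropertyGPSDictionary];
    NSNumber *lat = [gps objectForKey:(NSString *)kCGImagePropertyGPSLatitude];
    NSNumber *lon = [gps objectForKey:(NSString *)kCGImagePropertyGPSLongitude];
    NSLog(@"lat = %@, lon = %@", lat, lon);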

Accepted answer

One possibility is to leave CoreLocation running while the camera is visible. Record each CLLocation into an array along with the time of the sample. When the photo comes back, find its time, then match it to the closest CLLocation in the array.

It sounds kludgy, but it will work.
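
A rough sketch of that approach, using iOS 5's CLLocationManagerDelegate API; closestFixToDate: is a hypothetical helper, and the photo's timestamp would still have to come from somewhere, e.g. the EXIF DateTimeOriginal:

    // Collected while the camera view is up; CLLocation carries its own timestamp.
    - (void)locationManager:(CLLocationManager *)manager
        didUpdateToLocation:(CLLocation *)newLocation
               fromLocation:(CLLocation *)oldLocation {
        [self.fixes addObject:newLocation]; // fixes: NSMutableArray of CLLocation
    }

    // Hypothetical helper: pick the fix recorded nearest to the shot time.
    - (CLLocation *)closestFixToDate:(NSDate *)shotDate {
        CLLocation *best = nil;
        NSTimeInterval bestDelta = DBL_MAX;
        for (CLLocation *fix in self.fixes) {
            NSTimeInterval delta = fabs([fix.timestamp timeIntervalSinceDate:shotDate]);
            if (delta < bestDelta) {
                bestDelta = delta;
                best = fix;
            }
        }
        return best;
    }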

