Reading the GPS data from the image returned by the camera in iOS


Problem description

I need to get the GPS coordinates of an image taken with the iOS device's camera. I do not care about the Camera Roll images, just the image taken with UIImagePickerControllerSourceTypeCamera.

I've read many Stack Overflow answers, like Get Exif data from UIImage - UIImagePickerController, which either assume you are using the AssetsLibrary framework (which doesn't seem to work on camera images) or use CoreLocation to get the latitude/longitude from the app itself, not from the image.

Using CoreLocation is not an option. That will not give me the coordinates when the shutter button was pressed. (With the CoreLocation based solutions, you either need to record the coords before you bring up the camera view or after, and of course if the device is moving the coordinates will be wrong. This method should work with a stationary device.)

I am iOS5 only, so I don't need to support older devices. This is also for a commercial product so I cannot use http://code.google.com/p/iphone-exif/.

So, what are my options for reading the GPS data from the image returned by the camera in iOS5? All I can think of right now is to save the image to Camera Roll and then use the AssetsLibrary, but that seems hokey.
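
For reference, the Camera Roll fallback mentioned above would look roughly like this. It is only a sketch: assetURL is assumed to come from an earlier ALAssetsLibrary writeImageToSavedPhotosAlbum:metadata:completionBlock: call, and the asset only reports a location if GPS metadata was actually written with it.

    #import <AssetsLibrary/AssetsLibrary.h>
    #import <CoreLocation/CoreLocation.h>

    // Sketch of the "save to Camera Roll, then ask AssetsLibrary" fallback.
    // assetURL is assumed to come from a prior
    // writeImageToSavedPhotosAlbum:metadata:completionBlock: call.
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library assetForURL:assetURL resultBlock:^(ALAsset *asset) {
        // ALAssetPropertyLocation yields a CLLocation when the asset has GPS metadata.
        id location = [asset valueForProperty:ALAssetPropertyLocation];
        if ([location isKindOfClass:[CLLocation class]]) {
            NSLog(@"Asset location: %@", location);
        } else {
            NSLog(@"Asset has no location metadata");
        }
    } failureBlock:^(NSError *error) {
        NSLog(@"Could not load asset: %@", error);
    }];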

Thanks!


Here's the code I wrote based on Caleb's answer.

    // Presumably inside -imagePickerController:didFinishPickingMediaWithInfo:
    // (requires the ImageIO framework).
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];

    // Re-encode the UIImage as JPEG data and open it as an image source.
    NSData *jpeg = UIImageJPEGRepresentation(image, 1.0);
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)jpeg, NULL);

    // Copy the properties of the first image ({Exif}, {TIFF}, {GPS}, ...).
    NSDictionary *metadataNew = (__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
    CFRelease(source);

    NSLog(@"%@", metadataNew);

and my Console shows:

    2012-04-26 14:15:37:137 ferret[2060:1799] {
        ColorModel = RGB;
        Depth = 8;
        Orientation = 6;
        PixelHeight = 1936;
        PixelWidth = 2592;
        "{Exif}" =     {
            ColorSpace = 1;
            PixelXDimension = 2592;
            PixelYDimension = 1936;
        };
        "{JFIF}" =     {
            DensityUnit = 0;
            JFIFVersion =         (
                1,
                1
            );
            XDensity = 1;
            YDensity = 1;
        };
        "{TIFF}" =     {
            Orientation = 6;
        };
    }

No latitude/longitude.

Solution

One possibility is to leave CoreLocation running while the camera is visible. Record each CLLocation into an array along with the time of the sample. When the photo comes back, find its capture time, then match it to the closest CLLocation in the array.

Sounds kludgy but it will work.
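
In case it helps, here is a minimal sketch of that bookkeeping. The property and method names (recordedLocations, locationClosestToDate:) are illustrative, not from any framework, and the delegate callback shown is the iOS 5-era CLLocationManagerDelegate method.

    #import <CoreLocation/CoreLocation.h>

    // Illustrative property on whatever object owns the CLLocationManager:
    // @property (nonatomic, strong) NSMutableArray *recordedLocations; // of CLLocation

    // Keep every fix that arrives while the camera view is up.
    // A CLLocation already carries the timestamp of its fix.
    - (void)locationManager:(CLLocationManager *)manager
        didUpdateToLocation:(CLLocation *)newLocation
               fromLocation:(CLLocation *)oldLocation
    {
        [self.recordedLocations addObject:newLocation];
    }

    // Return the recorded fix whose timestamp is closest to the photo's capture time.
    - (CLLocation *)locationClosestToDate:(NSDate *)photoDate
    {
        CLLocation *closest = nil;
        NSTimeInterval smallestDelta = 0;
        for (CLLocation *fix in self.recordedLocations) {
            NSTimeInterval delta = [fix.timestamp timeIntervalSinceDate:photoDate];
            if (delta < 0) delta = -delta;
            if (closest == nil || delta < smallestDelta) {
                smallestDelta = delta;
                closest = fix;
            }
        }
        return closest;
    }

In the picker delegate you could pass, say, the current date (or the EXIF capture time, if available) as photoDate to recover the coordinates for that shot.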
