Photo taken with camera does not contain any ALAsset metadata


Problem description


The weirdest thing is happening. I have an action sheet which gives the user the choice to either take a photo with the camera or choose one from the camera roll. Upon the UIImagePicker returning from selection I use the ALAssetsLibrary to determine the GPS information embedded in the photo. Choosing a photo from the camera roll works perfectly and I am able to retrieve the GPS information. However, taking a photo with the camera provides absolutely no GPS information, in fact I have no metadata at all. Does anyone know what I'm doing wrong here?

Code below:

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{    
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    if([mediaType isEqualToString:(__bridge NSString *)kUTTypeImage])
    {        
        void (^ALAssetsLibraryAssetForURLResultBlock)(ALAsset *) = ^(ALAsset *asset)
        {
            // get images metadata
            NSDictionary *metadata = asset.defaultRepresentation.metadata;
            NSLog(@"Image Meta Data: %@",metadata);

            // get coords 
            CLLocation *location = [asset valueForProperty:ALAssetPropertyLocation];
            NSLog(@"coordLat: %f , coordLon: %f", location.coordinate.latitude, location.coordinate.longitude);

            // do more here - rest of code snipped to keep this question short

        };
        NSURL *assetURL = [info objectForKey:UIImagePickerControllerReferenceURL];
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        [library assetForURL:assetURL
                 resultBlock:ALAssetsLibraryAssetForURLResultBlock
                failureBlock:^(NSError *error)
                {
                    // handle error
                }];

        // rest of code snipped to keep this question short
    }
}

As I explained, the following is output when using the camera:

2012-04-15 17:58:28.032 MyApp[511:707] Image Meta Data: (null)
2012-04-15 17:58:28.041 MyApp[511:707] coordLat: 0.000000 , coordLon: 0.000000

However, if I choose an existing photo, or if I exit the app, take a new photo with the camera, then go back into the app and select that photo from the camera roll, I get the following output from NSLog:

2012-04-15 17:57:03.286 MyApp[511:707] Image Meta Data: {
ColorModel = RGB;
DPIHeight = 72;
DPIWidth = 72;
Depth = 8;
Orientation = 6;
PixelHeight = 1936;
PixelWidth = 2592;
"{Exif}" =     {
    ApertureValue = "2.970854";
    BrightnessValue = "2.886456";
    ColorSpace = 1;
    ComponentsConfiguration =         (
        1,
        2,
        3,
        0
    );
    DateTimeDigitized = "2012:04:15 17:24:02";
    DateTimeOriginal = "2012:04:15 17:24:02";
    ExifVersion =         (
        2,
        2,
        1
    );
    ExposureMode = 0;
    ExposureProgram = 2;
    ExposureTime = "0.06666667";
    FNumber = "2.8";
    Flash = 24;
    FlashPixVersion =         (
        1,
        0
    );
    FocalLength = "3.85";
    ISOSpeedRatings =         (
        80
    );
    MeteringMode = 5;
    PixelXDimension = 2592;
    PixelYDimension = 1936;
    SceneCaptureType = 0;
    SensingMethod = 2;
    Sharpness = 2;
    ShutterSpeedValue = "3.9112";
    SubjectArea =         (
        1295,
        967,
        699,
        696
    );
    WhiteBalance = 0;
};
"{GPS}" =     {
    Altitude = "14.9281";
    AltitudeRef = 0;
    ImgDirection = "107.4554";
    ImgDirectionRef = T;
    Latitude = "32.7366666666667";
    LatitudeRef = N;
    Longitude = "71.679";
    LongitudeRef = W;
    TimeStamp = "21:26:20.00";
};
"{TIFF}" =     {
    DateTime = "2012:04:15 17:24:02";
    Make = Apple;
    Model = "iPhone 4";
    Orientation = 6;
    ResolutionUnit = 2;
    Software = "5.0.1";
    XResolution = 72;
    YResolution = 72;
    "_YCbCrPositioning" = 1;
};
}
2012-04-15 17:57:03.302 MyApp[511:707] coordLat: 32.7366666666667 , coordLon: -71.679

PS - I'm using Xcode 4.3 with ARC.

Solution

A photo taken with the camera has not yet been saved to the asset library, so there is no ALAsset (and no UIImagePickerControllerReferenceURL) to read metadata from; the metadata the camera captured is delivered in the info dictionary instead. In - (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info, try this when saving the photo:

    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library writeImageToSavedPhotosAlbum:image.CGImage
                                 metadata:[info objectForKey:UIImagePickerControllerMediaMetadata]
                          completionBlock:^(NSURL *assetURL, NSError *error) {
        NSLog(@"assetURL %@", assetURL);
    }];
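Putting the two paths together, a minimal sketch of the delegate method might look like the following. This is an illustration, not the asker's exact code: it assumes ARC and a strong library property on the view controller (ALAssetsLibrary must stay alive until its blocks run), and note that the completion block fires asynchronously, so the metadata is only available after the write finishes.

    - (void)imagePickerController:(UIImagePickerController *)picker
            didFinishPickingMediaWithInfo:(NSDictionary *)info
    {
        NSURL *assetURL = [info objectForKey:UIImagePickerControllerReferenceURL];
        if (assetURL)
        {
            // Picked from the camera roll: the asset already exists, read it directly.
            [self.library assetForURL:assetURL
                          resultBlock:^(ALAsset *asset) {
                              NSLog(@"Image Meta Data: %@", asset.defaultRepresentation.metadata);
                          }
                         failureBlock:^(NSError *error) { /* handle error */ }];
        }
        else
        {
            // Taken with the camera: no asset exists yet, so write it first.
            // The EXIF/TIFF metadata lives in the info dictionary, not in the UIImage.
            UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
            [self.library writeImageToSavedPhotosAlbum:image.CGImage
                       metadata:[info objectForKey:UIImagePickerControllerMediaMetadata]
                completionBlock:^(NSURL *newAssetURL, NSError *error) {
                    // Read the freshly written asset back for its metadata.
                    [self.library assetForURL:newAssetURL
                                  resultBlock:^(ALAsset *asset) {
                                      NSLog(@"Image Meta Data: %@", asset.defaultRepresentation.metadata);
                                  }
                                 failureBlock:^(NSError *e) { /* handle error */ }];
                }];
        }
        [picker dismissViewControllerAnimated:YES completion:nil];
    }

One caveat: the UIImagePickerControllerMediaMetadata dictionary from the camera may not contain a {GPS} entry; if location is required, it may need to be merged in from a CLLocationManager fix before calling writeImageToSavedPhotosAlbum:metadata:completionBlock:.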
