Photo taken with camera does not contain any ALAsset metadata
The weirdest thing is happening. I have an action sheet which gives the user the choice to either take a photo with the camera or choose one from the camera roll. Upon the UIImagePicker returning from selection I use the ALAssetsLibrary to determine the GPS information embedded in the photo. Choosing a photo from the camera roll works perfectly and I am able to retrieve the GPS information. However, taking a photo with the camera provides absolutely no GPS information, in fact I have no metadata at all. Does anyone know what I'm doing wrong here?
Code below:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    if ([mediaType isEqualToString:(__bridge NSString *)kUTTypeImage])
    {
        void (^ALAssetsLibraryAssetForURLResultBlock)(ALAsset *) = ^(ALAsset *asset)
        {
            // get the image's metadata
            NSDictionary *metadata = asset.defaultRepresentation.metadata;
            NSLog(@"Image Meta Data: %@", metadata);
            // get coords
            CLLocation *location = [asset valueForProperty:ALAssetPropertyLocation];
            NSLog(@"coordLat: %f , coordLon: %f", location.coordinate.latitude, location.coordinate.longitude);
            // do more here - rest of code snipped to keep this question short
        };
        NSURL *assetURL = [info objectForKey:UIImagePickerControllerReferenceURL];
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        [library assetForURL:assetURL
                 resultBlock:ALAssetsLibraryAssetForURLResultBlock
                failureBlock:^(NSError *error)
                {
                    // handle error
                }];
        // rest of code snipped to keep this question short
    }
}
As I explained, the following is output when using the camera.
2012-04-15 17:58:28.032 MyApp[511:707] Image Meta Data: (null)
2012-04-15 17:58:28.041 MyApp[511:707] coordLat: 0.000000 , coordLon: 0.000000
However, if I choose an existing photo, or exit the app, take a new photo with the camera, then go back into the app and select that photo from the camera roll, I get the following output from NSLog.
2012-04-15 17:57:03.286 MyApp[511:707] Image Meta Data: {
ColorModel = RGB;
DPIHeight = 72;
DPIWidth = 72;
Depth = 8;
Orientation = 6;
PixelHeight = 1936;
PixelWidth = 2592;
"{Exif}" = {
ApertureValue = "2.970854";
BrightnessValue = "2.886456";
ColorSpace = 1;
ComponentsConfiguration = (
1,
2,
3,
0
);
DateTimeDigitized = "2012:04:15 17:24:02";
DateTimeOriginal = "2012:04:15 17:24:02";
ExifVersion = (
2,
2,
1
);
ExposureMode = 0;
ExposureProgram = 2;
ExposureTime = "0.06666667";
FNumber = "2.8";
Flash = 24;
FlashPixVersion = (
1,
0
);
FocalLength = "3.85";
ISOSpeedRatings = (
80
);
MeteringMode = 5;
PixelXDimension = 2592;
PixelYDimension = 1936;
SceneCaptureType = 0;
SensingMethod = 2;
Sharpness = 2;
ShutterSpeedValue = "3.9112";
SubjectArea = (
1295,
967,
699,
696
);
WhiteBalance = 0;
};
"{GPS}" = {
Altitude = "14.9281";
AltitudeRef = 0;
ImgDirection = "107.4554";
ImgDirectionRef = T;
Latitude = "32.7366666666667";
LatitudeRef = N;
Longitude = "71.679";
LongitudeRef = W;
TimeStamp = "21:26:20.00";
};
"{TIFF}" = {
DateTime = "2012:04:15 17:24:02";
Make = Apple;
Model = "iPhone 4";
Orientation = 6;
ResolutionUnit = 2;
Software = "5.0.1";
XResolution = 72;
YResolution = 72;
"_YCbCrPositioning" = 1;
};
}
2012-04-15 17:57:03.302 MyApp[511:707] coordLat: 32.7366666666667 , coordLon: -71.679
PS - I'm using Xcode 4.3 w/ ARC
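One way to see why the ALAsset lookup fails only for camera shots is to log the picker's info dictionary itself: a freshly captured photo has not been saved to the asset library yet, so (on iOS 5-era SDKs, as best I recall) the dictionary carries no UIImagePickerControllerReferenceURL entry for assetForURL: to resolve. A quick diagnostic sketch, assuming the same delegate method as above:

```objectivec
// Inside -imagePickerController:didFinishPickingMediaWithInfo:
// Log what the picker actually handed back. A library pick should include
// UIImagePickerControllerReferenceURL; a fresh camera capture typically
// does not, which is why the ALAsset lookup returns nothing.
NSLog(@"picker info keys: %@", [info allKeys]);
NSLog(@"reference URL: %@", [info objectForKey:UIImagePickerControllerReferenceURL]);
NSLog(@"picker metadata: %@", [info objectForKey:UIImagePickerControllerMediaMetadata]);
```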
In -imagePickerController:didFinishPickingMediaWithInfo:, try this when saving the photo:
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageToSavedPhotosAlbum:image.CGImage
                             metadata:[info objectForKey:UIImagePickerControllerMediaMetadata]
                      completionBlock:^(NSURL *assetURL, NSError *error) {
                          NSLog(@"assetURL %@", assetURL);
                      }];
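Tying the two pieces together: once the captured image has been written to the saved-photos album with its metadata, the resulting assetURL can be fed straight back into the question's original assetForURL: lookup, so the camera path then behaves like the camera-roll path. A minimal sketch of the camera branch, assuming image and the metadata come from the picker's info dictionary as above; note that, as far as I know, the UIImagePickerControllerMediaMetadata dictionary contains no {GPS} entry, so location data, if needed, has to be merged in separately (e.g. from Core Location) before writing.

```objectivec
// Camera branch: save the capture first, then query the new ALAsset.
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
NSDictionary *pickerMetadata = [info objectForKey:UIImagePickerControllerMediaMetadata];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageToSavedPhotosAlbum:image.CGImage
                             metadata:pickerMetadata
                      completionBlock:^(NSURL *assetURL, NSError *error) {
    if (error) {
        NSLog(@"save failed: %@", error);
        return;
    }
    // The photo now exists in the library, so the original lookup
    // works for camera captures too.
    [library assetForURL:assetURL
             resultBlock:^(ALAsset *asset) {
                 NSLog(@"Image Meta Data: %@", asset.defaultRepresentation.metadata);
             }
            failureBlock:^(NSError *err) {
                NSLog(@"asset lookup failed: %@", err);
            }];
}];
```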