Truncated Core Data NSData objects


Problem description



I am saving arrays of doubles in an NSData* object that is persisted as a binary property in a Core Data (SQLite) data model. I am doing this to store sampled data for graphing in an iPhone app. Sometimes when there are more than 300 doubles in the binary object not all the doubles are getting saved to disk. When I quit and relaunch my app there may be as few as 25 data points that have persisted or as many as 300.

Using NSSQLitePragmasOption with synchronous = FULL may be making a difference, but it is hard to tell because the bug is intermittent.

Given the warnings about performance problems as a result of using synchronous = FULL, I am seeking advice and pointers.
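For reference, the pragma is passed as a store option when the SQLite store is added to the coordinator. A minimal sketch, assuming an existing persistentStoreCoordinator and storeURL (those names are not from the original post):

    // Ask SQLite to fsync on every commit by passing the pragma through Core Data.
    NSDictionary *pragmas = [NSDictionary dictionaryWithObject:@"FULL" forKey:@"synchronous"];
    NSDictionary *options = [NSDictionary dictionaryWithObject:pragmas forKey:NSSQLitePragmasOption];
    NSError *error = nil;
    if (![persistentStoreCoordinator addPersistentStoreWithType:NSSQLiteStoreType
                                                   configuration:nil
                                                             URL:storeURL
                                                         options:options
                                                           error:&error])
        {
        NSLog(@"Failed to add store: %@", error);
        }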

Thanks.

[[Edit: here is the code.]]

The (as yet unrealized) intent of -addToCache: is to add each new datum to the cache but only flush (fault?) the Data object periodically (a sketch of such a periodic flush follows the code listing below).

From Data.m


@dynamic dataSet; // NSData * attribute of Data entity

 - (void) addDatum:(double_t)datum
    {
    DLog(@"-[Data addDatum:%f]", datum);
    [self addToCache:datum];
    }

- (void) addToCache:(double_t)datum
    {
    if (cache == nil)
        {
        cache = [NSMutableData dataWithData:[self dataSet]];
        [cache retain];
        }
    [cache appendBytes:&datum length:sizeof(double_t)];
    DLog(@"-[Data addToCache:%f] ... [cache length] = %d; cache = %p", datum, [cache length], cache);
    [self flushCache];
    }

- (void) wrapup
    {
    DLog(@"-[Data wrapup]");
    [self flushCache];
    [cache release];
    cache = nil;
    DLog(@"[self isFault] = %@", [self isFault] ? @"YES" : @"NO"); // [self isFault] is always NO.
    }

- (void) flushCache
    {
    DLog(@"flushing cache to store");
    [self setDataSet:cache];
    DLog(@"-[Data flushCache:] [[self dataSet] length] = %d", [[self dataSet] length]);
    }

- (double*) bytes
    {
    return (double*)[[self dataSet] bytes];
    }

- (NSInteger) count
    {
    return [[self dataSet] length]/sizeof(double);
    }

- (void) dump
    {
    ALog(@"Dump Data");
    NSInteger numDataPoints = [self count];
    double *data = (double*)[self bytes];
    ALog(@"numDataPoints = %d", numDataPoints);
    for (int i = 0; i < numDataPoints; i++)
        {
        ALog(@"data[%d] = %f", i, data[i]); // log each stored sample
        }
    }
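The periodic flush mentioned above is not implemented in this listing; as written, -addToCache: flushes on every sample. A minimal sketch of what a periodic flush might look like, using an assumed threshold constant that is not part of the original code:

    static const NSUInteger kFlushThreshold = 64; // assumed value, not from the original

    - (void) addToCache:(double_t)datum
        {
        if (cache == nil)
            {
            cache = [[NSMutableData alloc] initWithData:[self dataSet]];
            }
        [cache appendBytes:&datum length:sizeof(double_t)];
        // Only push the cache into the persisted attribute every kFlushThreshold samples.
        if (([cache length] / sizeof(double_t)) % kFlushThreshold == 0)
            {
            [self flushCache];
            }
        }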

Solution

I was trying to get behavior as if my Core Data entity could have an NSMutableData attribute. To do this my NSManagedObject (called Data) had an NSData attribute and an NSMutableData ivar. My app takes sample data from a sensor and appends each data point to the data set - this is why I needed this design.

Each new data point was appended to the NSMutableData, and then the NSData attribute was set to the NSMutableData.

I suspect that because the NSData pointer wasn't changing (though its contents were), Core Data did not appreciate the full extent of the change. Calling -hasChanges on the NSManagedObjectContext showed that there had been changes, and -updatedObjects even listed the Data object as having changed, but the actual data that was being written seems to have been truncated (sometimes).
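If the unchanged-pointer suspicion is right, one way to make the change unambiguous would be to hand the attribute a fresh immutable copy on each flush. A minimal sketch of that idea (this is not the workaround actually adopted, which is described next):

    - (void) flushCache
        {
        DLog(@"flushing cache to store");
        // Give Core Data a new immutable NSData each time, so the changed value
        // is a different object rather than the same mutable buffer.
        [self setDataSet:[NSData dataWithData:cache]];
        }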

To work around this I changed things slightly. New data points are still appended to the NSMutableData, but the NSData attribute is only set when sampling is completed. This means that there is a chance that a crash might result in truncated data, but for the most part this workaround seems to have solved the problem.
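In terms of the code above, the workaround amounts to dropping the per-sample flush and setting the attribute exactly once. A minimal sketch reconstructed from this description (not the author's exact code):

    - (void) addToCache:(double_t)datum
        {
        if (cache == nil)
            {
            cache = [[NSMutableData alloc] initWithData:[self dataSet]];
            }
        // Append only; the persisted attribute is not touched during sampling.
        [cache appendBytes:&datum length:sizeof(double_t)];
        }

    - (void) wrapup
        {
        // Set the NSData attribute once, when sampling is complete.
        [self setDataSet:cache];
        [cache release];
        cache = nil;
        }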

Caveat emptor: the bug was always intermittent, so it is possible that it is still there, just harder to manifest.
