Loading TF Records into Keras


Question

I am trying to load a custom TFRecord file into my Keras model. I attempted to follow this tutorial, https://medium.com/@moritzkrger/speeding-up-keras-with-tfrecord-datasets-5464f9836c36, adapting it for my use.

My goal is to have these functions work similarly to Keras's ImageDataGenerator. I cannot use that class because I need specific metadata from the images that the generator does not grab. I am not including that metadata here because I just need the basic network to work first.


I also want to be able to apply this to a transfer learning application.

I keep getting this error: TypeError: Could not build a TypeSpec for None with type NoneType. I am using TensorFlow 2.2.

import os

import tensorflow as tf
from tensorflow.keras.applications import MobileNet
from tensorflow.keras.layers import Dense, Flatten, Input
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.utils import to_categorical

def _parse_function(serialized):
    features = \
    {
        'image': tf.io.FixedLenFeature([], tf.string),
        'label': tf.io.FixedLenFeature([], tf.int64),
        'shapex': tf.io.FixedLenFeature([], tf.int64),
        'shapey': tf.io.FixedLenFeature([], tf.int64),
    }
    parsed_example = tf.io.parse_single_example(serialized=serialized,
                                                features=features)
    shapex = tf.cast(parsed_example['shapex'], tf.int32)
    shapey = tf.cast(parsed_example['shapey'], tf.int32)
    image_shape = tf.stack([shapex, shapey, 3])
    image_raw = parsed_example['image']
    # Decode the raw bytes so they become a uint8 tensor.
    image = tf.io.decode_raw(image_raw, tf.uint8)
    image = tf.reshape(image, image_shape)
    # Get labels
    label = tf.cast(parsed_example['label'], tf.float32)
    return image, label

def imgs_inputs(type, perform_shuffle=False):
    records_dir = '/path/to/tfrecord/'
    record_paths = [os.path.join(records_dir,record_name) for record_name in os.listdir(records_dir)]
    full_dataset = tf.data.TFRecordDataset(filenames=record_paths)
    full_dataset = full_dataset.map(_parse_function, num_parallel_calls=16)

    dataset_length = len(list(full_dataset))  # Gets length of the dataset

    # Shuffle and batch before building the iterator; a batch size of 100 is assumed here,
    # matching the (100, 200, 200, 3) batches described below.
    if perform_shuffle:
        full_dataset = full_dataset.shuffle(buffer_size=dataset_length)
    databatch = full_dataset.batch(100)

    iterator = tf.compat.v1.data.make_one_shot_iterator(databatch)
    image, label = iterator.get_next()
    # Labels are stored as integer values, e.g. [1, 2, 3]; convert them to one-hot encoding
    label = to_categorical(label)
    return image, label

image, label = imgs_inputs(type ='Train',perform_shuffle=True)

#Combine it with keras
# base_model = MobileNet(weights='imagenet', include_top=False, input_shape=(200,200,3), dropout=.3)
model_input = Input(shape=[200,200,3])

#Build your network
model_output = Flatten(input_shape=(200, 200, 3))(model_input)
model_output = Dense(19, activation='relu')(model_output)

#Create your model
train_model = Model(inputs=model_input, outputs=model_output)

#Compile your model
optimizer = Adam(learning_rate=.001)
train_model.compile(optimizer=optimizer,loss='mean_squared_error',metrics=['accuracy'],target_tensors=[label])

#Train the model
train_model.fit(epochs=10,steps_per_epoch=2)

image returns an array of shape (100, 200, 200, 3), which is a batch of 100 images. label returns an array of shape (100, 19), which is a batch of 100 labels (there are 19 labels).
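A quick way to confirm those shapes is to pull a single batch from the pipeline defined above (a sketch reusing the imgs_inputs helper; the printed values are the expected ones, not verified output):

image, label = imgs_inputs(type='Train', perform_shuffle=True)
print(image.shape)  # expected: (100, 200, 200, 3)
print(label.shape)  # expected: (100, 19) after to_categorical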

Answer

The issue was related to shapex and shapey, but I don't know exactly why. I set shapex = 200 and shapey = 200, then rewrote the model to include the transfer learning.
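For reference, a minimal sketch of what the parse function might look like with the shape hard-coded to 200 × 200 instead of being read from the shapex/shapey features (this is an assumption based on the description above, not the exact revised code):

import tensorflow as tf

def _parse_function(serialized):
    features = {
        'image': tf.io.FixedLenFeature([], tf.string),
        'label': tf.io.FixedLenFeature([], tf.int64),
    }
    parsed_example = tf.io.parse_single_example(serialized=serialized, features=features)
    # Hard-code the spatial dimensions so the image tensor has a fully defined static shape.
    image = tf.io.decode_raw(parsed_example['image'], tf.uint8)
    image = tf.reshape(image, [200, 200, 3])
    # Keep the label as an integer, which matches sparse_categorical_crossentropy below.
    label = tf.cast(parsed_example['label'], tf.int32)
    return image, label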

base_model = MobileNet(weights='imagenet', include_top=False, input_shape=(200,200,3), dropout=.3, pooling='avg')
x = base_model.output  # global average pooling gives a (batch, 1024) feature vector
types = Dense(19, activation='softmax')(x)

model = Model(inputs=base_model.input, outputs=types)

model.compile(
    optimizer='adam',
    loss='sparse_categorical_crossentropy',
    metrics=['accuracy'])

history = model.fit(get_batches(), steps_per_epoch=1000, epochs=10)
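The get_batches() helper is not shown above; here is a minimal sketch of what it could look like, assuming the fixed-shape parse function sketched earlier and a batch size of 100 (the batch size and shuffle buffer are assumptions):

import os
import tensorflow as tf

def get_batches(records_dir='/path/to/tfrecord/', batch_size=100):
    # Build a shuffled, batched, repeating tf.data pipeline from every TFRecord file
    # in the directory, using the fixed-shape parse function sketched above.
    record_paths = [os.path.join(records_dir, name) for name in os.listdir(records_dir)]
    dataset = tf.data.TFRecordDataset(filenames=record_paths)
    dataset = dataset.map(_parse_function, num_parallel_calls=tf.data.experimental.AUTOTUNE)
    dataset = dataset.shuffle(buffer_size=1000)
    dataset = dataset.batch(batch_size)
    dataset = dataset.repeat()  # keep yielding batches across all epochs
    dataset = dataset.prefetch(tf.data.experimental.AUTOTUNE)
    return dataset

In TF 2.x, model.fit can consume a tf.data.Dataset directly, so no explicit iterator is needed.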

I found everything I needed on this Google Colab:
https://colab.research.google.com/github/GoogleCloudPlatform/training-data-analyst/blob/master/courses/fast-and-lean-data-science/04_Keras_Flowers_transfer_learning_solution.ipynb#scrollTo=XLJNVGwHUDy1

