Data Augmentation Image Data Generator Keras Semantic Segmentation


Question


I'm fitting a fully convolutional network on some image data for semantic segmentation using Keras. However, I'm having some problems with overfitting. I don't have that much data and I want to do data augmentation. However, since I want to do pixel-wise classification, I need augmentations like flips, rotations, and shifts to apply to both the feature images and the label images. Ideally I'd like to use the Keras ImageDataGenerator for on-the-fly transformations, but as far as I can tell, you cannot do equivalent transformations on both the feature and label data.


Does anyone know if this is the case? If not, does anyone have any ideas? Otherwise, I'll use other tools to create a larger dataset and just feed it in all at once.

Thanks!

Answer


There is work on extending ImageDataGenerator to be more flexible for exactly these types of cases (see this issue on GitHub for examples).
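A common workaround for keeping pixel-wise labels aligned with their images (not from the original answer, but widely used) is to drive the augmentation of the image and the mask from the same random seed, so both receive identical transforms. The principle can be sketched in plain NumPy; `paired_augment` is an illustrative helper, not part of Keras:

```python
import numpy as np

def paired_augment(image, mask, seed):
    """Apply the same random flips to an image and its label mask.

    Seeding one RNG and drawing the transform parameters once
    guarantees both arrays are transformed identically, so the
    pixel-wise labels stay aligned with the features.
    """
    rng = np.random.RandomState(seed)
    flip_h = rng.rand() < 0.5  # horizontal flip, drawn once for the pair
    flip_v = rng.rand() < 0.5  # vertical flip, drawn once for the pair
    out_img, out_mask = image, mask
    if flip_h:
        out_img = out_img[:, ::-1]
        out_mask = out_mask[:, ::-1]
    if flip_v:
        out_img = out_img[::-1, :]
        out_mask = out_mask[::-1, :]
    return out_img, out_mask
```

The Keras-native version of the same idea is to pass one shared `seed` to two `ImageDataGenerator.flow(...)` calls, one for the images and one for the masks, and zip the two generators together.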


Additionally, as Mikael Rousson mentioned in the comments, you can easily create your own version of ImageDataGenerator while leveraging many of its built-in functions to make things easier. Here is example code I've used for an image denoising problem, where I use random crops plus additive noise to generate clean and noisy image pairs on the fly. You could easily modify this to add other types of augmentation. After that, you can use Model.fit_generator to train with these methods.

import numpy as np
from keras.preprocessing.image import load_img, img_to_array, list_pictures

def random_crop(image, crop_size):
    # Assumes channels-last arrays, i.e. (height, width, channels).
    height, width = image.shape[:2]
    dy, dx = crop_size
    if width < dx or height < dy:
        return None
    x = np.random.randint(0, width - dx + 1)
    y = np.random.randint(0, height - dy + 1)
    return image[y:(y+dy), x:(x+dx), :]

def image_generator(list_of_files, crop_size, to_grayscale=True, scale=1, shift=0):
    while True:
        filename = np.random.choice(list_of_files)
        try:
            img = img_to_array(load_img(filename, grayscale=to_grayscale))
        except IOError:
            continue  # skip unreadable files rather than ending the generator
        cropped_img = random_crop(img, crop_size)
        if cropped_img is None:
            continue  # image smaller than the crop; try another file
        yield scale * cropped_img - shift

def corrupted_training_pair(images, sigma):
    for img in images:
        target = img
        if sigma > 0:
            # Additive Gaussian noise produces the corrupted source image.
            source = img + np.random.normal(0, sigma, img.shape) / 255.0
        else:
            source = img
        yield (source, target)

def group_by_batch(dataset, batch_size):
    while True:
        try:
            sources, targets = zip(*[next(dataset) for i in range(batch_size)])
            batch = (np.stack(sources), np.stack(targets))
            yield batch
        except StopIteration:
            return

def load_dataset(directory, crop_size, sigma, batch_size):
    files = list_pictures(directory)
    generator = image_generator(files, crop_size, scale=1/255.0, shift=0.5)
    generator = corrupted_training_pair(generator, sigma)
    generator = group_by_batch(generator, batch_size)
    return generator


You can then use the above like so:

train_set = load_dataset('images/train', (patch_height, patch_width), noise_sigma, batch_size)
val_set = load_dataset('images/val', (patch_height, patch_width), noise_sigma, batch_size)
model.fit_generator(train_set, samples_per_epoch=batch_size * 1000, nb_epoch=nb_epoch,
                    validation_data=val_set, nb_val_samples=1000)
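To adapt this denoising recipe to segmentation, the key change is that the crop must cut the feature image and the label image at the same coordinates, so the pixel-wise correspondence survives the augmentation. A minimal sketch of that change (`random_crop_pair` is a hypothetical helper, not part of the answer's code):

```python
import numpy as np

def random_crop_pair(image, label, crop_size, rng=np.random):
    """Crop a feature image and its label map at the same location.

    Assumes both arrays share the same spatial dimensions in their
    first two axes. Drawing x and y once and indexing both arrays
    with them keeps image and label aligned.
    """
    height, width = image.shape[:2]
    dy, dx = crop_size
    if height < dy or width < dx:
        return None  # image too small for the requested crop
    x = rng.randint(0, width - dx + 1)
    y = rng.randint(0, height - dy + 1)
    return image[y:y+dy, x:x+dx], label[y:y+dy, x:x+dx]
```

Flips, rotations, and shifts can be handled the same way: draw the transform parameters once, then apply them to both arrays.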

