Efficiently resize batch of np.array images


Problem description

I have a 4D np.array of size (10000, 32, 32, 3) that represents a set of 10000 RGB images.

How can I use skimage.transform.resize or another function to efficiently resize all the images, so that each (32, 32) image is interpolated to (224, 224)? I'd prefer to do this with skimage, but I'm open to any solution that doesn't use tf.image.resize_images.

My current solution uses tf.image.resize_images, but it causes GPU memory issues later in my pipeline (the memory is not freed after the op finishes in a Jupyter notebook), so I'd like to replace it.

Example:

import tensorflow as tf

# Build a resize op for the whole batch and evaluate it (TF 1.x API).
X = tf.image.resize_images(X, [224, 224])
with tf.Session() as sess:
    X = X.eval()

Recommended answer

I won't likely accept my own answer, but it seems that a simple for loop is actually fairly fast (top reports ~300% CPU utilization).

import numpy as np
from skimage.transform import resize

imgs_in = np.random.rand(100, 32, 32, 3)
imgs_out = np.zeros((100, 224, 224, 3))

# Resize each (32, 32, 3) image to (224, 224, 3), one at a time.
for n, img in enumerate(imgs_in):
    imgs_out[n] = resize(img, imgs_out.shape[1:], anti_aliasing=True)

print(imgs_out.shape)  # (100, 224, 224, 3)

It seems to be 7-8x faster than ndi.zoom (scipy.ndimage.zoom) on my machine. I think parallelizing this further with multiprocessing would be even better.
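
Below is a minimal sketch of that multiprocessing idea, not part of the original answer; the worker function resize_one, the OUT_SHAPE constant, and the default pool size are illustrative assumptions.

import numpy as np
from multiprocessing import Pool
from skimage.transform import resize

OUT_SHAPE = (224, 224, 3)  # target height, width, channels

def resize_one(img):
    # Resize a single (H, W, C) image; this must be a top-level
    # function so it can be pickled and sent to worker processes.
    return resize(img, OUT_SHAPE, anti_aliasing=True)

if __name__ == "__main__":
    imgs_in = np.random.rand(100, 32, 32, 3)
    with Pool() as pool:  # defaults to os.cpu_count() workers
        imgs_out = np.stack(pool.map(resize_one, imgs_in))
    print(imgs_out.shape)  # (100, 224, 224, 3)

Note that each image is pickled to and from the worker processes, so for small 32x32 inputs the IPC overhead can eat into the speedup; it is worth benchmarking against the plain loop.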
