Downsampling large 3D image in numpy
Question
I need to downsample large 3D images (30 GB+), composed of a series of 2D TIFF slices, by arbitrary non-integer factors. scipy.ndimage.zoom works well for input images that fit into RAM.
I was thinking about reading in parts of the stack and using scipy.ndimage.map_coordinates to get the interpolated pixel coordinates. Another idea was to create a memory-mapped array using numpy.memmap and perform scipy.ndimage.zoom on it.
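The memory-mapped idea mentioned above can be sketched as follows; the filename, dtype, and shape are placeholders for illustration, not values from the question:

```python
import numpy as np

# A minimal sketch of the numpy.memmap approach: the full stack lives on
# disk and only the slices actually touched are paged into RAM.
# 'volume.dat' and the shape below are placeholder values.
shape = (100, 512, 512)  # (z, y, x) - replace with the real stack dimensions
vol = np.memmap('volume.dat', dtype=np.uint8, mode='w+', shape=shape)

# Slices can then be read and written chunk by chunk without ever
# loading the whole 30 GB volume at once:
vol[0] = 255         # write one z-slice
chunk = vol[10:20]   # read a sub-stack; still backed by the file on disk
```

Whether scipy.ndimage.zoom can then run over the memmap without pulling the whole array into memory is exactly the open question here; zoom materializes its output, so chunked processing is still needed.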
Does anyone have a better approach before I go ahead with this?
Answer
So I worked out what to do by looking at the ImageJ source code. I'm posting it here in case it helps anyone else:
import SimpleITK as sitk
import cv2
import numpy as np

def downsample_large_volume(img_path_list, input_voxel_size, output_voxel_size):
    scale = input_voxel_size / output_voxel_size
    resampled_zs = []

    # Resample each z slice in y and x
    for img_path in img_path_list:
        z_slice_arr = cv2.imread(img_path, cv2.IMREAD_GRAYSCALE)
        z_slice_resized = cv2.resize(z_slice_arr, (0, 0), fx=scale, fy=scale,
                                     interpolation=cv2.INTER_AREA)
        resampled_zs.append(z_slice_resized)  # Or save to disk to save RAM and use np.memmap for xz scaling

    temp_arr = np.dstack(resampled_zs)  # We seem to be in yxz space now
    final_scaled_slices = []

    # Resample the xz plane at each y
    for y in range(temp_arr.shape[0]):
        xz_plane = temp_arr[y, :, :]
        scaled_xz = cv2.resize(xz_plane, (0, 0), fx=scale, fy=1,
                               interpolation=cv2.INTER_AREA)
        final_scaled_slices.append(scaled_xz)

    final_array = np.dstack(final_scaled_slices)
    img = sitk.GetImageFromArray(np.swapaxes(np.swapaxes(final_array, 0, 1), 1, 2))
    sitk.WriteImage(img, 'scaled_by_pixel.nrrd')
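The key trick in the answer is that 3D resampling is separable: scale every z-slice in-plane first, then scale along z in a second pass. A small runnable check of that decomposition, using scipy.ndimage.zoom on a synthetic toy volume instead of cv2/SimpleITK (the 0.5 factor and array sizes are illustrative assumptions, not the author's values):

```python
import numpy as np
from scipy.ndimage import zoom

scale = 0.5
stack = np.random.rand(8, 16, 16)  # toy (z, y, x) volume

# Pass 1: resample each z-slice in y and x only
slices = [zoom(stack[z], scale) for z in range((stack.shape[0]))]
part = np.stack(slices, axis=0)     # in-plane scaled: (8, 8, 8)

# Pass 2: resample along z only
result = zoom(part, (scale, 1, 1))  # fully scaled: (4, 8, 8)
```

Because each pass only ever needs one slice (or one plane) in memory, this is what makes the approach workable for stacks far larger than RAM, especially combined with the np.memmap suggestion in the code comment above.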