scipy.ndimage.interpolation.zoom uses nearest-neighbor-like algorithm for scaling-down


Problem description

While testing scipy's zoom function, I found that the results of scaling down an array are similar to the nearest-neighbour algorithm, rather than averaging. This increases noise drastically, and is generally suboptimal for many applications.

Is there an alternative that does not use a nearest-neighbour-like algorithm and will properly average the array when downsizing? While coarsegraining works for integer scaling factors, I would need non-integer scaling factors as well.
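
For integer factors, the coarsegraining mentioned above is simply block averaging via a reshape; a minimal sketch (assuming the array size is an exact multiple of the factor M) looks like this:

import numpy as np

M = 4                                    # integer down-scaling factor (example value)
a = np.random.random((100 * M, 100 * M))
# group the pixels into MxM blocks and average each block
b = a.reshape(100, M, 100, M).mean(axis=(3, 1))
assert b.shape == (100, 100)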

Test case: create a random 100*M x 100*M array, for M = 2..20, and downscale the array by the factor of M in three ways:

1) by taking the mean in MxM blocks
2) by using scipy's zoom with a scaling factor 1/M
3) by taking the first point within each MxM block

The resulting arrays have the same mean and the same shape, but scipy's array has a variance as high as the nearest-neighbour result. Using a different spline order for scipy.zoom does not really help.

import scipy.ndimage.interpolation
import numpy as np
import matplotlib.pyplot as plt

mean1, mean2, var1, var2, var3 = [], [], [], [], []
values = range(1, 20)  # down-scaling factors M

for M in values:
    N = 100  # size of an array 
    a = np.random.random((N*M,N*M))  # large array    

    b = np.reshape(a, (N, M, N, M))  
    b = np.mean(np.mean(b, axis=3), axis=1)
    assert b.shape == (N, N)  # coarsegrained array

    c = scipy.ndimage.interpolation.zoom(a, 1. / M, order=3, prefilter=True)
    assert c.shape == b.shape

    d = a[::M, ::M]  # picking the first point within each MxM block (nearest-neighbour-like)
    assert b.shape == d.shape

    mean1.append(b.mean())
    mean2.append(c.mean())
    var1.append(b.var())
    var2.append(c.var())
    var3.append(d.var())

plt.plot(values, mean1, label="Mean, coarsegraining")
plt.plot(values, mean2, label="Mean, scipy.zoom")
plt.plot(values, var1, label="Variance, coarsegraining")
plt.plot(values, var2, label="Variance, scipy.zoom")
plt.plot(values, var3, label="Variance, nearest neighbour")
plt.xscale("log")
plt.yscale("log")
plt.legend(loc=0)
plt.show()

EDIT: The performance of scipy.ndimage.zoom on a real noisy image is also very poor.

The original image is here: http://wiz.mit.edu/lena_noisy.png

The code that produced it:

from PIL import Image
import numpy as np
import matplotlib.pyplot as plt
from scipy.ndimage.interpolation import zoom

im = Image.open("/home/magus/Downloads/lena_noisy.png")
im = np.array(im)

plt.subplot(131)
plt.title("Original")
plt.imshow(im, cmap="Greys_r")

plt.subplot(132)
im2 = zoom(im, 1 / 8.)
plt.title("Scipy zoom 8x")
plt.imshow(im2, cmap="Greys_r", interpolation="none")

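# block-average in 8x8 chunks; the reshape below assumes the image is 512x512 grayscale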
im.shape = (64, 8, 64, 8)
im3 = np.mean(im, axis=3)
im3 = np.mean(im3, axis=1)

plt.subplot(133)
plt.imshow(im3, cmap="Greys_r", interpolation="none")
plt.title("averaging over 8x8 blocks")

plt.show()
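
A quick numeric counterpart to the visual comparison is to print the variance of the two downscaled panels (a minimal check; the pattern should match the synthetic test above):

# per the claims above, the zoomed image retains most of the noise variance,
# while the 8x8 block average suppresses it strongly
print("scipy zoom 8x variance:    ", im2.var())
print("8x8 block-average variance:", im3.var())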


Answer

Nobody posted a working answer, so I will post the solution I currently use. It is not the most elegant, but it works.

import numpy as np 
import scipy.ndimage
def zoomArray(inArray, finalShape, sameSum=False,
              zoomFunction=scipy.ndimage.zoom, **zoomKwargs):
    """

    Normally, one can use scipy.ndimage.zoom to do array/image rescaling.
    However, scipy.ndimage.zoom does not coarsegrain images well. It basically
    takes nearest neighbor, rather than averaging all the pixels, when
    coarsegraining arrays. This increases noise. Photoshop doesn't do that, and
    performs some smart interpolation-averaging instead.

    If you were to coarsegrain an array by an integer factor, e.g. 100x100 ->
    25x25, you just need to do block-averaging, that's easy, and it reduces
    noise. But what if you want to coarsegrain 100x100 -> 30x30?

    Then my friend you are in trouble. But this function will help you. This
    function will blow up your 100x100 array to a 120x120 array using
    scipy.ndimage zoom Then it will coarsegrain a 120x120 array by
    block-averaging in 4x4 chunks.

    It will do it independently for each dimension, so if you want a 100x100
    array to become a 60x120 array, it will blow up the first and the second
    dimension to 120, and then block-average only the first dimension.

    Parameters
    ----------

    inArray: n-dimensional numpy array (1D also works)
    finalShape: resulting shape of an array
    sameSum: bool, preserve a sum of the array, rather than values.
             by default, values are preserved
    zoomFunction: by default, scipy.ndimage.zoom. You can plug your own.
    zoomKwargs: extra keyword arguments passed through to zoomFunction.
    """
    inArray = np.asarray(inArray, dtype=np.double)
    inShape = inArray.shape
    finalShape = tuple(finalShape)  # so the shape comparison at the end also works for lists
    assert len(inShape) == len(finalShape)
    mults = []  # multipliers for the final coarsegraining
    for i in range(len(inShape)):
        if finalShape[i] < inShape[i]:
            mults.append(int(np.ceil(inShape[i] / finalShape[i])))
        else:
            mults.append(1)
    # shape to which to blow up
    tempShape = tuple([i * j for i, j in zip(finalShape, mults)])

    # stupid zoom doesn't accept the final shape. Carefully crafting the
    # multipliers to make sure that it will work.
    zoomMultipliers = np.array(tempShape) / np.array(inShape) + 0.0000001
    assert zoomMultipliers.min() >= 1

    # applying scipy.ndimage.zoom
    rescaled = zoomFunction(inArray, zoomMultipliers, **zoomKwargs)

    for ind, mult in enumerate(mults):
        if mult != 1:
            sh = list(rescaled.shape)
            assert sh[ind] % mult == 0
            newshape = sh[:ind] + [sh[ind] // mult, mult] + sh[ind + 1:]
            rescaled.shape = newshape
            rescaled = np.mean(rescaled, axis=ind + 1)
    assert rescaled.shape == finalShape

    if sameSum:
        extraSize = np.prod(finalShape) / np.prod(inShape)
        rescaled /= extraSize
    return rescaled

myar = np.arange(16).reshape((4,4))
rescaled = zoomArray(myar, finalShape=(3, 5))
print(myar)
print(rescaled)
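
As a rough sanity check (mirroring the synthetic test from the question), downscaling a random array with zoomArray should give a variance close to block averaging, while a plain zoom stays near the nearest-neighbour level:

a = np.random.random((300, 300))
print(zoomArray(a, finalShape=(100, 100)).var())  # low, comparable to 3x3 block averaging
print(scipy.ndimage.zoom(a, 1. / 3).var())        # much higher, close to the original noise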
