How does imwarp transfer points in Matlab?


Question

I am using Matlab to transform an image to a target image. I have a geometric transformation (tform).

For example, this is my tform:

    1.0235    0.0022   -0.0607         0
   -0.0276    1.0002    0.0089         0
   -0.0170   -0.0141    1.1685         0
   12.8777    5.0311  -70.0325    1.0000
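For reference, Matlab's affine3d stores T in the post-multiply (row-vector) convention, so a point maps as `[x y z 1] * T` and the translation sits in the last row. A minimal numpy sketch (the matrix values are copied from the tform above) illustrates this convention:

```python
import numpy as np

# The 4x4 matrix from the question, in Matlab's row-vector convention:
# mapped = [x y z 1] @ T, with the translation in the last row.
T = np.array([
    [ 1.0235,  0.0022,  -0.0607, 0.0],
    [-0.0276,  1.0002,   0.0089, 0.0],
    [-0.0170, -0.0141,   1.1685, 0.0],
    [12.8777,  5.0311, -70.0325, 1.0],
])

def forward_map(point):
    """Apply the forward affine map to a single (x, y, z) point."""
    homogeneous = np.append(point, 1.0)  # -> [x, y, z, 1]
    mapped = homogeneous @ T
    return mapped[:3]

# The origin maps straight onto the translation row of T.
print(forward_map([0.0, 0.0, 0.0]))  # -> [12.8777, 5.0311, -70.0325]
```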

In Matlab 2013, it is possible to do this easily using imwarp:

% nii = 3D MR image loaded with a NIfTI reader
I = nii.img;                           % the image volume to be warped (old_img)
dii = nii.hdr.dime.pixdim(2:4);        % voxel dimensions in world units
Rfixed = imref3d(size(I), dii(2), dii(1), dii(3));   % spatial referencing object
new_img = imwarp(old_img, Rfixed, tform, 'OutputView', Rfixed);

The result is perfect using imwarp (the red lung in the image).

I need to know how imwarp works, so I wrote my own function:

function [new_img] = aff3d(old_img, tform, range_x, range_y, range_z)

   [U, V, W] = ndgrid(range_x, range_y, range_z);
   xyz = [reshape(U,[],1)';reshape(V,[],1)';reshape(W,[],1)'];
   xyz = [xyz; ones(1,size(xyz,2))];


   uvw = tform.T * xyz;
   % avoid homogeneous coordinate  
   uvw = uvw(1:3,:)';


   xi = reshape(uvw(:,1), length(range_x),length(range_y),length(range_z));
   yi = reshape(uvw(:,2), length(range_x),length(range_y),length(range_z));
   zi = reshape(uvw(:,3), length(range_x),length(range_y),length(range_z));

   old_img = single(old_img);
   new_img = interp3(old_img,yi,xi,zi,'linear');

   ii = find(isnan(new_img));
   if(~isempty(ii))
      new_img(ii) = 0;
   end
end

The result of my function (more info) does not match the imwarp output (the red lung is not located in the correct place). Can anybody help me?

Answer

As was suggested by Ander, try multiplying by the inverse transformation:

Tinv = tform.invert();
TinvMatrix = Tinv.T;
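As a sanity check, the inverted matrix composed with the original must give the identity. A small numpy sketch (mirroring what `tform.invert()` does for an affine3d, which simply inverts the T matrix):

```python
import numpy as np

# Same matrix as the question's tform, Matlab row-vector convention.
T = np.array([
    [ 1.0235,  0.0022,  -0.0607, 0.0],
    [-0.0276,  1.0002,   0.0089, 0.0],
    [-0.0170, -0.0141,   1.1685, 0.0],
    [12.8777,  5.0311, -70.0325, 1.0],
])

# invert() on an affine3d returns a transform whose T is the matrix inverse.
T_inv = np.linalg.inv(T)

# Composing the forward and inverse maps is the identity map.
print(np.allclose(T @ T_inv, np.eye(4)))  # -> True
```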

So your code would become:

function [new_img] = aff3d(old_img, tform, range_x, range_y, range_z)
   [U, V, W] = ndgrid(range_x, range_y, range_z);
   xyz = [reshape(U,[],1)';reshape(V,[],1)';reshape(W,[],1)'];
   xyz = [xyz; ones(1,size(xyz,2))];

   tformInv = invert(tform);
   uvw = tformInv.T * xyz;
   % avoid homogeneous coordinate  
   uvw = uvw(1:3,:)';
   xi = reshape(uvw(:,1), length(range_x),length(range_y),length(range_z));
   yi = reshape(uvw(:,2), length(range_x),length(range_y),length(range_z));
   zi = reshape(uvw(:,3), length(range_x),length(range_y),length(range_z));
   old_img = single(old_img);
   new_img = interp3(old_img,yi,xi,zi,'linear');
   ii = find(isnan(new_img));
   if(~isempty(ii))
      new_img(ii) = 0;
   end
end

In your code, you are interpolating within the old_img to try to find the new_img which has been warped. This implies that what you want to do is use the inverse mapping that maps from the output image space to the input image space. You appear to be interpolating your old image using the forward mapping of points, which is incorrect.
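The difference between the two mappings can be sketched with a toy 1-D translation (a hypothetical illustration of the principle, not imwarp's actual code): under inverse mapping, you loop over the output pixels and ask where each one came from in the input.

```python
import numpy as np

def warp_inverse_1d(img, shift):
    """Warp a 1-D signal under the forward map x' = x + shift, implemented
    the way imwarp works: loop over OUTPUT pixels and sample the input at
    the inverse-mapped location x = x' - shift."""
    out = np.zeros_like(img)
    for x_out in range(len(out)):
        x_in = x_out - shift          # inverse mapping: output -> input
        if 0 <= x_in < len(img):      # out-of-range samples stay 0 (fill)
            out[x_out] = img[x_in]
    return out

img = np.array([10, 20, 30, 40, 50])
print(warp_inverse_1d(img, 2))  # -> [ 0  0 10 20 30]
```

Every output pixel is assigned exactly once, which is why this is the formulation warping code actually uses.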

http://blogs.mathworks.com/steve/2006/04/28/spatial-transforms-forward-mapping/
http://blogs.mathworks.com/steve/2006/05/05/spatial-transformations-inverse-mapping/

I would use the above links to review forward vs. inverse mapping. IMWARP uses inverse mapping.

Part of the reason for confusion is that when people think of geometric transformations, they generally think in terms of the forward mapping of how points map from the old_image to the new_image. For this reason, the "T" property of the affine3d transformation is phrased in terms of the forward mapping.

When geometric transformations need to be implemented in software, it's much easier/better to implement things in terms of the inverse mapping. That is what imwarp does, and that's why you need to invert the transformation while you are trying to reproduce the imwarp behavior. If you read the blog post on inverse mapping, this algorithm is exactly what IMWARP is doing.
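One concrete reason inverse mapping is easier/better: a naive forward mapping can leave holes in the output wherever no input pixel lands. A toy 1-D upscaling makes this visible (again just an illustration, not imwarp internals):

```python
import numpy as np

def warp_forward_1d(img, scale):
    """Naive forward mapping: push each INPUT pixel to x' = scale * x.
    Output pixels that no input pixel lands on are left as holes (0)."""
    out = np.zeros(len(img) * scale, dtype=img.dtype)
    for x_in in range(len(img)):
        x_out = x_in * scale          # forward mapping: input -> output
        out[x_out] = img[x_in]
    return out

img = np.array([10, 20, 30])
print(warp_forward_1d(img, 2))  # -> [10  0 20  0 30  0]  (holes at odd pixels)
```

With inverse mapping the same upscale has no holes, because every output pixel interpolates its own source location.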

The only wrinkle you will need to work through is what IMWARP does in non-default coordinate systems (using non-default spatial referencing objects) in the case where the WorldLimits are not evenly divisible by the discrete grid of pixels. This behavior is arbitrary; there is no "right" behavior. The IMWARP behavior is to honor the requested resolution (PixelExtentInWorld) and to slightly adjust the world limits in this case.
