# Displacement field transform in 3D

`temp` is a forward 3D displacement field of shape DxMxNx3.

`displacement_image = sitk.GetImageFromArray(np.double(temp), isVector=True)`

`tx = sitk.DisplacementFieldTransform(displacement_image)`

`tx.SetInterpolator(sitk.sitkLinear)`

I find the inverse transform using the code below:
`sitk_inverse_displacement_field = sitk.InvertDisplacementField(tx.GetDisplacementField())`

`tx1 = sitk.DisplacementFieldTransform(sitk_inverse_displacement_field)`

`tx1.SetInterpolator(sitk.sitkLinear)`

Using the inverse transform and a resampler, I transformed a 3D volume.
Using the same inverse transform, I also transform a set of K 3D points (X, Y, Z) spread uniformly across the surface of the volume.

`px, py, pz = tx1.TransformPoint((x, y, z))` (applied to each point in turn)

When I plot the transformed volume and points in ParaView, they don’t align. Can you please help explain what I am doing wrong?

Hello @mokshagna_sai_teja,

Two things to consider: do you need to transform the points, and in which coordinate system are they defined?

In the context of registration we have a `fixed` coordinate system and a `moving` coordinate system. The result of registration is a transform which maps points from the `fixed` coordinate system to the `moving` one.

We then usually resample the `moving` image onto the `fixed` image grid using this transformation. If the original points are in the `fixed` coordinate system, there is no need to transform them after resampling. If, on the other hand, the original points are in the `moving` coordinate system and after resampling you also want to map them onto the resampled image, you will need to use the inverse transform.

My setup is: I want to apply the same 3D transform to both the volume and the points on its surface. Both the volume and the points move w.r.t. their initial coordinate system; they are registered initially, and after applying the transformation they should remain registered, which is not happening.

For the volume I used `sitk.Resample(volume, transform)`, which I did, and for the points I used a for loop to transform each point (x, y, z): `p = transform.TransformPoint((x, y, z))`.
I have also tried `inverse_transform.TransformPoint((x, y, z))`.

The transform above can be any 3D transform, affine or displacement field. When I tried a 3D affine transform with translation (-11, -25, -25), the volume and points were misaligned.

I think I am missing something, but I don’t know what, and I don’t understand why it should behave differently.

As I said in the explanation above, if you resampled using the transform `tx` and the points are in the original volume’s coordinate system, you will need to use the inverse transform to map the points; in pseudo-code:

```
volume = read_volume
resampled_volume = sitk.Resample(volume, transform)
for pnt in points:
    point_transformed = transform_inverse.TransformPoint(pnt)
```

In ITK/SimpleITK, resampling is done using an inverse mapping. For more details on why, please see this talk describing the basics of image warping (this is one source among many explaining how to do image warping and why an inverse mapping is used).


Another source of potential error is that ParaView is not designed to work with medical images. One of the most important limitations is that ParaView ignores image axis directions (it always assumes that the IJK-to-LPS direction matrix is identity), so all images that do not happen to be axis-aligned will be loaded at an incorrect position. Also, you cannot apply transforms to images; actually, you cannot even load ITK transforms. So, while ParaView works really well for what it is designed for (general engineering and scientific visualization), it lacks even the most basic features that are essential for medical image computing.

3D Slicer has all these features and a lot more. You can load images correctly, and you can load displacement field (or B-spline, linear, thin-plate-spline transforms, or any combination of these in a composite transform) from ITK transform files. You can apply the transforms to any objects (images, meshes, point lists, curves, etc.) - the transform is automatically smoothly extrapolated (necessary for robust inverse computation) and inverted as needed in real time (no need to precompute a dense displacement field and invert it). You can also visualize the transform itself in slice and 3D views (using glyphs, grid, isosurface, etc.).

You can also run ITK-Python and SimpleITK code directly in Slicer and see all the transformed objects moving in real time as you adjust the transformation parameters. You can also use 3D Slicer as a Jupyter notebook kernel. If you want to keep using your current Python environment, you can send your images and transforms for visualization to a running 3D Slicer instance using the slicerio Python package.


In the displacement field post above, I am using the inverse transform for transforming the points, but that still doesn’t work: I am still getting misalignment between the 3D volume and the points. My 3D volume is an NRRD file.

Hello @mokshagna_sai_teja,

Without seeing the data it is hard to guess what is going wrong. Can you share the original image, transform and points?

The file format on disk, NRRD, is not relevant.