I hope that’s the correct category.
I wanted to ask if someone can give me a hint or info on whether ITK's ResampleImageFilter and ImageFileWriter are compatible with dask, and whether someone has tested that so far. This blog entry explains an example using a deconvolution filter, but I am unsure whether resampling, and especially writing, are comparable to such a filter.
Thank you. I read the release notes, but I was unsure if those excellent improvements would also affect resampling an image or writing. Those changes are already live in the latest version; is that correct?
Thank you for the hint about the pythonic API. I sometimes get a bit confused about which route to go, switching between SimpleITK, ITK and also C++ and Python.
Hello @matt.mccormick, sorry to tag you again here. I have a question about parallel processing of chunks of a single image loaded with ITK. Would it make sense to take this example and process multiple chunks of one image in parallel using dask? Especially the resampling that I mentioned earlier, and maybe also writing.
Or, is parallel processing where it makes sense already handled within ITK when working on single images? It would be interesting to have some more insight into that. Maybe there is documentation that you can suggest?
Again, sorry for the second tag, but I got very curious after trying ITK with dask.
I appreciate any help you can provide.
Most ITK filters are already parallel. That example is meant to show custom per-chunk processing while also reading/writing only one chunk at a time. The file format must support chunked reading/writing for this to work as intended.
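To make the per-chunk idea concrete, here is a minimal stand-alone sketch using `dask.array.map_blocks` on a synthetic array (the names and the thresholding operation are just placeholders, not from the example above). The operation is purely element-wise, so chunk borders do not matter:

```python
import dask.array as da
import numpy as np


def threshold_chunk(chunk):
    # runs independently on each chunk; dask schedules the chunks in parallel
    return (chunk > 0.5).astype(np.uint8)


# synthetic stand-in for an image volume, chunked along the first axis
image = da.random.random((64, 64, 64), chunks=(16, 64, 64))
mask = image.map_blocks(threshold_chunk, dtype=np.uint8)
result = mask.compute()  # materialize to a plain numpy array
```

For operations with a spatial footprint (like cubic resampling), `da.map_overlap` would be the safer choice, since it pads each chunk with data from its neighbors before applying the function.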
Excellent that you like the idea and are willing to write a post about it! I am looking forward to it.
In the meantime, I tried to implement a resampler with dask and resorted to scipy as a resampling strategy. The sample code is below. Maybe someone might find it useful.
import os

import dask.array as da
import itk
import numpy as np
from scipy.ndimage import zoom


def resample_scipy(image_chunk, input_spacing: list, output_spacing=(1.5, 1.5, 1.5)):
    # ITK reports spacing in (x, y, z) order, while the numpy array axes are
    # (z, y, x), so the spacing is traversed in reverse.
    input_size = image_chunk.shape
    zoom_factors = [input_spacing[i] / output_spacing[i] for i in reversed(range(len(input_size)))]
    # Note: cubic interpolation is applied per chunk, so seams can appear at
    # chunk borders; dask's map_overlap could be used to avoid that.
    return zoom(image_chunk, zoom_factors, order=3)


if __name__ == '__main__':
    working_directory = './images'
    input_image_path = os.path.join(working_directory, 'ct.nii.gz')

    # Read the image
    itk_image = itk.imread(input_image_path)

    # Calculate chunk size and set up the dask computation.
    # itk.size is (x, y, z); the array axes are (z, y, x).
    input_spacing = itk.spacing(itk_image)
    input_size = itk.size(itk_image)
    chunk_size = (input_size[2] // 4, input_size[1] // 4, input_size[0] // 4)
    print(chunk_size)
    image_dask = da.from_array(np.asarray(itk_image), chunks=chunk_size)
    # Resampling changes the chunk shape, so map_blocks may need an explicit
    # chunks= argument, depending on the dask version.
    result = da.map_blocks(resample_scipy, image_dask, input_spacing)

    # Invoke the dask computation and write the image
    result_array = result.compute()
    output_image_path = os.path.join(working_directory, 'dask_scipy_resampled_itk_ct.nii.gz')
    itk.imwrite(itk.image_from_array(result_array), output_image_path)
    # If the image needs to be further processed, reuse result_array (a plain
    # numpy array) so the dask computation is not invoked again.
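As a side note on that last point, here is a tiny stand-alone sketch (a toy array, not the CT image) of computing once and reusing the materialized result:

```python
import dask.array as da
import numpy as np

x = da.ones((8, 8), chunks=(4, 4))
y = (x * 2).compute()  # the task graph runs exactly once here

# further processing operates on the plain numpy array,
# so no dask computation is re-triggered
z = y + 1
```

Calling `.compute()` a second time on the same dask array would re-execute the whole graph, which is why keeping the resulting numpy array around matters.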