Strange dependency between GPU & CPU filters

Do GPU filters in ITK have some hidden dependency on corresponding CPU filters?

I’m seeing unexpected behavior in my application: using a GPU filter aborts the application with an exception (the kind that cannot be caught in a try/catch block) unless I also instantiate an unused, corresponding CPU filter in the same scope. For example, the following code snippet aborts. I have confirmed with a debugger that the abort happens in the call to Update().

#include "itkImage.h"
#include "itkGPUImage.h"
#include "itkCastImageFilter.h"
#include "itkGPUGradientAnisotropicDiffusionImageFilter.h"

// Various typedefs
const unsigned int dim = 3;
typedef itk::Image<short, dim> imageType;
typedef itk::Image<double, dim> imageDOUBLEType;
typedef itk::GPUImage<short, dim> gpuImageType;
typedef itk::CastImageFilter<imageType, gpuImageType> cpu2gpuCastFilterType;
typedef itk::GPUImage<double, dim> gpuImageDOUBLEType;
typedef itk::GPUGradientAnisotropicDiffusionImageFilter<gpuImageType, gpuImageDOUBLEType> gpuGradientDiffusionFilterType;
typedef itk::CastImageFilter<gpuImageDOUBLEType, imageDOUBLEType> gpu2cpuCastFilterDOUBLEType;

// Allocate pointers
gpuGradientDiffusionFilterType::Pointer gpuGradient = gpuGradientDiffusionFilterType::New();
cpu2gpuCastFilterType::Pointer cpu2gpu = cpu2gpuCastFilterType::New();
gpu2cpuCastFilterDOUBLEType::Pointer gpu2cpu = gpu2cpuCastFilterDOUBLEType::New();

// Pipeline the filters and run the pipeline
cpu2gpu->SetInput(<Pointer to an image of type imageType>);
gpuGradient->SetInput(cpu2gpu->GetOutput());
gpuGradient->SetNumberOfIterations(<iterations>);
gpuGradient->SetTimeStep(<timeStep>);
gpuGradient->SetConductanceParameter(<conductance>);
gpu2cpu->SetInput(gpuGradient->GetOutput());
gpu2cpu->Update();

However, if I add a corresponding CPU filter, as shown below, to the above snippet, it runs fine and produces the expected results.

typedef itk::GradientAnisotropicDiffusionImageFilter<imageType, imageDOUBLEType> gradientDiffusionFilterType;
gradientDiffusionFilterType::Pointer gradient = gradientDiffusionFilterType::New();

This behavior is not specific to the gradient smoothing filter. I’m seeing the same behavior in the itk::DiscreteGaussianImageFilter vs. itk::GPUDiscreteGaussianImageFilter pair as well.
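The same workaround applies there: an unused CPU filter of the corresponding type prevents the abort in the GPU Gaussian pipeline.

typedef itk::DiscreteGaussianImageFilter<imageType, imageDOUBLEType> gaussianFilterType;
gaussianFilterType::Pointer gaussian = gaussianFilterType::New(); // unused, but prevents the abort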

Any idea as to what is happening here? Am I missing a step in setting up the GPU filters, e.g., registering them before use?
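For reference, this is the kind of registration step I mean, following the factory registrations in ITK's GPU filter tests (the factory class names here are my reading of those tests, and I am not sure registration is even required when the GPU types are named explicitly as above):

#include "itkObjectFactoryBase.h"

// Register the GPU object factories before building the pipeline, as the
// ITK GPU tests do. With explicit GPUImage/GPU filter types this may be
// redundant; the factories exist to substitute GPU implementations when
// CPU types are requested.
itk::ObjectFactoryBase::RegisterFactory(itk::GPUImageFactory::New());
itk::ObjectFactoryBase::RegisterFactory(itk::GPUGradientAnisotropicDiffusionImageFilterFactory::New());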

Thanks.

@LucasGandel or @simon.rit might be able to advise.

I have never used this filter, but this is not expected behavior as far as I know. To debug it, I would track down which static member is initialized by the CPU filter’s constructor but not by the GPU filter’s constructor…
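To make that concrete, here is a minimal, purely hypothetical sketch (none of these types are ITK's) of the failure pattern to look for: a process-wide resource initialized only as a side effect of constructing the CPU filter, which the GPU filter's Update() assumes is already in place.

#include <cstdlib>
#include <iostream>

struct SharedRuntime
{
  static bool initialized; // stands in for, e.g., a kernel or context manager
};
bool SharedRuntime::initialized = false;

struct CpuFilter
{
  CpuFilter() { SharedRuntime::initialized = true; } // hidden side effect
};

struct GpuFilter
{
  void Update()
  {
    if (!SharedRuntime::initialized)
      std::abort(); // mirrors the uncatchable abort seen in the question
    std::cout << "GPU filter ran fine\n";
  }
};

int main()
{
  CpuFilter unused; // remove this line and Update() aborts
  GpuFilter gpu;
  gpu.Update();
}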
