Hi all. I have been investigating the ITK v4 registration framework, and have a few questions/comments for discussion.
There seems to be some v3 functionality missing in v4. For example, there is no FRPROptimizer, no MutualInformationImageToImageMetric, and no NormalizedMutualInformationHistogramImageToImageMetric. Are these omissions intentional or accidental?
Is there a way to use ImageRegistrationMethodv4 for multiple objectives? For example, finding a transform that optimizes both image similarity and point-set distances, or the image similarity of multiple image planes.
Does ITK intend to maintain both registration frameworks? Will one or the other be phased out or deprecated?
Not all optimizers have been refactored into a v4 version.
I think the Mattes MI metric is the one usually used, so the other flavors of MI were left out of the v4 refactoring.
Both of these were probably omitted for cost/benefit reasons.
ImageRegistrationMethodv4 optimizes a single objective function. You can define your own custom metric which invokes the other metrics, then weights and combines their contributions.
Both v3 and v4 registration frameworks are planned to be maintained.
Maybe someone who participated in the development of v4 could pitch in with more insight? @hjmjohnson @Stephen_Aylward
Thank you. This looks promising! Apparently one needs to use ObjectToObjectMultiMetricv4 as the metric, then add fixed and moving objects using an index argument, like SetFixedImage(3,image) or SetMovingPointSet(3,pointset).
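Putting that together, here is roughly what I have in mind for combining an image metric with a point-set metric. This is just a sketch from reading the class documentation, not compiled code; the template defaults, the PointSetType typedef, and the index-based setters may differ between ITK versions, and the weights are arbitrary.

```cpp
#include "itkImage.h"
#include "itkImageRegistrationMethodv4.h"
#include "itkObjectToObjectMultiMetricv4.h"
#include "itkMattesMutualInformationImageToImageMetricv4.h"
#include "itkEuclideanDistancePointSetToPointSetMetricv4.h"

constexpr unsigned int Dimension = 3;
using ImageType = itk::Image<float, Dimension>;

using RegistrationType = itk::ImageRegistrationMethodv4<ImageType, ImageType>;
// Assumption: the registration exposes its point-set type as PointSetType;
// otherwise it is the TPointSet template argument.
using PointSetType = RegistrationType::PointSetType;

using ImageMetricType = itk::MattesMutualInformationImageToImageMetricv4<ImageType, ImageType>;
using PointSetMetricType = itk::EuclideanDistancePointSetToPointSetMetricv4<PointSetType>;
using MultiMetricType = itk::ObjectToObjectMultiMetricv4<Dimension, Dimension, ImageType>;

int main()
{
  // Inputs are assumed to be loaded elsewhere.
  ImageType::Pointer fixedImage, movingImage;
  PointSetType::Pointer fixedPoints, movingPoints;

  // One sub-metric per objective: image similarity plus point-set distance.
  auto imageMetric = ImageMetricType::New();
  auto pointSetMetric = PointSetMetricType::New();

  auto multiMetric = MultiMetricType::New();
  multiMetric->AddMetric(imageMetric);
  multiMetric->AddMetric(pointSetMetric);

  // Relative weights of the two terms (values here are arbitrary).
  MultiMetricType::WeightsArrayType weights(2);
  weights[0] = 1.0;
  weights[1] = 0.1;
  multiMetric->SetMetricWeights(weights);

  auto registration = RegistrationType::New();
  registration->SetMetric(multiMetric);

  // As far as I can tell, the index selects which sub-metric the input is
  // routed to: index 0 feeds the image metric, index 1 the point-set metric.
  registration->SetFixedImage(0, fixedImage);
  registration->SetMovingImage(0, movingImage);
  registration->SetFixedPointSet(1, fixedPoints);
  registration->SetMovingPointSet(1, movingPoints);

  // ... set the initial transform, optimizer, shrink factors, etc. as usual ...
  registration->Update();
  return 0;
}
```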
Trying to write code that is agnostic to the v3/v4 registration frameworks is hard, due to many minor differences. I would choose one and stick with it.
This might be something close to a specification for events.
Thanks Dzenan. There appears to be a bug where the RSG v4 optimizer updates the search position before emitting the iteration event. It would be good to have a specification of the events.
I’m sorry that comment was a little bit terse; I am still learning the code. But I suspect the problem is not limited to incorrect parameter values at the time the event is emitted. Consider GradientDescentOptimizerv4Template::ResumeOptimization(). On line 100, the value and gradient are calculated. Then, on line 144, the current position is updated. Thus, by inspection (spelled out in more detail below):
The “best parameters” on line 150 mixes the current value with the next position, so if m_ReturnBestParametersAndValue is true, the wrong position will be returned.
If the convergence test on line 83 is positive, and if m_ReturnBestParametersAndValue is false, once again the wrong position will be returned.
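To spell this out, based on my reading of the code: suppose iteration k starts at position p_k. Line 100 computes the value f(p_k) and its gradient, and line 144 then advances the position to p_{k+1}. At that point line 150 stores the pair (f(p_k), p_{k+1}) as the "best" value and parameters, and observers of the iteration event likewise see p_{k+1} alongside f(p_k). Similarly, if the convergence test on line 83 fires on f(p_k) during the next pass, the position that gets reported is p_{k+1}. In every case the reported position is one step ahead of the value it is paired with.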
Please advise. If you agree this is a bug I will file a ticket.
That sounds like a bug. Submitting a bug report is good; making a PR that fixes it is even better. And it is probably not that hard, now that you have dived into the code.
There are several v4 optimizers that are based on the gradient descent optimizer; they should maintain consistent behavior among them.
IMHO, the next step is to write a test which checks/demonstrates the expected behavior. Likely this will need a mock metric class. If the test is written generically, it could be used to test the other optimizers too.
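Something along these lines could serve as the mock metric. It is only a sketch: the class name (RecordingTestMetric) and the quadratic objective are invented for illustration, and the exact set of virtuals to override (for example, whether Initialize() carries an exception specification) differs between ITK versions, so compare it against ObjectToObjectMetricBase in your tree.

```cpp
#include "itkObjectToObjectMetricBase.h"
#include <vector>

// Hypothetical mock metric for exercising the v4 gradient descent optimizers.
// It minimizes a simple paraboloid and remembers where every evaluation took
// place, so a test can compare those positions against what the optimizer
// reports. Not taken from ITK; adjust the overrides to your ITK version.
class RecordingTestMetric : public itk::ObjectToObjectMetricBase
{
public:
  using Self = RecordingTestMetric;
  using Superclass = itk::ObjectToObjectMetricBase;
  using Pointer = itk::SmartPointer<Self>;
  using ConstPointer = itk::SmartPointer<const Self>;
  itkNewMacro(Self);
  itkTypeMacro(RecordingTestMetric, ObjectToObjectMetricBase);

  using ParametersType = Superclass::ParametersType;
  using ParametersValueType = Superclass::ParametersValueType;
  using DerivativeType = Superclass::DerivativeType;
  using MeasureType = Superclass::MeasureType;

  static constexpr unsigned int SpaceDimension = 2;

  void Initialize() override {}

  // Simple convex objective with its minimum at (5, -3).
  MeasureType GetValue() const override
  {
    const double x = m_Parameters[0];
    const double y = m_Parameters[1];
    return (x - 5.0) * (x - 5.0) + (y + 3.0) * (y + 3.0);
  }

  void GetValueAndDerivative(MeasureType & value, DerivativeType & derivative) const override
  {
    // Remember the position at which this evaluation happened.
    m_EvaluationPositions.push_back(m_Parameters);

    value = this->GetValue();
    derivative.SetSize(SpaceDimension);
    // v4 convention: the derivative points in the direction of improvement,
    // so return the negative gradient.
    derivative[0] = -2.0 * (m_Parameters[0] - 5.0);
    derivative[1] = -2.0 * (m_Parameters[1] + 3.0);
  }

  void GetDerivative(DerivativeType & derivative) const override
  {
    MeasureType value;
    this->GetValueAndDerivative(value, derivative);
  }

  void UpdateTransformParameters(const DerivativeType & update, ParametersValueType factor) override
  {
    for (unsigned int i = 0; i < SpaceDimension; ++i)
    {
      m_Parameters[i] += factor * update[i];
    }
  }

  unsigned int GetNumberOfParameters() const override { return SpaceDimension; }
  unsigned int GetNumberOfLocalParameters() const override { return SpaceDimension; }
  bool HasLocalSupport() const override { return false; }

  void SetParameters(ParametersType & params) override { m_Parameters = params; }
  const ParametersType & GetParameters() const override { return m_Parameters; }

  // Positions at which GetValueAndDerivative() was called, in call order.
  mutable std::vector<ParametersType> m_EvaluationPositions;

protected:
  RecordingTestMetric()
  {
    m_Parameters.SetSize(SpaceDimension);
    m_Parameters.Fill(0.0);
  }

private:
  ParametersType m_Parameters;
};
```

Recording the positions at which the metric is evaluated makes it straightforward for a test to compare them with what the optimizer reports when it emits its events.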
As I understand the issue, it is just with the optimizer, not the whole registration framework.
I would look at some of the tests in “ITK/Modules/Numerics/Optimizersv4/test” such as:
This one has a mock metric to create something closer to a unit test of the optimizer, as opposed to a registration-framework integration test. I don’t find this test to be very clean, as it prints too much useless information to std::cout and does not do enough comparisons and checks.
I find that the GTest library facilitates writing tests with more validation and checks.
I’d recommend copying some of that old optimizer test work into a new test that checks the events and demonstrates the expected behavior.
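For the event-timing question specifically, a GTest check could look roughly like the following. Again only a sketch: it reuses the hypothetical RecordingTestMetric from the sketch above, the PositionRecorder observer is invented for illustration, and given the behavior described earlier the expectation would presumably fail until the ordering (or the event specification) is settled.

```cpp
#include <gtest/gtest.h>
#include <algorithm>
#include <vector>

#include "itkCommand.h"
#include "itkGradientDescentOptimizerv4.h"

// Assumes the hypothetical RecordingTestMetric from the earlier sketch is
// visible in this translation unit.

namespace
{
// Observer that records the optimizer's reported position at every IterationEvent.
class PositionRecorder : public itk::Command
{
public:
  using Self = PositionRecorder;
  using Superclass = itk::Command;
  using Pointer = itk::SmartPointer<Self>;
  itkNewMacro(Self);

  using OptimizerType = itk::GradientDescentOptimizerv4;

  void Execute(itk::Object * caller, const itk::EventObject & event) override
  {
    auto * optimizer = dynamic_cast<OptimizerType *>(caller);
    if (optimizer != nullptr && itk::IterationEvent().CheckEvent(&event))
    {
      m_EventPositions.push_back(optimizer->GetCurrentPosition());
    }
  }

  void Execute(const itk::Object *, const itk::EventObject &) override {}

  std::vector<OptimizerType::ParametersType> m_EventPositions;

protected:
  PositionRecorder() = default;
};
} // namespace

// Expected behavior under discussion: the position reported while handling the
// k-th IterationEvent should be the position at which the k-th metric value was
// computed, not the next iterate.
TEST(GradientDescentOptimizerv4, IterationEventReportsEvaluatedPosition)
{
  auto metric = RecordingTestMetric::New();
  auto optimizer = itk::GradientDescentOptimizerv4::New();
  optimizer->SetMetric(metric);
  optimizer->SetLearningRate(0.1);
  optimizer->SetNumberOfIterations(5);

  auto recorder = PositionRecorder::New();
  optimizer->AddObserver(itk::IterationEvent(), recorder);

  optimizer->StartOptimization();

  ASSERT_FALSE(recorder->m_EventPositions.empty());
  const std::size_t n =
    std::min(recorder->m_EventPositions.size(), metric->m_EvaluationPositions.size());
  for (std::size_t k = 0; k < n; ++k)
  {
    for (unsigned int i = 0; i < RecordingTestMetric::SpaceDimension; ++i)
    {
      EXPECT_DOUBLE_EQ(recorder->m_EventPositions[k][i],
                       metric->m_EvaluationPositions[k][i])
        << "iteration " << k << ", parameter " << i;
    }
  }
}
```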
Are you editing the wrong one, e.g. in the wrong directory (in case you have multiple copies of ITK)? Or maybe you disabled tests in the CMake configuration?