Hi,
I just followed the tutorial in 61_Registration_Introduction_Continued and noticed that the convergence speed does not change when I modify the learningRate of the optimizer.
I changed the value to learningRate=0 and learningRate=10000, but the resulting metric values at each step were almost identical. Am I misunderstanding something here?
For the learning rate to have an effect, you need to change the default value of the estimateLearningRate parameter in the SetOptimizerAsGradientDescent method to sitk.ImageRegistrationMethod.Never. By default it is set to Once, which means the learning rate is estimated once per pyramid level and the user-supplied value is ignored.
I think this issue could confuse a lot of users, as the learningRate is explicitly specified in the example mentioned above but has no effect. Maybe the default should be Never when learningRate is set by the user?