Image Registration: Learning rate for optimizer has no influence

I just followed the tutorial in 61_Registration_Introduction_Continued and noticed that the convergence speed does not change when I modify the learningRate of the optimizer.

I changed the value to learningRate=0 and learningRate=10000, but the resulting metric at each step was almost identical. Am I getting something wrong here?


Hello @sr1,

For the learning rate to have an effect, you have to change the default value of the estimateLearningRate parameter in the SetOptimizerAsGradientDescent method to sitk.ImageRegistrationMethod.Never. Otherwise it defaults to Once: the learning rate is estimated once per pyramid level and the user-supplied value is ignored.


Yes, that’s the solution! Thanks a lot.

reg_method.SetOptimizerAsGradientDescentLineSearch(1.5, estimateLearningRate=sitk.ImageRegistrationMethod.Never, ...)

I think this issue could confuse a lot of users, as the learningRate is explicitly specified in the example mentioned above but has no effect.
Maybe the default should be Never when learningRate is set by the user?

Hello @sr1,

Notebook documentation has been added to explain the setting.

Changing the default behavior would not be backwards compatible, so we may change it in the next major toolkit release.


Great, thank you! 🙂

Should we report a warning or error if the learning rate is explicitly supplied in a case where it is ignored?


@Lee_Newberg yes, I think we should.