Set up different learning rates for different parameters being optimized

Hello!

I’m using RegularStepGradientDescentOptimizerv4 to optimize a rigid registration. I want to set up a higher learning rate for translation parameters and a lower learning rate for rotation parameters. Is there any way to do this?

It looks like optimizer->SetLearningRate() applies the same learning rate to all elements of the parameter vector.

Thanks!
-Ray

Hello Ray! :sunny:

SetWeights(), as documented in the base optimizer class, may be what you are looking for.
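
For example, here is a minimal sketch (assuming a rigid 3D transform such as itk::Euler3DTransform, whose six parameters are ordered as three rotation angles followed by three translations; the exact ordering depends on the transform you use):

```cpp
#include "itkRegularStepGradientDescentOptimizerv4.h"

int main()
{
  using OptimizerType = itk::RegularStepGradientDescentOptimizerv4<double>;
  auto optimizer = OptimizerType::New();

  // One global learning rate for the whole parameter vector.
  optimizer->SetLearningRate(1.0);

  // Per-parameter weights are applied to the update, so the
  // rotation components can be damped while the translations
  // keep the full step size.
  OptimizerType::ScalesType weights(6);
  weights[0] = weights[1] = weights[2] = 0.01; // rotation angles (radians)
  weights[3] = weights[4] = weights[5] = 1.0;  // translations (mm)
  optimizer->SetWeights(weights);

  return 0;
}
```

The 0.01 / 1.0 values above are only placeholders; in practice you would tune them to balance how much a unit change in each parameter moves the image.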

Hope this helps,
Matt