Hi,
Thanks for the great library. I really like the separation between the optimization logic (line search, state, etc.) and the problem definition.
I want to experiment with different preconditioned gradients for the optimization of neural networks, and compare them while keeping the rest of the optimization machinery fixed.
I’m wondering whether Optimistix supports using a custom gradient / search direction in place of the standard gradient computed from the objective. Specifically, is there a workaround or intended extension point that lets the optimizer receive a preconditioned gradient defined externally? The goal is to implement and swap different preconditioners on my side, while relying on Optimistix for line search, step control, and iteration state.
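To make the question concrete, here is a plain-JAX sketch of what I mean (not using any Optimistix API): the raw gradient of the objective is transformed by an externally defined preconditioner before being used as the search direction. The diagonal preconditioner here is just a hypothetical placeholder; in practice I'd swap in different ones.

```python
import jax
import jax.numpy as jnp

def objective(x):
    # Toy objective for illustration.
    return jnp.sum(x ** 2)

def diagonal_preconditioner(grad, diag):
    # Hypothetical example: elementwise scaling by an inverse diagonal.
    return grad / diag

x = jnp.array([1.0, 2.0, 3.0])
raw_grad = jax.grad(objective)(x)  # standard gradient: 2 * x
diag = jnp.array([2.0, 2.0, 2.0])

# This transformed direction is what I'd like the solver to consume
# in place of raw_grad, with line search and state handling unchanged.
search_direction = diagonal_preconditioner(raw_grad, diag)
```

In other words, I'd compute `search_direction` myself and would like Optimistix to treat it as the descent direction.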
If this is possible, what abstraction should be customized (objective, solver, or something else)? If not, is this outside the intended scope of Optimistix?
Thanks in advance.