Is it possible to plug a custom (preconditioned) gradient into Optimistix? #198

@IvanBioli

Description

Hi,

Thanks for the great library. I really like the separation between the optimization logic (line search, state, etc.) and the problem definition.

I want to experiment with different preconditioned gradients for neural network optimization and compare them while keeping the rest of the optimization machinery fixed.

I’m wondering whether Optimistix supports substituting a custom gradient / search direction for the standard gradient computed from the objective. Concretely: is there a workaround or an intended extension point that lets the optimizer receive an externally defined, preconditioned gradient, while Optimistix still handles the line search, step control, and iteration state? The goal is to implement and swap different preconditioners on my side and rely on Optimistix for everything else. A rough sketch of what I have in mind is below.
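For concreteness, here is an untested sketch based on my reading of the custom-solver docs. I'm assuming the `AbstractDescent` interface with `init`/`query`/`step` methods (as in recent optimistix versions) and that `f_info` carries a `.grad` field for gradient-based minimisers; `precondition` is a placeholder name for whatever preconditioner I supply, so exact signatures may well be wrong:

```python
from collections.abc import Callable

import jax
import jax.numpy as jnp
import optimistix as optx


class PreconditionedDescent(optx.AbstractDescent):
    """Steepest descent along an externally preconditioned gradient."""

    # Placeholder: maps the raw gradient pytree to a preconditioned one.
    precondition: Callable

    def init(self, y, f_info_struct):
        # The descent state is a pytree with the same structure as `y`.
        return jax.tree.map(jnp.zeros_like, y)

    def query(self, y, f_info, state):
        # Assumes a gradient-based solver, so that `f_info` carries `.grad`
        # (e.g. an `optx.FunctionInfo.EvalGrad`).
        p_grad = self.precondition(f_info.grad)
        # The state holds the (negative) preconditioned search direction.
        return jax.tree.map(lambda g: -g, p_grad)

    def step(self, step_size, state):
        # The line search scales the stored direction by its chosen step size.
        y_diff = jax.tree.map(lambda d: step_size * d, state)
        return y_diff, optx.RESULTS.successful
```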

If this is possible, which abstraction should I customize (the objective, the solver, or something else)? If not, is this outside the intended scope of Optimistix?
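If a custom descent is roughly the right extension point, I would then expect to compose it with an off-the-shelf line search along these lines (again untested; `PreconditionedGradientDescent` and `my_preconditioner` are names I made up, and I'm guessing at the abstract fields of `optx.AbstractGradientDescent`):

```python
class PreconditionedGradientDescent(optx.AbstractGradientDescent):
    # `AbstractGradientDescent` provides the value-and-gradient iteration;
    # we supply the descent direction and the line search.
    rtol: float
    atol: float
    descent: optx.AbstractDescent
    search: optx.AbstractSearch
    norm: Callable = optx.max_norm


def my_preconditioner(grad):
    # Placeholder: identity. Swap in e.g. a diagonal scaling.
    return grad


solver = PreconditionedGradientDescent(
    rtol=1e-5,
    atol=1e-6,
    descent=PreconditionedDescent(precondition=my_preconditioner),
    search=optx.BacktrackingArmijo(),
)


def loss(y, args):
    return jnp.sum((y - 1.0) ** 2)


sol = optx.minimise(loss, solver, y0=jnp.zeros(3))
print(sol.value)  # should approach [1., 1., 1.]
```

Is this the intended way to do it, or is there a simpler hook I'm missing?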

Thanks in advance.
