
Using the hessian matrix in optimization #2437

Open

andrewfowlie opened this issue Jan 10, 2024 · 2 comments
Labels
feat/enhancement (New feature or request), needs-triage (Needs a maintainer to categorize and assign)

Comments

@andrewfowlie

Summary

Some optimization algorithms can make use of the Hessian matrix; in scipy, the methods Newton-CG, dogleg, trust-ncg, trust-krylov, trust-exact, and trust-constr accept one. The scipy API looks like this:

scipy.optimize.minimize(fun, x0, args=(), method=None, jac=None, hess=None, hessp=None, bounds=None, constraints=(), tol=None, callback=None, options=None)

The Hessian is not passed through the options dict; it has its own keyword argument, hess.

It would be nice if pyhf passed the Hessian automatically, since it is available cheaply with autodiff. I have tried passing it manually, but I can't get it to work. The call to scipy's minimize looks like this:

return minimizer(
    func,
    x0,
    method=method,
    jac=do_grad,
    bounds=bounds,
    constraints=constraints,
    tol=tolerance,
    options=dict(maxiter=maxiter, disp=bool(verbose), **solver_options),
)

so you can't supply an extra hess keyword argument by passing extra solver_options, since those all end up inside the options dict.

@kratsg
Contributor

kratsg commented Jan 11, 2024

One thought at the moment: for now, you can subclass scipy_optimizer via something like this:

class scipy_optimizer_hess(pyhf.optimize.scipy_optimizer):
    def _get_minimizer(self, *args, hess=None, **kwargs):
        # Return a callable that forwards everything to scipy's minimize
        # but injects the extra `hess` keyword argument.
        return lambda *a, **k: scipy.optimize.minimize(*a, **k, hess=hess)

and then you can use it like so:

pyhf.set_backend(pyhf.tensorlib, scipy_optimizer_hess())

just as a quick way of getting it working for right now. This probably needs a bit more thought, as Minuit doesn't accept a user-supplied Hessian, so we'd need a way to handle that seamlessly.
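
The wrapping trick itself can be exercised without pyhf: the lambda above is equivalent to pre-binding hess with functools.partial, which gives a quick way to check that the injected keyword actually reaches scipy (a standalone sketch, not pyhf code):

```python
import functools

import numpy as np
from scipy.optimize import minimize, rosen, rosen_der, rosen_hess

# Pre-bind the Hessian, mimicking what the subclassed optimizer's
# lambda does: every call to `minimizer` now carries hess=rosen_hess.
minimizer = functools.partial(minimize, hess=rosen_hess)

result = minimizer(
    rosen,
    np.array([1.3, 0.7, 0.8, 1.9, 1.2]),
    method="trust-ncg",
    jac=rosen_der,
)
```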

@andrewfowlie
Author

Thanks, yes, that can work. It's trickier than I first thought, though, because of how it must interact with fixed_params.
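
One way the fixed_params interaction could bite (my reading, not stated in the thread): if the optimizer works in the reduced space of free parameters, a user-supplied full-parameter Hessian has the wrong shape and must be sliced down to the free indices first. A hypothetical helper (not a pyhf API) sketching that slicing:

```python
import numpy as np

def reduce_hessian(full_hess, fixed_idx):
    """Slice a full-parameter Hessian down to the free parameters.

    `full_hess` is an (n, n) array; `fixed_idx` lists the positions
    of the fixed parameters. (Illustrative helper, not a pyhf API.)
    """
    n = full_hess.shape[0]
    free = [i for i in range(n) if i not in set(fixed_idx)]
    # np.ix_ selects the cross-product of the free rows and columns.
    return full_hess[np.ix_(free, free)]

H = np.arange(9.0).reshape(3, 3)
print(reduce_hessian(H, fixed_idx=[1]))
# keeps rows/columns 0 and 2: [[0., 2.], [6., 8.]]
```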
