
Fix set of hyperparameters when optimizing #632

Open
mo-fu opened this issue Oct 26, 2022 · 1 comment

@mo-fu (Contributor) commented Oct 26, 2022

When optimizing hyperparameters I may want to fix a specific parameter, e.g. whether to train the bonsai or parabel variant of omikuji.

This could possibly be configured by not optimizing hyperparameters that are already defined in the project.
I had a look at Optuna a while back; it doesn't allow this by default. The option I see is to wrap the trial in a proxy that handles this, as in the sketch below.
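A minimal sketch of the proxy idea, assuming plain Optuna. The parameter names and the train_and_evaluate helper are made up for illustration, not Annif's actual hyperopt code:

```python
import optuna


class FixedParamTrial:
    """Proxy around an optuna.Trial that returns preset values for
    selected parameters instead of letting the sampler suggest them."""

    def __init__(self, trial, fixed_params):
        self._trial = trial
        self._fixed = fixed_params

    def suggest_categorical(self, name, choices):
        if name in self._fixed:
            return self._fixed[name]
        return self._trial.suggest_categorical(name, choices)

    def suggest_int(self, name, low, high, **kwargs):
        if name in self._fixed:
            return self._fixed[name]
        return self._trial.suggest_int(name, low, high, **kwargs)

    def suggest_float(self, name, low, high, **kwargs):
        if name in self._fixed:
            return self._fixed[name]
        return self._trial.suggest_float(name, low, high, **kwargs)


def objective(trial):
    # Fix the omikuji variant while the remaining parameters are
    # optimized normally (names here are illustrative).
    trial = FixedParamTrial(trial, {"variant": "bonsai"})
    variant = trial.suggest_categorical("variant", ["parabel", "bonsai"])
    k = trial.suggest_int("cluster_k", 2, 8)
    return train_and_evaluate(variant, k)  # placeholder for the real objective


study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=50)
```

One caveat with this approach: because the fixed values never pass through the sampler, they won't show up in study.best_params, so the caller has to merge them back in when reporting results.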

@osma (Member) commented Jan 30, 2023

This makes sense, and I've had the same thought for example about fastText (where hyperopt isn't supported yet) - sometimes it would make sense to fix e.g. loss=hs when performing hyperparameter optimization, because the other options would be way too slow.

However, I'm not sure if "not optimizing hyperparameters defined in the project" is the best way. How about instead being able to specify the fixed parameters on the command line? This could be done using the --backend-param option, which is already implemented for e.g. the suggest command (a rough sketch of the idea follows below).
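A rough sketch of how such command-line values could feed into the proxy from the previous comment. The hyperopt integration shown here is an assumption, not existing Annif code, and a real implementation would also need to coerce the string values to the right types:

```python
def parse_backend_params(pairs):
    """Parse KEY=VALUE strings, as passed via repeated --backend-param
    options on the command line, into a dict of fixed hyperparameters."""
    fixed = {}
    for pair in pairs:
        key, _, value = pair.partition("=")
        fixed[key] = value  # values stay strings; type coercion omitted
    return fixed


# Hypothetical invocation (not an existing command-line form):
#   annif hyperopt my-project --backend-param variant=bonsai
fixed = parse_backend_params(["variant=bonsai"])
# ... then pass `fixed` to something like FixedParamTrial above,
# so the sampler never varies those parameters.
```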
