Replies: 1 comment
Now tracked in #24786
Preliminaries

I'm a bit confused by `GaussianProcess{Regressor,Classifier}`'s internal hyperparameter representation and the nature of the `theta` argument of the `log_marginal_likelihood()` method. The docstrings of these methods describe `theta` as the kernel hyperparameters at which the log-marginal likelihood is evaluated. However, the GP's internal hyperparameter-optimizer code path seems to work with `log(theta)`. The value of the `kernel_.theta` attribute is actually `log(theta)`: the `theta` getter returns `log(theta)`, so the hyperparameters are not stored in log format but only returned that way. The getter's docs do say that it returns the log-transformed hyperparameters. OK.
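The getter/setter behavior can be checked directly on a bare kernel object, without fitting anything (a minimal sketch; assumes scikit-learn and NumPy are installed):

```python
import numpy as np
from sklearn.gaussian_process.kernels import RBF

# The kernel stores its hyperparameters natively...
k = RBF(length_scale=2.0)

# ...but the theta getter returns their logs.
assert np.allclose(k.theta, np.log([2.0]))

# The theta setter expects log-scale values and exponentiates them
# back into the native hyperparameter attributes.
k.theta = np.log([3.0])
assert np.isclose(k.length_scale, 3.0)
```

So `theta` is a log-scale *view* of the hyperparameters on both read and write, while `length_scale` itself stays on the natural scale.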
Question

When calling `log_marginal_likelihood(theta)`, the code sets `kernel.theta = theta`, which invokes the `theta` setter, which essentially applies `exp(theta)`. From that I'd assume we actually need to call `log_marginal_likelihood(log(theta))`. Indeed, continuing with the code above shows that we need to pass `log(theta)` to `log_marginal_likelihood()`. Also, when plotting `log_marginal_likelihood` on a grid of hyperparameters, I only get what look like correct results if I pass in `log(theta)`. If that is true, should the documentation of the `log_marginal_likelihood()` methods be adapted accordingly? Thanks.
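For reference, the observation can be reproduced end-to-end on a fitted regressor. This is a hedged sketch: the toy sine data are made up purely for illustration, only the API calls matter.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Toy data (hypothetical, just to have something to fit).
rng = np.random.RandomState(0)
X = rng.uniform(0, 5, (20, 1))
y = np.sin(X).ravel()

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0)).fit(X, y)

# kernel_.theta holds the log-transformed optimized hyperparameters.
assert np.allclose(gp.kernel_.theta, np.log([gp.kernel_.length_scale]))

# Passing kernel_.theta (already log-scale) to log_marginal_likelihood()
# reproduces the precomputed optimum, confirming the method expects
# log(theta), not theta itself.
lml = gp.log_marginal_likelihood(gp.kernel_.theta)
assert np.isclose(lml, gp.log_marginal_likelihood_value_)
```

If the method instead expected natural-scale hyperparameters, passing `gp.kernel_.theta` here would *not* recover `log_marginal_likelihood_value_`.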