Replies: 1 comment
-
Could you check whether you really get 0.0, or NaN instead? A loss of zero would mean perfect classification, which apparently is not the case. So I would suspect invalid gradients or values at the iteration where you observe the drop.
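One way to run that check is to scan the logged loss values for NaN before trusting a sudden "0.0". This is a minimal sketch; the `losses` list here is illustrative, not real training output:

```python
# Sketch: verify that a suspicious "0.0" in a loss log is not NaN.
# NaN propagates silently through training, so test for it explicitly.
import math

losses = [0.693, 0.41, float("nan"), 0.0]  # example values only

for i, loss in enumerate(losses):
    if math.isnan(loss):
        print(f"iteration {i}: loss is NaN, not a real zero")
```

Note that `NaN == NaN` is false in Python, so `math.isnan` (or `numpy.isnan` for arrays) is the reliable test.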
-
Hi,
Does anyone know why the model's loss and accuracy suddenly vanish?
I implemented 5-fold cross-validation and trained HistGradientBoostingClassifier models for 300 iterations.
Build model:
Visualizing the loss and accuracy over the 300 iterations:
This is the result:
You can notice that the train, test, and validation loss/accuracy curves all drop sharply after a certain iteration.
I would very much appreciate it if you shared your thoughts. Thank you.