
Fuse split evaluation kernels #8026

Merged: 18 commits merged into dmlc:master on Jul 5, 2022

Conversation

**RAMitchell** (Member) commented:
Benchmarking to follow.

**RAMitchell** (Member, Author) commented:
Depth 8

| dataset | master | fuse |
| --- | --- | --- |
| airline | 91.41312011 | 90.88661192 |
| bosch | 16.91872779 | 12.88504644 |
| covtype | 24.87996562 | 18.01187677 |
| epsilon | 71.07525846 | 46.48386218 |
| fraud | 1.357841348 | 1.315704659 |
| higgs | 19.10689613 | 17.19260674 |
| year | 9.428669604 | 7.047273015 |

Depth 20

| dataset | master | fuse |
| --- | --- | --- |
| airline | 3828.160733 | 2379.316679 |
| bosch | 86.84837265 | 38.56913418 |
| covtype | 149.1225165 | 83.71305959 |
| epsilon | 971.4417802 | 441.8951659 |
| fraud | 1.617356188 | 1.512526119 |
| higgs | 2251.212349 | 1477.053153 |
| year | 2124.020786 | 1562.211386 |

Depth 8 lossguide

| dataset | master | fuse |
| --- | --- | --- |
| airline | 99.3539845 | 95.5830758 |
| bosch | 17.84664539 | 14.81773844 |
| covtype | 28.07889959 | 37.00055545 |
| epsilon | 70.23383995 | 53.28981574 |
| fraud | 1.357731754 | 2.032787981 |
| higgs | 20.90421377 | 24.37834173 |
| year | 10.81165188 | 13.12019443 |
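The Depth 20 numbers above can be summarized as per-dataset speedups (master time divided by fuse time). A minimal sketch, assuming the figures are wall-clock times for the same workload:

```python
# Speedup of the fused kernel over master, computed from the
# Depth 20 table above (assumed to be run times; lower is better).
master = {"airline": 3828.160733, "bosch": 86.84837265,
          "covtype": 149.1225165, "epsilon": 971.4417802,
          "fraud": 1.617356188, "higgs": 2251.212349,
          "year": 2124.020786}
fuse = {"airline": 2379.316679, "bosch": 38.56913418,
        "covtype": 83.71305959, "epsilon": 441.8951659,
        "fraud": 1.512526119, "higgs": 1477.053153,
        "year": 1562.211386}

for name in master:
    print(f"{name}: {master[name] / fuse[name]:.2f}x")  # e.g. bosch: ~2.25x
```

Every Depth 20 dataset improves, with the largest gains on bosch and epsilon; the Depth 8 lossguide table shows regressions on some datasets.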

@RAMitchell RAMitchell marked this pull request as ready for review July 4, 2022 09:58
**RAMitchell** (Member, Author) commented:
One of the Dask early stopping tests started to fail, but this is only due to slight changes in split evaluation: the validation score continues to decrease beyond the maximum number of iterations.

This PR has resulted in very small changes to the gbm-bench training loss results, some better, some worse. I attribute this to floating point ordering; in particular, changing the block size for split evaluation from 256 to 32 has an impact.
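The floating point effect described above comes from addition not being associative in finite precision: regrouping the terms of a reduction (which is what a different block size does) changes the rounding. A minimal Python/NumPy illustration of the principle, not xgboost's actual kernel code:

```python
import numpy as np

# float32 addition is not associative: the same three terms summed in a
# different order round differently. A reduction with a different block
# size groups terms differently, so tiny result changes are expected.
a = np.float32(1e8)   # exactly representable; float32 spacing here is 8.0
b = np.float32(1.0)

left = (a + b) - a    # b is absorbed by the large term: rounds to 0.0
right = (a - a) + b   # cancellation happens first: exactly 1.0

print(left, right)    # 0.0 1.0
```

The same mechanism explains why training losses shift slightly, in either direction, without any change in algorithmic correctness.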

**trivialfis** (Member) left a comment:


Looks good to me!

@RAMitchell RAMitchell merged commit 794cbaa into dmlc:master Jul 5, 2022