
[jvm-packages] model loading should be compatible with old models #7845

Merged

2 commits merged into dmlc:master on Apr 27, 2022

Conversation

@wbo4958 (Contributor) commented on Apr 26, 2022

XGBoost should delete parameters that are no longer used instead of keeping them; keeping them causes problems when loading old models.

For example, the kill_spark_context_on_worker_failure parameter has been removed, but a model saved with 1.6.0 still contains it, so XGBoost throws an exception when loading that model.
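
Below is a minimal Scala sketch of the idea behind the fix. The `CompatParamLoader` helper is hypothetical and is not the code changed in this PR; it only illustrates the intended behaviour: when restoring persisted parameters, apply only those the current version still declares and silently ignore the rest, so a 1.6.0 model carrying kill_spark_context_on_worker_failure loads cleanly.

```scala
import org.apache.spark.ml.param.Params

// Hypothetical helper for illustration; the PR itself changes the
// jvm-packages parameter reader, not this exact code.
object CompatParamLoader {
  /** Apply only the persisted (name -> value) pairs that `instance` still declares. */
  def setCompatibleParams(instance: Params, persisted: Map[String, Any]): Unit = {
    persisted.foreach { case (name, value) =>
      if (instance.hasParam(name)) {
        instance.set(instance.getParam(name), value)
      }
      // Parameters removed in newer versions (e.g.
      // kill_spark_context_on_worker_failure) are skipped instead of
      // triggering an exception during model loading.
    }
  }
}
```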

@wbo4958 (Contributor, Author) commented on Apr 26, 2022

@trivialfis, this PR should also be back-ported to 1.6.1.

Thx

@trivialfis trivialfis merged commit a94e1b1 into dmlc:master Apr 27, 2022
@wbo4958 wbo4958 deleted the model-compatible branch April 28, 2022 03:24
trivialfis pushed a commit to trivialfis/xgboost that referenced this pull request Apr 29, 2022
trivialfis added a commit that referenced this pull request Apr 29, 2022
* [jvm-packages] move the dmatrix building into rabit context (#7823)

This fixes the QuantileDeviceDMatrix in a distributed environment.

* [doc] update the jvm tutorial to 1.6.1 [skip ci] (#7834)

* [Breaking][jvm-packages] Use barrier execution mode (#7836)

With the introduction of the barrier execution mode, we no longer need to kill the SparkContext when some XGBoost tasks fail; Spark handles the errors for us. So in this PR, the `killSparkContextOnWorkerFailure` parameter is deleted (see the sketch after this list).

* [doc] remove the doc about killing SparkContext [skip ci] (#7840)

* [jvm-package] remove the coalesce in barrier mode (#7846)

* [jvm-packages] Fix model compatibility (#7845)

* Ignore all Java exceptions when looking for Linux musl support (#7844)

Co-authored-by: Bobby Wang <wbo4958@gmail.com>
Co-authored-by: Michael Allman <msa@allman.ms>
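
As an illustration of the barrier-mode change above, here is a hedged sketch using xgboost4j-spark's `Map`-based `XGBoostClassifier` constructor (the parameter values are made up): training code simply no longer passes kill_spark_context_on_worker_failure, and #7845 ensures the key is dropped if it shows up in an old saved model.

```scala
import ml.dmlc.xgboost4j.scala.spark.XGBoostClassifier

// Illustrative parameter map only. Before the barrier-mode change one could
// also pass "kill_spark_context_on_worker_failure"; with barrier execution
// mode that key no longer exists, so it is simply omitted here.
val xgbParams: Map[String, Any] = Map(
  "objective"   -> "binary:logistic",
  "num_round"   -> 100,
  "num_workers" -> 2
)

val classifier = new XGBoostClassifier(xgbParams)
```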