Delay breaking changes to 1.6. (#7420)
The patch is too big to be backported.
trivialfis committed Nov 12, 2021
1 parent cb68560 commit 97d7582
Showing 3 changed files with 14 additions and 13 deletions.
3 changes: 2 additions & 1 deletion doc/tutorials/custom_metric_obj.rst
@@ -21,6 +21,7 @@ concepts should be readily applicable to other language bindings.
 .. note::
 
   * The ranking task does not support customized functions.
+  * Breaking change was made in XGBoost 1.6.
 
 In the following two sections, we will provide a step by step walk through of implementing
 ``Squared Log Error(SLE)`` objective function:
@@ -270,7 +271,7 @@ Scikit-Learn Interface
 
 
 The scikit-learn interface of XGBoost has some utilities to improve the integration with
-standard scikit-learn functions. For instance, after XGBoost 1.5.1 users can use the cost
+standard scikit-learn functions. For instance, after XGBoost 1.6.0 users can use the cost
 function (not scoring functions) from scikit-learn out of the box:
 
 .. code-block:: python
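The Python snippet under that ``.. code-block:: python`` directive is folded out of this diff. As a hedged sketch of the usage the tutorial refers to (the dataset and the choice of ``mean_absolute_error`` are illustrative assumptions, not part of this commit), passing a scikit-learn cost function directly as ``eval_metric`` could look like this:

    # Sketch only: a scikit-learn cost function (not a scorer) used as
    # eval_metric with the XGBoost scikit-learn interface (>= 1.6).
    import xgboost as xgb
    from sklearn.datasets import load_diabetes
    from sklearn.metrics import mean_absolute_error

    X, y = load_diabetes(return_X_y=True)
    reg = xgb.XGBRegressor(tree_method="hist", eval_metric=mean_absolute_error)
    reg.fit(X, y, eval_set=[(X, y)])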
16 changes: 8 additions & 8 deletions python-package/xgboost/sklearn.py
@@ -199,7 +199,7 @@ def inner(y_score: np.ndarray, dmatrix: DMatrix) -> Tuple[str, float]:
 eval_metric : Optional[Union[str, List[str], Callable]]
-    .. versionadded:: 1.5.1
+    .. versionadded:: 1.6.0
     Metric used for monitoring the training result and early stopping. It can be a
     string or list of strings as names of predefined metric in XGBoost (See
@@ -239,7 +239,7 @@ def inner(y_score: np.ndarray, dmatrix: DMatrix) -> Tuple[str, float]:
 early_stopping_rounds : Optional[int]
-    .. versionadded:: 1.5.1
+    .. versionadded:: 1.6.0
     Activates early stopping. Validation metric needs to improve at least once in
     every **early_stopping_rounds** round(s) to continue training. Requires at least
@@ -855,11 +855,11 @@ def fit(
     Validation metrics will help us track the performance of the model.
 eval_metric : str, list of str, or callable, optional
-    .. deprecated:: 1.5.1
+    .. deprecated:: 1.6.0
     Use `eval_metric` in :py:meth:`__init__` or :py:meth:`set_params` instead.
 early_stopping_rounds : int
-    .. deprecated:: 1.5.1
+    .. deprecated:: 1.6.0
     Use `early_stopping_rounds` in :py:meth:`__init__` or
     :py:meth:`set_params` instead.
 verbose :
@@ -881,7 +881,7 @@ def fit(
     `exact` tree methods.
 callbacks :
-    .. deprecated: 1.5.1
+    .. deprecated: 1.6.0
     Use `callbacks` in :py:meth:`__init__` or :py:methd:`set_params` instead.
 """
 evals_result: TrainingCallback.EvalsLog = {}
@@ -1693,11 +1693,11 @@ def fit(
     pair in **eval_set**.
 eval_metric : str, list of str, optional
-    .. deprecated:: 1.5.1
+    .. deprecated:: 1.6.0
     use `eval_metric` in :py:meth:`__init__` or :py:meth:`set_params` instead.
 early_stopping_rounds : int
-    .. deprecated:: 1.5.1
+    .. deprecated:: 1.6.0
     use `early_stopping_rounds` in :py:meth:`__init__` or
     :py:meth:`set_params` instead.
@@ -1727,7 +1727,7 @@ def fit(
     `exact` tree methods.
 callbacks :
-    .. deprecated: 1.5.1
+    .. deprecated: 1.6.0
     Use `callbacks` in :py:meth:`__init__` or :py:methd:`set_params` instead.
 """
 # check if group information is provided
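These docstring changes retarget the deprecation of the ``fit()`` arguments to 1.6.0. As a minimal sketch of the recommended replacement (data setup and parameter values are assumptions for illustration), ``eval_metric`` and ``early_stopping_rounds`` move to the estimator constructor or ``set_params``, while ``eval_set`` is still passed to ``fit()``:

    # Sketch only: 1.6-style configuration on the estimator instead of fit().
    import xgboost as xgb
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=1000, random_state=0)
    X_train, X_valid, y_train, y_valid = train_test_split(X, y, random_state=0)

    clf = xgb.XGBClassifier(
        n_estimators=200,
        eval_metric="logloss",
        early_stopping_rounds=10,
    )
    clf.fit(X_train, y_train, eval_set=[(X_valid, y_valid)])

The deprecated ``callbacks`` argument follows the same pattern and is likewise expected in the constructor or ``set_params``.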
8 changes: 4 additions & 4 deletions python-package/xgboost/training.py
@@ -80,7 +80,7 @@ def train(
     <https://xgboost.readthedocs.io/en/latest/tutorials/custom_metric_obj.html>`_ for
     details.
 feval :
-    .. deprecated:: 1.5.1
+    .. deprecated:: 1.6.0
     Use `custom_metric` instead.
 maximize : bool
     Whether to maximize feval.
@@ -132,7 +132,7 @@ def train(
 custom_metric:
-    .. versionadded 1.5.1
+    .. versionadded 1.6.0
     Custom metric function. See `Custom Metric
     <https://xgboost.readthedocs.io/en/latest/tutorials/custom_metric_obj.html>`_ for
@@ -392,7 +392,7 @@ def cv(params, dtrain, num_boost_round=10, nfold=3, stratified=False, folds=None
     details.
 feval : function
-    .. deprecated:: 1.5.1
+    .. deprecated:: 1.6.0
     Use `custom_metric` instead.
 maximize : bool
     Whether to maximize feval.
@@ -432,7 +432,7 @@ def cv(params, dtrain, num_boost_round=10, nfold=3, stratified=False, folds=None
     Shuffle data before creating folds.
 custom_metric :
-    .. versionadded 1.5.1
+    .. versionadded 1.6.0
     Custom metric function. See `Custom Metric
     <https://xgboost.readthedocs.io/en/latest/tutorials/custom_metric_obj.html>`_ for
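For the same reason, ``feval`` in ``train`` and ``cv`` now carries a 1.6.0 deprecation pointing at ``custom_metric``. A rough sketch of the newer argument (the random data and the RMSLE-style metric are illustrative assumptions, not part of this commit):

    # Sketch only: passing custom_metric to xgboost.train instead of the deprecated feval.
    import numpy as np
    import xgboost as xgb

    def rmsle(predt: np.ndarray, dtrain: xgb.DMatrix):
        # Return (metric_name, value); predictions are clipped so log1p stays defined.
        y = dtrain.get_label()
        err = np.log1p(np.clip(predt, 0, None)) - np.log1p(y)
        return "RMSLE", float(np.sqrt(np.mean(np.square(err))))

    rng = np.random.default_rng(0)
    dtrain = xgb.DMatrix(rng.random((100, 5)), label=rng.random(100))
    booster = xgb.train(
        {"objective": "reg:squarederror"},
        dtrain,
        num_boost_round=10,
        evals=[(dtrain, "train")],
        custom_metric=rmsle,
    )

``cv`` accepts the same ``custom_metric`` argument in place of ``feval``.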
