Skip to content
New issue

Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.

By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.

Already on GitHub? Sign in to your account

[ci] prefer CPython in Windows test environment and use safer approach for cleaning up network (fixes #5509) #5510

Merged
merged 33 commits into from
Oct 7, 2022
Merged
Show file tree
Hide file tree
Changes from all commits
Commits
Show all changes
33 commits
Select commit Hold shift + click to select a range
1660982
[ci] avoid updating dependencies when installing plotting dependencies
jameslamb Sep 27, 2022
704aea4
comment out CI
jameslamb Sep 27, 2022
b77d123
explicitly pin to cpython
jameslamb Sep 27, 2022
982f07e
Revert "comment out CI"
jameslamb Sep 27, 2022
7ff172d
use new cluster for each dask test
shiyu1994 Sep 30, 2022
f7ccfbe
Revert "use new cluster for each dask test"
jameslamb Oct 4, 2022
da2b173
run free_network()
jameslamb Oct 4, 2022
c37bb19
avoid calling free_network() during Dask teardown
jameslamb Oct 4, 2022
110ed59
try sleeping to allow all fit() processes to finish (just for debugging)
jameslamb Oct 4, 2022
d648a15
get more logs
jameslamb Oct 4, 2022
3ec3902
even more logs
jameslamb Oct 4, 2022
9c9b2c9
only run the Dask tests
jameslamb Oct 4, 2022
0cb46cb
more debugging
jameslamb Oct 5, 2022
713739a
shorter timeout
jameslamb Oct 5, 2022
3bb0b12
add free_network() back
jameslamb Oct 5, 2022
3cbcaee
put timeout in a variable
jameslamb Oct 5, 2022
01203ec
use a variable and skip one test that seems problematic
jameslamb Oct 5, 2022
57e0b3e
comment out another test
jameslamb Oct 6, 2022
3db4742
cannot access booster_ if model isnt fitted yet
jameslamb Oct 6, 2022
dfddc73
fix attribute access
jameslamb Oct 6, 2022
0fad5b9
revert time_out changes
jameslamb Oct 6, 2022
c8684ff
revert all timeout-related changes and some test.sh logging
jameslamb Oct 6, 2022
656a2b9
revert garbage collection
jameslamb Oct 6, 2022
73eba73
try a much shorter socket time_out for problematic test
jameslamb Oct 6, 2022
d4be593
try removing other skip
jameslamb Oct 6, 2022
3972fdb
check if the issue is in closing the cluster
jameslamb Oct 6, 2022
a52fde3
maybe print debugging will save us
jameslamb Oct 6, 2022
a10c917
maybe the cluster is left in a weird state by the tests that raise er…
jameslamb Oct 6, 2022
07a349b
time out faster
jameslamb Oct 6, 2022
e6a3319
maybe which client you use matters
jameslamb Oct 6, 2022
00d423e
remove remaining workarounds, uncomment CI, skip one problematic test
jameslamb Oct 6, 2022
2284729
revert a few more unnecessary changes
jameslamb Oct 6, 2022
82fe5ef
try removing timeout on test_machines_should_be_used_if_provided()
jameslamb Oct 7, 2022
File filter

Filter by extension

Filter by extension

Conversations
Failed to load comments.
Jump to
Jump to file
Failed to load files.
Diff view
Diff view
7 changes: 3 additions & 4 deletions .ci/test_windows.ps1
Original file line number Diff line number Diff line change
Expand Up @@ -30,7 +30,7 @@ conda init powershell
conda activate
conda config --set always_yes yes --set changeps1 no
conda update -q -y conda
conda create -q -y -n $env:CONDA_ENV python=$env:PYTHON_VERSION ; Check-Output $?
conda create -q -y -n $env:CONDA_ENV "python=$env:PYTHON_VERSION[build=*cpython]" ; Check-Output $?
if ($env:TASK -ne "bdist") {
conda activate $env:CONDA_ENV
}
Expand All @@ -50,9 +50,8 @@ if ($env:TASK -eq "swig") {
Exit 0
}

conda install -q -y -n $env:CONDA_ENV cloudpickle joblib numpy pandas psutil pytest scikit-learn scipy ; Check-Output $?
# matplotlib and python-graphviz have to be installed separately to prevent conda from downgrading to pypy
conda install -q -y -n $env:CONDA_ENV matplotlib python-graphviz ; Check-Output $?
# re-including python=version[build=*cpython] to ensure that conda doesn't fall back to pypy
conda install -q -y -n $env:CONDA_ENV cloudpickle joblib matplotlib numpy pandas psutil pytest "python=$env:PYTHON_VERSION[build=*cpython]" python-graphviz scikit-learn scipy ; Check-Output $?

if ($env:TASK -eq "regular") {
mkdir $env:BUILD_SOURCESDIRECTORY/build; cd $env:BUILD_SOURCESDIRECTORY/build
Expand Down
7 changes: 4 additions & 3 deletions python-package/lightgbm/dask.py
Original file line number Diff line number Diff line change
Expand Up @@ -17,7 +17,7 @@
import numpy as np
import scipy.sparse as ss

from .basic import _LIB, LightGBMError, _choose_param_value, _ConfigAliases, _log_info, _log_warning, _safe_call
from .basic import LightGBMError, _choose_param_value, _ConfigAliases, _log_info, _log_warning, _safe_call
from .compat import (DASK_INSTALLED, PANDAS_INSTALLED, SKLEARN_INSTALLED, Client, LGBMNotFittedError, concat,
dask_Array, dask_array_from_delayed, dask_bag_from_delayed, dask_DataFrame, dask_Series,
default_client, delayed, pd_DataFrame, pd_Series, wait)
Expand Down Expand Up @@ -302,8 +302,8 @@ def _train_part(
if eval_class_weight:
kwargs['eval_class_weight'] = [eval_class_weight[i] for i in eval_component_idx]

model = model_factory(**params)
try:
model = model_factory(**params)
if is_ranker:
model.fit(
data,
Expand Down Expand Up @@ -332,7 +332,8 @@ def _train_part(
)

finally:
_safe_call(_LIB.LGBM_NetworkFree())
if getattr(model, "fitted_", False):
model.booster_.free_network()

if n_evals:
# ensure that expected keys for evals_result_ and best_score_ exist regardless of padding.
Expand Down
1 change: 1 addition & 0 deletions tests/python_package_test/test_dask.py
Original file line number Diff line number Diff line change
Expand Up @@ -1504,6 +1504,7 @@ def f(part):
@pytest.mark.parametrize('task', tasks)
@pytest.mark.parametrize('output', data_output)
def test_training_succeeds_even_if_some_workers_do_not_have_any_data(task, output, cluster):
pytest.skip("skipping due to timeout issues discussed in https://github.com/microsoft/LightGBM/pull/5510")
if task == 'ranking' and output == 'scipy_csr_matrix':
pytest.skip('LGBMRanker is not currently tested on sparse matrices')

Expand Down