Add support for Python 3.11 #9708

Merged · 15 commits · Dec 16, 2022
2 changes: 1 addition & 1 deletion .github/workflows/tests.yml
@@ -23,7 +23,7 @@ jobs:
fail-fast: false
matrix:
os: ["windows-latest", "ubuntu-latest", "macos-latest"]
python-version: ["3.8", "3.9", "3.10"]
python-version: ["3.8", "3.9", "3.10", "3.11"]
exclude:
- os: "macos-latest"
python-version: "3.8"
77 changes: 77 additions & 0 deletions continuous_integration/environment-3.11.yaml
@@ -0,0 +1,77 @@
# This job includes coverage
name: test-environment
channels:
- conda-forge
- nodefaults
dependencies:
# required dependencies
- python=3.11
- packaging
- numpy
- pandas
# test dependencies
- pre-commit
- pytest
- pytest-cov
- pytest-rerunfailures
- pytest-timeout
- pytest-xdist
- moto
- flask
- fastparquet>=0.8.0
- h5py
- pytables
# - zarr
# `tiledb-py=0.17.5` led to strange seg faults in CI; however, 0.18 is needed for 3.11.
# We should unpin when possible.
# https://github.com/dask/dask/pull/9569
- tiledb-py
# - pyspark
- tiledb>=2.5.0
- xarray
- fsspec
- sqlalchemy>=1.4.0
# - pyarrow  # needs 10+ for Python 3.11, but conda-forge only has v9 (installed via pip below)
- coverage
- jsonschema
# # other -- IO
- boto3
- botocore
# Temporary restriction until https://github.com/dask/distributed/issues/7173 is resolved
- bokeh
- httpretty
- aiohttp
# # Need recent version of s3fs to support newer aiobotocore versions
# # https://github.com/dask/s3fs/issues/514
- s3fs>=2021.8.0
- click
- cloudpickle
- crick
- cytoolz
- distributed
- ipython
- ipycytoscape
- lz4
# https://github.com/numba/numba/issues/8304
# - numba # not supported on 3.11
- partd
- psutil
- requests
- scikit-image
- scikit-learn
- scipy
- toolz
- python-snappy
# - sparse needs numba
- cachey
- python-graphviz
- python-xxhash
- mmh3
- jinja2
- pip
# The nightly pyarrow / arrow-cpp packages currently don't install with latest
# protobuf / abseil, see https://github.com/dask/dask/issues/9449
- libprotobuf=3.19
- pip:
- git+https://github.com/graingert/distributed@python-311
      - pyarrow  # conda-forge only has pyarrow 9.0; 10+ is needed for Python 3.11
3 changes: 3 additions & 0 deletions dask/dataframe/io/tests/test_parquet.py
@@ -661,6 +661,7 @@ def write_partition(df, i):
assert_eq(df, ddf2, check_index=False)


@PYARROW_MARK
@pytest.mark.xfail(
not PANDAS_GT_130,
reason=(
@@ -3005,6 +3006,7 @@ def test_chunksize_aggregate_files(tmpdir, write_engine, read_engine, aggregate_
assert_eq(df1[["c", "d"]], df2[["c", "d"]], check_index=False)


@PYARROW_MARK
@pytest.mark.parametrize("metadata", [True, False])
@pytest.mark.parametrize("chunksize", [None, 1024, 4096, "1MiB"])
def test_chunksize(tmpdir, chunksize, engine, metadata):
@@ -3998,6 +4000,7 @@ def test_metadata_task_size(tmpdir, engine, write_metadata_file, metadata_task_s
assert_eq(ddf2b, ddf2c)


@PYARROW_MARK
@pytest.mark.parametrize("partition_on", ("b", None))
def test_extra_file(tmpdir, engine, partition_on):
# Check that read_parquet can handle spark output
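
The three hunks above add `@PYARROW_MARK` to parquet tests so that they are skipped cleanly when pyarrow is not installed, which matters for the 3.11 environment since pyarrow 10+ is only available there via pip. The marker itself is defined elsewhere in the test module and is not part of this diff; the following is a minimal sketch of a skipif-style marker along these lines, with the try/except import pattern assumed rather than taken from the PR:

import pytest

try:
    import pyarrow as pa  # optional dependency; may be missing on Python 3.11 CI
except ImportError:
    pa = None

# Hypothetical definition: tests carrying this marker are skipped,
# rather than erroring, when pyarrow is unavailable.
PYARROW_MARK = pytest.mark.skipif(pa is None, reason="pyarrow not installed")


@PYARROW_MARK
def test_requires_pyarrow():
    # Toy usage example, not a test from the PR: only runs when pyarrow imports.
    table = pa.table({"x": [1, 2, 3]})
    assert table.num_rows == 3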