Change default parquet compression format from Snappy to LZ4 #10647

Open
wants to merge 1 commit into
base: main
Choose a base branch
from
10 changes: 5 additions & 5 deletions dask/dataframe/io/parquet/core.py
@@ -698,7 +698,7 @@ def to_parquet(
df,
path,
engine="auto",
-    compression="snappy",
+    compression="lz4",
write_index=True,
append=False,
overwrite=False,
@@ -729,10 +729,10 @@ def to_parquet(
engine : {'auto', 'pyarrow', 'fastparquet'}, default 'auto'
Parquet library to use. Defaults to 'auto', which uses ``pyarrow`` if
it is installed, and falls back to ``fastparquet`` otherwise.
-    compression : string or dict, default 'snappy'
-        Either a string like ``"snappy"`` or a dictionary mapping column names
-        to compressors like ``{"name": "gzip", "values": "snappy"}``. Defaults
-        to ``"snappy"``.
+    compression : string or dict, default 'lz4'
+        Either a string like ``"lz4"`` or a dictionary mapping column names
+        to compressors like ``{"name": "gzip", "values": "lz4"}``. Defaults
+        to ``"lz4"``.
write_index : boolean, default True
Whether or not to write the index. Defaults to True.
append : bool, default False
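The docstring touched by this diff says ``compression`` accepts either a single codec name or a dict mapping column names to codecs. A minimal sketch of those string-or-dict semantics, using a hypothetical ``resolve_compression`` helper (not Dask's actual implementation), under the assumption that columns missing from the dict fall back to the new ``"lz4"`` default:

```python
def resolve_compression(compression, columns, default="lz4"):
    """Resolve a string-or-dict ``compression`` argument to one codec per column.

    Hypothetical illustration of the documented semantics: a plain string
    applies to every column, while a dict assigns codecs per column and any
    unspecified column falls back to ``default``.
    """
    if isinstance(compression, dict):
        return {col: compression.get(col, default) for col in columns}
    return {col: compression for col in columns}


print(resolve_compression("lz4", ["name", "values"]))
# {'name': 'lz4', 'values': 'lz4'}
print(resolve_compression({"name": "gzip"}, ["name", "values"]))
# {'name': 'gzip', 'values': 'lz4'}
```

With the default changed, a bare ``to_parquet(df, path)`` call would behave like the first case above, compressing every column with LZ4 instead of Snappy.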