s2: Add Dictionary support. #685

Merged 21 commits into master on Feb 26, 2023
Conversation

klauspost
Copy link
Owner

@klauspost klauspost commented Oct 27, 2022

Compression Improvement

github_users_sample_set

From https://github.com/facebook/zstd/releases/tag/v1.1.3

With 64K dictionary trained with zstd:

9114 files, 7484607 bytes input:

Default: 3362023 (44.92%) -> 921524 (12.31%)
Better: 3083163 (41.19%) -> 873154 (11.67%)
Best: 3057944 (40.86%) -> 785503 (10.49%)

Go Sources

8912 files, 51253563 bytes input:

Default: 22955767 (44.79%) -> 19654568 (38.35%)
Better: 20189613 (39.39%) -> 16289357 (31.78%)
Best: 19482828 (38.01%) -> 15184589 (29.63%)

Status:

  • Format Specification
  • Reference Decoder
  • Encoders (default, better, best)
  • Roundtrip tests
  • Fuzz tests

Non-goals

There will be no assembly in the initial release, and some compression may still be left on the table.

There will be no Snappy implementation, since it will be incompatible anyway.

DOCUMENTATION

Note: S2 dictionary compression is currently at an early implementation stage, with no assembly for
either encoding or decoding. Performance improvements can be expected in the future.

Dictionary support allows providing a custom dictionary that will serve as a lookup at the beginning of blocks.

The same dictionary must be used for both encoding and decoding.
S2 does not keep track of whether the same dictionary is used,
and using the wrong dictionary will most often not result in an error when decompressing.

Blocks encoded without dictionaries can be decompressed seamlessly with a dictionary.
This means it is possible to switch from an encoding without dictionaries to an encoding with dictionaries
and treat the blocks similarly.

The usage scenarios for S2 dictionaries are similar to those for Zstandard dictionaries.

Training works if there is some correlation in a family of small data samples. The more data-specific a dictionary is, the more efficient it is (there is no universal dictionary). Hence, deploying one dictionary per type of data will provide the greatest benefits. Dictionary gains are mostly effective in the first few KB. Then, the compression algorithm will gradually use previously decoded content to better compress the rest of the file.

S2 further limits the dictionary to only be enabled on the first 64KB of a block.
This will remove any negative (speed) impacts of the dictionaries on bigger blocks.

Compression

Using the github_users_sample_set and a 64KB dictionary trained with Zstandard, the following sizes can be achieved.

| | Default | Better | Best |
|---|---|---|---|
| Without Dictionary | 3362023 (44.92%) | 3083163 (41.19%) | 3057944 (40.86%) |
| With Dictionary | 921524 (12.31%) | 873154 (11.67%) | 785503 (10.49%) |

So for highly repetitive content, this case provides more than a 3x reduction in compressed size.

For less uniform data we will use the Go source code tree.
Compressing First 64KB of all .go files in go/src, Go 1.19.5, 8912 files, 51253563 bytes input:

| | Default | Better | Best |
|---|---|---|---|
| Without Dictionary | 22955767 (44.79%) | 20189613 (39.39%) | 19482828 (38.01%) |
| With Dictionary | 19654568 (38.35%) | 16289357 (31.78%) | 15184589 (29.63%) |
| Saving/file | 362 bytes | 428 bytes | 472 bytes |

Creating Dictionaries

There are no tools to create dictionaries in S2.
However, there are multiple ways to create a useful dictionary:

Using a Sample File

If your input is very uniform, you can just use a sample file as the dictionary.

For example, in the github_users_sample_set above, the average compressed size only goes up from
10.49% to 11.48% when the first file is used as the dictionary instead of a dedicated trained dictionary.

    // Read a sample
    sample, err := os.ReadFile("sample.json")
    if err != nil {
        panic(err)
    }

    // Create a dictionary.
    dict := s2.MakeDict(sample, nil)

    // b := dict.Bytes() will provide a dictionary that can be saved
    // and reloaded with s2.NewDict(b).

    // To encode:
    encoded := dict.Encode(nil, file)

    // To decode:
    decoded, err := dict.Decode(nil, file)

Using Zstandard

Zstandard dictionaries can easily be converted to S2 dictionaries.

This can be helpful to generate dictionaries for files that don't have a fixed structure.

Example, with training set files placed in ./training-set:

λ zstd -r --train-fastcover training-set/* --maxdict=65536 -o name.dict

This creates a 64KB Zstandard dictionary, which can be converted to an S2 dictionary like this:

    // Decode the Zstandard dictionary.
    insp, err := zstd.InspectDictionary(zdict)
    if err != nil {
        panic(err)
    }

    // We are only interested in the contents.
    // Assume that files start with "// Copyright (c) 2023".
    // Search for the longest match for that.
    // This may save a few bytes.
    dict := s2.MakeDict(insp.Content(), []byte("// Copyright (c) 2023"))

    // b := dict.Bytes() will provide a dictionary that can be saved
    // and reloaded with s2.NewDict(b).

    // We can now encode using this dictionary:
    encodedWithDict := dict.Encode(nil, payload)

    // To decode content:
    decoded, err := dict.Decode(nil, encodedWithDict)

It is recommended to save the dictionary returned by b := dict.Bytes(), since that will contain only the S2 dictionary.

This dictionary can later be loaded using s2.NewDict(b). The dictionary then no longer requires zstd to be initialized.

Also note how s2.MakeDict allows you to search for a common starting sequence of your files.
This can be omitted, at the expense of a few bytes.

Dictionary Encoding

A dictionary serves as a lookup at the beginning of blocks.

It provides an initial repeat value that can be used to point to a common header.

Other than that, the dictionary contains values that can be used as back-references.

Often-used data should be placed at the end of the dictionary, since offsets below 2048 bytes encode smaller.
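As a rough illustration of why placement matters, the helper below estimates the encoded size of a copy operation in the Snappy-derived tag format that S2 builds on (shorter offsets with short lengths fit in a 2-byte tag). This is a simplified, hypothetical sketch; S2 adds repeat codes and other refinements, so the real encoder differs:

```go
package main

import "fmt"

// copyTagSize estimates the encoded size of a copy in the Snappy-style
// block format: offsets below 2048 with lengths 4-11 fit in a 2-byte tag,
// 16-bit offsets take 3 bytes, and larger offsets take 5 bytes.
// Simplified sketch only.
func copyTagSize(offset, length int) int {
	switch {
	case offset < 2048 && length >= 4 && length <= 11:
		return 2
	case offset < 65536:
		return 3
	default:
		return 5
	}
}

func main() {
	fmt.Println(copyTagSize(1000, 8))  // short offset, short length: 2 bytes
	fmt.Println(copyTagSize(60000, 8)) // 16-bit offset: 3 bytes
}
```

Since the dictionary content directly precedes the block, data at the end of the dictionary is reachable with the smallest offsets.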

Format

Dictionary content must be at least 16 bytes and at most 64KiB (65536 bytes).

Encoding: [repeat value (uvarint)][dictionary content...]

Before the dictionary content, an unsigned base-128 (uvarint) encoded value specifies the initial repeat offset.
This value is an offset into the dictionary content and not a back-reference offset,
so setting this to 0 will make the repeat value point to the first byte of the dictionary.

The value must be less than the dictionary length minus 8.
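The layout above can be sketched as a small serializer. This is illustrative only; makeDictBytes is a hypothetical helper, not part of the s2 API:

```go
package main

import (
	"encoding/binary"
	"fmt"
)

// makeDictBytes serializes a dictionary per the format described above:
// a uvarint repeat offset followed by the raw dictionary content.
func makeDictBytes(repeatOffset int, content []byte) ([]byte, error) {
	if len(content) < 16 || len(content) > 65536 {
		return nil, fmt.Errorf("content must be 16..65536 bytes, got %d", len(content))
	}
	if repeatOffset < 0 || repeatOffset >= len(content)-8 {
		return nil, fmt.Errorf("repeat offset %d must be less than len(content)-8", repeatOffset)
	}
	buf := make([]byte, binary.MaxVarintLen64, binary.MaxVarintLen64+len(content))
	n := binary.PutUvarint(buf, uint64(repeatOffset))
	return append(buf[:n], content...), nil
}

func main() {
	content := []byte("0123456789abcdefghij") // 20 bytes
	d, err := makeDictBytes(2, content)
	if err != nil {
		panic(err)
	}
	fmt.Println(len(d), d[0]) // 21 2: a 1-byte uvarint followed by 20 bytes of content
}
```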

Encoding

From the decoder point of view the dictionary content is seen as preceding the encoded content.

[dictionary content][decoded output]

Backreferences to the dictionary are encoded as ordinary backreferences that have an offset before the start of the decoded block.

Matches copying from the dictionary are not allowed to cross from the dictionary into the decoded data.
However, if a copy ends at the end of the dictionary the next repeat will point to the start of the decoded buffer, which is allowed.
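The decoder's view of the dictionary preceding the output can be sketched as follows. resolveCopy is a simplified, hypothetical helper (overlap handling within the decoded output is omitted for brevity):

```go
package main

import "fmt"

// resolveCopy resolves a copy of `length` bytes at back-reference `offset`,
// where the dictionary logically precedes the decoded output:
// [dictionary content][decoded output]
func resolveCopy(dict, decoded []byte, offset, length int) ([]byte, error) {
	pos := len(decoded)
	if offset <= pos {
		// Ordinary copy entirely within the decoded output.
		return decoded[pos-offset : pos-offset+length], nil
	}
	// The copy starts inside the dictionary.
	start := len(dict) - (offset - pos)
	if start < 0 {
		return nil, fmt.Errorf("offset %d reaches before the dictionary", offset)
	}
	if start+length > len(dict) {
		// Matches may not cross from the dictionary into decoded data.
		return nil, fmt.Errorf("copy crosses the dictionary boundary")
	}
	return dict[start : start+length], nil
}

func main() {
	dict := []byte("common header: ") // 15 bytes
	decoded := []byte("abc")          // 3 bytes already decoded
	// Offset 18 points 15 bytes before the block start: the dictionary start.
	m, err := resolveCopy(dict, decoded, 18, 6)
	if err != nil {
		panic(err)
	}
	fmt.Printf("%s\n", m) // common
}
```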

The first match can be a repeat value, which will use the repeat offset stored in the dictionary.

When 64KB (65536 bytes) has been encoded or decoded, it is no longer allowed to reference the dictionary,
neither by copy nor by repeat operations.
If the boundary is crossed while copying from the dictionary, the operation should complete,
but the next instruction is not allowed to reference the dictionary.

Valid blocks encoded without a dictionary can be decoded with any dictionary.
There are no checks of whether the supplied dictionary is the correct one for a block.
Because of this, there is no overhead to using a dictionary.

Streams

For streams each block can use the dictionary.

The dictionary is not provided on the stream.

@klauspost klauspost changed the title WIP: Add s2 dictionaries s2: Add Dictionary support. Feb 19, 2023
@klauspost klauspost marked this pull request as ready for review February 19, 2023 11:31
@klauspost klauspost merged commit 5a210a0 into master Feb 26, 2023
@klauspost klauspost deleted the s2-add-dictionaries branch March 10, 2023 20:00