
fix(zstd): default back to GOMAXPROCS concurrency #2404

Merged 1 commit into IBM:main on Dec 21, 2022
Conversation

bgreenlee (Contributor)

After upgrading Sarama from 1.27.0 to 1.37.2, we noticed a significant performance regression in one of our services. This service runs on a very large (96 CPU) instance, and normally processes about 120K messages/sec. After the upgrade, it could only do about half that.

We traced the issue to this change in klauspost/compress. Since Sarama uses the defaults when creating a new reader, concurrency is limited to 4.

By setting WithDecoderConcurrency(0), concurrency is set to GOMAXPROCS (which in our app is runtime.NumCPU() == 96). When we deployed our service with this change, the performance issue disappeared.

An upstream change in klauspost/compress#498 changed the default decoder
concurrency from GOMAXPROCS to a maximum of 4. Explicitly pass a value
of 0 via WithDecoderConcurrency so that GOMAXPROCS is used by default
again.
@bgreenlee (Contributor Author)

I signed the CLA. I don't see a way to rerun the job.

@dnwe (Collaborator) left a comment


Thanks, that's a good spot — I squashed the commits and have approved the status checks, will merge once they go through

@dnwe dnwe changed the title Fix zstd performance regression fix(zstd): default back to GOMAXPROCS concurrency Dec 21, 2022
@dnwe dnwe merged commit 779fb1f into IBM:main Dec 21, 2022
@dnwe dnwe added the fix label Dec 21, 2022