Implement batching methods on MpmcArrayQueue #211
Conversation
Pull Request Test Coverage Report for Build 440
It is addressing #180.
@@ -128,10 +128,13 @@ final boolean casConsumerIndex(long expect, long newValue)
 */
public class MpmcArrayQueue<E> extends MpmcArrayQueueL3Pad<E>
{
    public static final int MAX_LOOK_AHEAD_STEP = Integer.getInteger("jctools.mpmc.max.lookahead.step", 4096);
Should probably use PortableJvmInfo.RECOMENDED_OFFER_BATCH instead. On SPSC there's no reason not to claim large chunks for an uncontended producer, but for MPSC/MPMC the assumption should be some contention. Given that empty slots in the queue result in consumer 'bubbles' (queue not empty, but the slot at the consumer index is empty), we can potentially have an issue if a producer takes a while to fill up the slots it claims.
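To make the 'bubble' concrete, here is a minimal, hypothetical sketch (not the JCTools implementation): a producer advances the producer index with one CAS to claim several slots, but publishes the elements afterwards. In the window between those two steps, the indices say the queue is non-empty while the slot at the consumer index is still null, so a consumer must spin or back off.

```java
import java.util.concurrent.atomic.AtomicLong;
import java.util.concurrent.atomic.AtomicReferenceArray;

// Hypothetical illustration of the consumer 'bubble': claiming slots and
// publishing elements are two separate steps, and a consumer can observe
// the queue in between.
public class BubbleDemo {
    private final AtomicReferenceArray<Object> slots;
    private final AtomicLong producerIndex = new AtomicLong();
    private final AtomicLong consumerIndex = new AtomicLong();

    public BubbleDemo(int capacity) {
        this.slots = new AtomicReferenceArray<>(capacity);
    }

    /** Producer step 1: claim n slots with a single CAS; elements are NOT written yet. */
    public boolean claim(int n) {
        long p = producerIndex.get();
        return producerIndex.compareAndSet(p, p + n);
    }

    /** Producer step 2: publish an element into a previously claimed slot. */
    public void publish(int slot, Object e) {
        slots.lazySet(slot, e);
    }

    /** Index-based emptiness check, as a size()/isEmpty() stand-in. */
    public boolean looksNonEmpty() {
        return producerIndex.get() != consumerIndex.get();
    }

    /** What a consumer sees at its current index. */
    public Object peekAtConsumer() {
        return slots.get((int) consumerIndex.get());
    }
}
```

Between `claim` and `publish`, `looksNonEmpty()` returns true while `peekAtConsumer()` returns null: that is the bubble, and the longer the claimed batch takes to fill, the longer consumers stall on it.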
Strike that, this is just setting a max
LGTM
Looking forward to this. When is the next release scheduled?
Sorry for the slow response. Totally swamped ATM, will release around end of October.
@felixbarny I've implemented the batch methods assuming mostly single-threaded usage, to get the best out of them: on the first failed (contended) attempt to fill/drain in batches, they fall back to filling/draining one element at a time.
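The fallback idea can be sketched roughly as follows. This is a hypothetical, simplified illustration (no wrap-around, no consumer side), not the actual JCTools code: one CAS tries to claim the whole batch; if it fails under contention, the producer degrades to one CAS per element.

```java
import java.util.concurrent.atomic.AtomicLong;
import java.util.concurrent.atomic.AtomicReferenceArray;
import java.util.function.Supplier;

// Hypothetical sketch of the "batch first, one-by-one on contention" strategy.
// Not the JCTools implementation; wrap-around and the consumer are omitted.
public class BatchFillSketch<E> {
    private final AtomicReferenceArray<E> buffer;
    private final AtomicLong producerIndex = new AtomicLong();
    private final int capacity;

    public BatchFillSketch(int capacity) {
        this.capacity = capacity;
        this.buffer = new AtomicReferenceArray<>(capacity);
    }

    /** Fills up to limit elements from the supplier; returns the number written. */
    public int fill(Supplier<E> s, int limit) {
        long pIndex = producerIndex.get();
        int batch = (int) Math.min(limit, capacity - pIndex);
        if (batch <= 0) {
            return 0; // full
        }
        // Fast path: a single CAS claims the whole batch of slots.
        if (producerIndex.compareAndSet(pIndex, pIndex + batch)) {
            for (int i = 0; i < batch; i++) {
                buffer.lazySet((int) (pIndex + i), s.get());
            }
            return batch;
        }
        // Contended: fall back to one CAS per element.
        int filled = 0;
        while (filled < limit) {
            pIndex = producerIndex.get();
            if (pIndex >= capacity) {
                break; // full
            }
            if (producerIndex.compareAndSet(pIndex, pIndex + 1)) {
                buffer.lazySet((int) pIndex, s.get());
                filled++;
            } // else another producer won the slot; retry
        }
        return filled;
    }
}
```

In the uncontended (mostly single-threaded) case the fast path always wins, so a batch of n elements costs one CAS instead of n.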
I'm using an MPMC queue as the basis for an object pool. My scenario is that I have multiple consumers and, under normal circumstances, a single producer. In rare cases there can be multiple producers. I was hoping that both the producer and the consumers would benefit from batch operations: the producer could batch up multiple objects and add them to the object pool/queue in one go, and the consumers could take multiple objects and buffer them thread-locally. My expectation is a performance gain from reducing the number of CAS operations (one per batch vs. one per element).
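The consumer side of this scenario might look roughly like the sketch below. It is hypothetical: `java.util.concurrent.ArrayBlockingQueue` stands in for `MpmcArrayQueue`, with its `drainTo` playing the role of the PR's batch drain, and the class and method names (`PooledBorrower`, `borrow`, `releaseAll`) are invented for illustration.

```java
import java.util.ArrayDeque;
import java.util.concurrent.BlockingQueue;

// Hypothetical object-pool consumer: refills a thread-local buffer from the
// shared pool in one batch operation instead of one shared-queue hit per object.
public class PooledBorrower<E> {
    private final BlockingQueue<E> pool; // stand-in for MpmcArrayQueue
    private final int batchSize;
    private final ThreadLocal<ArrayDeque<E>> local =
        ThreadLocal.withInitial(ArrayDeque::new);

    public PooledBorrower(BlockingQueue<E> pool, int batchSize) {
        this.pool = pool;
        this.batchSize = batchSize;
    }

    /** Borrow one object; hits the shared pool only once per batch. */
    public E borrow() {
        ArrayDeque<E> buf = local.get();
        if (buf.isEmpty()) {
            pool.drainTo(buf, batchSize); // one shared-queue operation per batch
        }
        return buf.poll(); // null if the pool is exhausted
    }

    /** Return buffered objects to the shared pool. */
    public void releaseAll() {
        ArrayDeque<E> buf = local.get();
        E e;
        // With the PR's batch fill, this per-element loop could become one call.
        while ((e = buf.poll()) != null) {
            pool.offer(e);
        }
    }
}
```

With `batchSize` = 4, four borrows cost one `drainTo` on the shared queue; the remaining three come from the thread-local buffer with no shared-state traffic at all.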
I would say for sure that the producers, given the "mostly single-threaded" scenario, will benefit a lot from batch fill.
That's the rationale behind this PR indeed :)
It adds batch drain/fill methods for MpmcArrayQueue.