Async get is blocking #1849

Open
ldechamps opened this issue Jan 22, 2024 · 3 comments
Comments

ldechamps commented Jan 22, 2024

We have a problem with the way ioredis behaves. When retrieving a stored value of Buffer type via the get command, something appears to block the event loop, and the effect grows with the size of the value.
For 2 MB values, blocking times of around 3 ms can occur (smaller values block for less time, but they still block), which is worrying when Redis is used intensively.
Is this problem known? Is there a solution? What might the cause be?

```
[14:27:54.658] INFO (20861): 595 - get test
[14:27:54.661] INFO (20861): ping
[14:27:54.666] INFO (20861): 596 - get test
[14:27:54.668] INFO (20861): ping
[14:27:54.670] INFO (20861): 597 - get test
[14:27:54.671] INFO (20861): ping
[14:27:54.674] INFO (20861): 598 - get test
```

Note: the ping is set to a 1 ms interval.
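
For reference, a minimal sketch of the kind of test that produces the log above (assuming ioredis against a local Redis; the 2 MB payload and the iteration count are illustrative):

```typescript
import Redis from "ioredis";

const redis = new Redis();

async function main() {
  // Store an illustrative 2 MB Buffer under the key used in the log above.
  await redis.set("test", Buffer.alloc(2 * 1024 * 1024, "x"));

  // 1 ms heartbeat: gaps between "ping" lines reveal event-loop blocking.
  setInterval(() => console.log(new Date().toISOString(), "ping"), 1);

  for (let i = 0; i < 1000; i++) {
    console.log(new Date().toISOString(), `${i} - get test`);
    // getBuffer returns the raw Buffer instead of a decoded string.
    await redis.getBuffer("test");
  }
}

main().catch(console.error);
```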

Regards

@ldechamps (Author)

I think I've found a clue: when the reply is read from the stream, the DataHandler hands it to redis-parser. At the end of the read, the parser reassembles the entire payload, and that last operation may be what blocks:

concatBulkBuffer

Is there a workaround? Can ioredis stream the get response directly?
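
One idea we are looking at as a workaround, sketched below and not verified against the parser internals: fetch the value in smaller slices with STRLEN and GETRANGE so that each reply the parser has to assemble stays small. The chunk size is illustrative, and the final Buffer.concat still happens in one step:

```typescript
import Redis from "ioredis";

const redis = new Redis();

// Read a large string value in fixed-size slices instead of one big GET,
// so no single reply has to be reassembled into a multi-megabyte Buffer.
async function getInChunks(key: string, chunkSize = 256 * 1024): Promise<Buffer> {
  const total = await redis.strlen(key);
  const parts: Buffer[] = [];
  for (let offset = 0; offset < total; offset += chunkSize) {
    const end = Math.min(offset + chunkSize, total) - 1; // GETRANGE end is inclusive
    parts.push(await redis.getrangeBuffer(key, offset, end));
  }
  // The final concat is still a single synchronous copy of the whole value.
  return Buffer.concat(parts);
}
```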

@ldechamps (Author)

I'm bumping this thread.
We've made little progress on this problem. Despite optimizing the reassembly of the data stream, we still see blocking. We also realized that even with auto-pipelining enabled, retrieval is effectively sequential: if you request cache1, then cache2, and so on, the last value arrives much later, even though its command was sent at the same time as the others.
Is there a way, or a command, to retrieve them in parallel?
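
To illustrate what I mean by "sent at the same time", here is a sketch with illustrative keys cache1…cache3, either firing the gets concurrently so auto-pipelining batches them, or batching them ourselves with MGET:

```typescript
import Redis from "ioredis";

const redis = new Redis({ enableAutoPipelining: true });

async function main() {
  // Illustrative keys; each holds a multi-megabyte Buffer in our case.
  const keys = ["cache1", "cache2", "cache3"];

  // The gets are issued without awaiting in between, so auto-pipelining can
  // batch them into one write. The replies still come back on one socket and
  // are parsed one after another, so the last key resolves noticeably later.
  const buffers = await Promise.all(keys.map((k) => redis.getBuffer(k)));

  // Alternative: a single MGET, one round trip but one large reply to parse.
  const all = await redis.mgetBuffer(...keys);

  console.log(buffers.map((b) => b?.length), all.map((b) => b?.length));
}

main().catch(console.error);
```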

@jcyh0120

Facing this problem too. I made a simple test: an XREAD with BLOCK on a stream, then a console.log of the result. Everything has to wait until the XREAD returns, which blocks the whole process.
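
In case part of what you're seeing is the connection (rather than the event loop) being held up: a sketch, with an illustrative stream name, that keeps the blocking XREAD on its own connection so regular commands stay responsive:

```typescript
import Redis from "ioredis";

const main = new Redis();     // regular commands
const blocking = new Redis(); // dedicated connection for blocking reads

async function poll(stream: string) {
  let lastId = "$";
  for (;;) {
    // Wait up to 5 s for new entries without tying up the `main` connection.
    const res = await blocking.xread("BLOCK", 5000, "STREAMS", stream, lastId);
    if (!res) continue; // timed out, poll again
    for (const [, entries] of res) {
      for (const [id, fields] of entries) {
        console.log(id, fields);
        lastId = id;
      }
    }
  }
}

poll("mystream").catch(console.error);
```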
