Hi Puma community. I'm trying to understand parallelism vs. concurrency in Puma, but I'm pretty confused about it. Here's what I found:
We have two interfaces to Puma that we can tweak: `--workers` (aka processes) and `--threads`.

Throughout the docs there are warnings about MRI because of the GVL, so let me be clear: I'm on MRI with Rails, and I'm using "traditional" blocking service calls within my Rails app (things like RestClient, Savon, open3, etc.) -- no 'async' calls.
### one worker, MRI
For the following two use cases, let's say we have workers: 1, threads: 5 (default) on MRI Ruby.
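For reference, that setup would look something like this in `config/puma.rb` (a sketch; the exact default thread counts vary by Puma version):

```ruby
# config/puma.rb -- the setup discussed: one process, up to five threads.
workers 1      # forked worker processes (true parallelism, even on MRI)
threads 0, 5   # min, max threads per worker (concurrency under the GVL)
```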
Now, Puma, AFAICT, sets up the socket using non-blocking nio4r IO wrapped by a Reactor. This means that separate requests to Puma could, in fact, be processed in separate threads... however, on MRI those threads hit the GVL, so with workers=1 they are not parallel, merely concurrent? But maybe that's OK... maybe some stuff gets through while threads are blocked on IO.
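The "maybe some stuff gets through" intuition can be checked directly: MRI holds the GVL during CPU-bound work, but releases it while a thread is blocked. A minimal sketch, using `sleep` as a stand-in for blocking IO:

```ruby
require "benchmark"

# CPU-bound work holds the GVL: three threads give no speedup on MRI.
cpu = Benchmark.realtime do
  3.times.map { Thread.new { 2_000_000.times { |i| i * i } } }.each(&:join)
end

# Blocked "IO" (simulated with sleep) releases the GVL: threads overlap.
io = Benchmark.realtime do
  3.times.map { Thread.new { sleep 0.2 } }.each(&:join)
end

puts format("cpu-bound: %.2fs", cpu)
puts format("io-bound:  %.2fs", io)  # ~0.2s, not 0.6s: concurrent despite the GVL
```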
### within a Rails controller action?
Let's say I get a single request to my controller action. That's one thread from Puma's pool running my Rails code (the Reactor hands the buffered request off to the thread pool).
My action calls 3 different REST services (A, B, C) sequentially to get answers. I use 3 simple rest-client `get()` lines and process each result after the request completes before moving on to the next one; those requests are blocking. I'm pretty sure that no concurrency happens with only a single request and a single thread in play, right?

But if I wanted to, maybe I could use a library like Typhoeus or even nio4r and create REST callbacks... now maybe I get concurrency even with the GVL in place. Maybe there are ways of doing this with RestClient too, but it wouldn't happen automatically, right?
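The sequential pattern vs. a hand-rolled threaded version can be sketched without any HTTP library at all; here a sleep-based lambda stands in for the blocking `RestClient.get` call, and the service names A/B/C are placeholders:

```ruby
require "benchmark"

# Stub standing in for a blocking RestClient.get(url); real URLs omitted.
fetch = ->(service) { sleep 0.2; "response from #{service}" }

# Sequential, as in the action today: each call blocks before the next starts.
sequential = Benchmark.realtime do
  %w[A B C].each { |s| fetch.call(s) }
end

# Overlapped: each blocked call releases the GVL, so the waits stack up
# side by side instead of end to end. Thread#value joins and returns the result.
responses = nil
overlapped = Benchmark.realtime do
  threads = %w[A B C].map { |s| Thread.new(s) { |svc| fetch.call(svc) } }
  responses = threads.map(&:value)
end

puts format("sequential: %.2fs", sequential)  # ~0.6s
puts format("overlapped: %.2fs", overlapped)  # ~0.2s
puts responses.inspect
```

So even within a single request, explicit threads (or a library like Typhoeus that manages the concurrency for you) can overlap the IO waits; plain sequential RestClient calls won't do it automatically.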
### handling separate requests?
Let's say I get 3 separate requests at the same time to my Rails controller action. Puma sees them and each request is handled by a different thread from the thread pool (in this case the pool is 5, and we use 1 thread per request, so 3 threads are busy at the same time... but because they are in 1 process we are talking concurrency, not parallelism). Because each thread runs its own Rails controller instance, each request has its own separate state, so these should be concurrent requests?
Is it true that my first thread might call service A and then block on IO, the second thread calls A and blocks on IO, then the third thread calls A and blocks? If so, then the threads have given me some concurrency in spite of the GVL, because the separate requests can call A, A, A without waiting for the first A to complete, or even for the whole first request to finish.