IO::EINPROGRESSWaitWritable with Redis 6 #972

Open
beanieboi opened this issue Jan 15, 2021 · 17 comments

@beanieboi

Hi,

We recently upgraded to Redis 6 and started using the built-in TLS. Since the upgrade we see frequent IO::EINPROGRESSWaitWritable: Operation now in progress - connect(2) would block errors.

versions:
ruby: 2.7.2
redis-rb: 4.2.5
redis: 6.0.8

We use Redis in combination with Sidekiq.
What we found out so far:

  • Sidekiq is using the default configuration with the ConnectionPool gem.
  • Sidekiq gets a connection from the pool
  • redis-rb runs ensure_connected:

    if connected?
      unless inherit_socket? || Process.pid == @pid
        raise InheritedError,
          "Tried to use a connection from a child process without reconnecting. " \
          "You need to reconnect to Redis after forking " \
          "or set :inherit_socket to true."
      end
    else
      connect
    end

  • the connection from the pool is dead, redis-rb tries to reconnect and fails.

This is happening in bursts every 5 minutes, which is also the timeout setting on our Redis.
Setting the Redis timeout to 0 (disabled) "fixes" this problem.

What I think is happening: the connection in the pool was timed out by Redis, and when we try to use it, redis-rb fails to reconnect.

This only started happening after upgrading from Redis 4 to 6. Our old Redis 4 had the same 300s timeout. I believe it might be related to TLS.
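To make the theory concrete, here is a minimal sketch of the scenario we suspect (the URL, the 300s value, and the sleep are placeholders for our setup, so treat it as a hypothesis rather than a confirmed reproduction):

require "redis"

# Hypothetical reproduction sketch: establish a connection, let it idle past the
# server-side `timeout` (300s in our config), then reuse it so redis-rb has to
# reconnect through connect_nonblock.
redis = Redis.new(url: ENV.fetch("REDIS_URL", "redis://localhost:6379"), driver: :ruby)

redis.ping                                  # establish the connection
puts redis.config(:get, "timeout").inspect  # check the server-side idle timeout

sleep 310                                   # outlive the 300s idle timeout; the server closes the socket

begin
  redis.ping                                # forces ensure_connected -> connect -> connect_nonblock
rescue Redis::BaseConnectionError, IO::WaitWritable => e
  puts "reconnect failed: #{e.class}: #{e.message}"
end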

stack trace:

File /app/vendor/ruby-2.7.2/lib/ruby/2.7.0/socket.rb line 1214 in __connect_nonblock
File /app/vendor/ruby-2.7.2/lib/ruby/2.7.0/socket.rb line 1214 in connect_nonblock
File /app/vendor/bundle/ruby/2.7.0/gems/redis-4.2.5/lib/redis/connection/ruby.rb line 154 in connect_addrinfo
File /app/vendor/bundle/ruby/2.7.0/gems/redis-4.2.5/lib/redis/connection/ruby.rb line 192 in block in connect
File /app/vendor/bundle/ruby/2.7.0/gems/redis-4.2.5/lib/redis/connection/ruby.rb line 190 in each
File /app/vendor/bundle/ruby/2.7.0/gems/redis-4.2.5/lib/redis/connection/ruby.rb line 190 in each_with_index
File /app/vendor/bundle/ruby/2.7.0/gems/redis-4.2.5/lib/redis/connection/ruby.rb line 190 in connect
File /app/vendor/bundle/ruby/2.7.0/gems/redis-4.2.5/lib/redis/connection/ruby.rb line 243 in connect
File /app/vendor/bundle/ruby/2.7.0/gems/redis-4.2.5/lib/redis/connection/ruby.rb line 302 in connect
File /app/vendor/bundle/ruby/2.7.0/gems/redis-4.2.5/lib/redis/client.rb line 354 in establish_connection
File /app/vendor/bundle/ruby/2.7.0/gems/redis-4.2.5/lib/redis/client.rb line 112 in block in connect
File /app/vendor/bundle/ruby/2.7.0/gems/redis-4.2.5/lib/redis/client.rb line 313 in with_reconnect
File /app/vendor/bundle/ruby/2.7.0/gems/redis-4.2.5/lib/redis/client.rb line 111 in connect
File /app/vendor/bundle/ruby/2.7.0/gems/redis-4.2.5/lib/redis/client.rb line 386 in ensure_connected
File /app/vendor/bundle/ruby/2.7.0/gems/redis-4.2.5/lib/redis/client.rb line 238 in block in process
File /app/vendor/bundle/ruby/2.7.0/gems/redis-4.2.5/lib/redis/client.rb line 325 in logging
File /app/vendor/bundle/ruby/2.7.0/gems/redis-4.2.5/lib/redis/client.rb line 237 in process
File /app/vendor/bundle/ruby/2.7.0/gems/redis-4.2.5/lib/redis/client.rb line 131 in call
File /app/vendor/bundle/ruby/2.7.0/gems/redis-4.2.5/lib/redis/client.rb line 226 in block in call_with_timeout
File /app/vendor/bundle/ruby/2.7.0/gems/redis-4.2.5/lib/redis/client.rb line 300 in with_socket_timeout
File /app/vendor/bundle/ruby/2.7.0/gems/redis-4.2.5/lib/redis/client.rb line 225 in call_with_timeout
File /app/vendor/bundle/ruby/2.7.0/gems/redis-4.2.5/lib/redis.rb line 1224 in block in _bpop
File /app/vendor/bundle/ruby/2.7.0/gems/redis-4.2.5/lib/redis.rb line 69 in block in synchronize
File /app/vendor/ruby-2.7.2/lib/ruby/2.7.0/monitor.rb line 202 in synchronize
File /app/vendor/ruby-2.7.2/lib/ruby/2.7.0/monitor.rb line 202 in mon_synchronize
File /app/vendor/bundle/ruby/2.7.0/gems/redis-4.2.5/lib/redis.rb line 69 in synchronize
File /app/vendor/bundle/ruby/2.7.0/gems/redis-4.2.5/lib/redis.rb line 1221 in _bpop
File /app/vendor/bundle/ruby/2.7.0/gems/redis-4.2.5/lib/redis.rb line 1266 in brpop
File /app/vendor/bundle/ruby/2.7.0/gems/sidekiq-6.1.2/lib/sidekiq/fetch.rb line 39 in block in retrieve_work
File /app/vendor/bundle/ruby/2.7.0/gems/sidekiq-6.1.2/lib/sidekiq.rb line 98 in block in redis
File /app/vendor/bundle/ruby/2.7.0/gems/connection_pool-2.2.3/lib/connection_pool.rb line 63 in block (2 levels) in with
File /app/vendor/bundle/ruby/2.7.0/gems/connection_pool-2.2.3/lib/connection_pool.rb line 62 in handle_interrupt
File /app/vendor/bundle/ruby/2.7.0/gems/connection_pool-2.2.3/lib/connection_pool.rb line 62 in block in with
File /app/vendor/bundle/ruby/2.7.0/gems/connection_pool-2.2.3/lib/connection_pool.rb line 59 in handle_interrupt
File /app/vendor/bundle/ruby/2.7.0/gems/connection_pool-2.2.3/lib/connection_pool.rb line 59 in with
File /app/vendor/bundle/ruby/2.7.0/gems/sidekiq-6.1.2/lib/sidekiq.rb line 95 in redis
File /app/vendor/bundle/ruby/2.7.0/gems/sidekiq-6.1.2/lib/sidekiq/fetch.rb line 39 in retrieve_work
File /app/vendor/bundle/ruby/2.7.0/gems/sidekiq-6.1.2/lib/sidekiq/processor.rb line 83 in get_one
File /app/vendor/bundle/ruby/2.7.0/gems/sidekiq-6.1.2/lib/sidekiq/processor.rb line 95 in fetch
File /app/vendor/bundle/ruby/2.7.0/gems/sidekiq-6.1.2/lib/sidekiq/processor.rb line 77 in process_one
File /app/vendor/bundle/ruby/2.7.0/gems/sidekiq-6.1.2/lib/sidekiq/processor.rb line 68 in run
File /app/vendor/bundle/ruby/2.7.0/gems/sidekiq-6.1.2/lib/sidekiq/util.rb line 15 in watchdog
File /app/vendor/bundle/ruby/2.7.0/gems/sidekiq-6.1.2/lib/sidekiq/util.rb line 24 in block in safe_thread

So far I have failed to reproduce this issue in a new, isolated project.

Let me know if there is more information I can provide.

@beanieboi
Author

@supercaracal We do not use threaded IO, it is disabled.

@byroot
Collaborator

byroot commented Jan 16, 2021

Looks like our code might be incorrect:

def self.connect_addrinfo(addrinfo, port, timeout)
  sock = new(::Socket.const_get(addrinfo[0]), Socket::SOCK_STREAM, 0)
  sockaddr = ::Socket.pack_sockaddr_in(port, addrinfo[3])

  begin
    sock.connect_nonblock(sockaddr)
  rescue Errno::EINPROGRESS
    raise TimeoutError unless sock.wait_writable(timeout)

    begin
      sock.connect_nonblock(sockaddr)
    rescue Errno::EISCONN
    end
  end

  sock
end

The documentation for connect_nonblock explicitly handles IO::WaitWritable:
https://ruby-doc.org/stdlib-2.7.0/libdoc/socket/rdoc/Socket.html#method-i-connect_nonblock

begin # emulate blocking connect
  socket.connect_nonblock(sockaddr)
rescue IO::WaitWritable
  IO.select(nil, [socket]) # wait 3-way handshake completion
  begin
    socket.connect_nonblock(sockaddr) # check connection failure
  rescue Errno::EISCONN
  end
end

>> IO::EINPROGRESSWaitWritable < IO::WaitWritable
=> true

@byroot
Collaborator

byroot commented Jan 16, 2021

Hum, actually:

>> IO::EINPROGRESSWaitWritable < Errno::EINPROGRESS
=> true

So we should enter the rescue and wait_writable. 🤔
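A couple more quick checks from an irb session on 2.7, just to rule out a surprise in the ancestry (output as I would expect it):

>> IO::EINPROGRESSWaitWritable.include?(IO::WaitWritable)
=> true
>> IO::EINPROGRESSWaitWritable.ancestors.first(4)
=> [IO::EINPROGRESSWaitWritable, IO::WaitWritable, Errno::EINPROGRESS, SystemCallError]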

@supercaracal
Contributor

It seems that the following logic should be called if we use an SSL/TLS connection.

begin
  # Initiate the socket connection in the background. If it doesn't fail
  # immediately it will raise an IO::WaitWritable (Errno::EINPROGRESS)
  # indicating the connection is in progress.
  # Unlike waiting for a tcp socket to connect, you can't time out ssl socket
  # connections during the connect phase properly, because IO.select only partially works.
  # Instead, you have to retry.
  ssl_sock.connect_nonblock
rescue Errno::EAGAIN, Errno::EWOULDBLOCK, IO::WaitReadable
  if ssl_sock.wait_readable(timeout)
    retry
  else
    raise TimeoutError
  end
rescue IO::WaitWritable
  if ssl_sock.wait_writable(timeout)
    retry
  else
    raise TimeoutError
  end
end

@beanieboi Could you share the connection options you specified?

@byroot
Collaborator

byroot commented Jan 16, 2021

It seems that the following logic should be called if we use an SSL/TLS connection

The provided backtrace says line 192, so it's a regular TCPSocket. Based on the code itself, that backtrace should be impossible, as the exception reported should be caught by the rescue. So really I'm puzzled as to what is happening here.

@beanieboi
Author

beanieboi commented Jan 16, 2021

Our Sidekiq configuration looks like this:

We have two Redis instances and they are configured the same. One Redis is just for Sidekiq, and we connect to it like this:

Sidekiq.configure_client do |config|
  config.redis = {
    url: sidekiq_maybe_tls_url,
    driver: :ruby,
    pool_timeout: 5,
    size: 5,
    id: "#{internal-identifier}",
    network_timeout: 5,
    reconnect_attempts: 1,
    ssl_params: { verify_mode: OpenSSL::SSL::VERIFY_NONE }
  }
end
Sidekiq.configure_server do |config|
  config.options[:job_logger] = NullLogger
  config.redis = {
    url: sidekiq_maybe_tls_url,
    driver: :ruby,
    pool_timeout: 5,
    id: "#{internal-identifier}",
    network_timeout: 5,
    reconnect_attempts: 1,
    ssl_params: { verify_mode: OpenSSL::SSL::VERIFY_NONE }
  }
end

sidekiq_maybe_tls_url uses redis:// or rediss:// depending on which machine it runs on. We are currently rolling out TLS and have it enabled only on some of our servers. I just verified that the exception also happens without TLS.

We connect to our second Redis like this:

ConnectionPool.new(size: size, timeout: 5) do
  ::Redis.new(
    url: Config.redis_url,
    driver: :ruby,
    id: id,
    timeout: 5,
    reconnect_attempts: 1,
    ssl_params: { verify_mode: OpenSSL::SSL::VERIFY_NONE }
  )
end

@supercaracal
Contributor

@byroot Sorry, I overlooked that. Thank you for your explanation; understood.

@beanieboi Is the Redis tls-auth-clients directive set to no or optional?

@beanieboi
Author

I believe it might be related to TLS.

I have to correct what I said in the beginning: it definitely also happens when we connect to a normal redis:// endpoint without TLS. I don't want to lead you down the wrong path.

@beanieboi Is the Redis tls-auth-clients directive set to no or optional?

We have tls-auth-clients set to no.

@JuanitoFatas

I got this same error trace when Redis was not installed / was stopped. Starting Redis fixed it (brew services start redis).

@bpo
Contributor

bpo commented Apr 15, 2022

@beanieboi would you mind confirming if this is still an issue and/or what ended up happening?

@bf4

bf4 commented Apr 28, 2022

FWIW, we just saw this running Heroku Redis with SSL (redis-rb 4.6.0, Redis 6.2.3, Ruby 2.7.6), but the event is very rare.

Redis::CannotConnectError: Error connecting to Redis on ec2-ip.compute-1.amazonaws.com:port (Errno::ECONNREFUSED) (Most recent call first)
Errno::ECONNREFUSED: Connection refused - connect(2) for
IO::EINPROGRESSWaitWritable: Operation now in progress - connect(2) would block

backtrace

vendor/ruby-2.7.6/lib/ruby/2.7.0/socket.rb line 1214 in __connect_nonblock
vendor/ruby-2.7.6/lib/ruby/2.7.0/socket.rb line 1214 in connect_nonblock
vendor/bundle/ruby/2.7.0/gems/redis-4.6.0/lib/redis/connection/ruby.rb line 158 in connect_addrinfo
vendor/bundle/ruby/2.7.0/gems/redis-4.6.0/lib/redis/connection/ruby.rb line 196 in block in connect
vendor/bundle/ruby/2.7.0/gems/redis-4.6.0/lib/redis/connection/ruby.rb line 194 in each
vendor/bundle/ruby/2.7.0/gems/redis-4.6.0/lib/redis/connection/ruby.rb line 194 in each_with_index
vendor/bundle/ruby/2.7.0/gems/redis-4.6.0/lib/redis/connection/ruby.rb line 194 in connect
vendor/bundle/ruby/2.7.0/gems/redis-4.6.0/lib/redis/connection/ruby.rb line 247 in connect
vendor/bundle/ruby/2.7.0/gems/redis-4.6.0/lib/redis/connection/ruby.rb line 306 in connect
vendor/bundle/ruby/2.7.0/gems/redis-4.6.0/lib/redis/client.rb line 385 in establish_connection
vendor/bundle/ruby/2.7.0/gems/redis-4.6.0/lib/redis/client.rb line 115 in block in connect
vendor/bundle/ruby/2.7.0/gems/redis-4.6.0/lib/redis/client.rb line 344 in with_reconnect
vendor/bundle/ruby/2.7.0/gems/redis-4.6.0/lib/redis/client.rb line 114 in connect
vendor/bundle/ruby/2.7.0/gems/redis-4.6.0/lib/redis/client.rb line 417 in ensure_connected
vendor/bundle/ruby/2.7.0/gems/redis-4.6.0/lib/redis/client.rb line 269 in block in process
vendor/bundle/ruby/2.7.0/gems/redis-4.6.0/lib/redis/client.rb line 356 in logging
vendor/bundle/ruby/2.7.0/gems/redis-4.6.0/lib/redis/client.rb line 268 in process
vendor/bundle/ruby/2.7.0/gems/redis-4.6.0/lib/redis/client.rb line 161 in call
vendor/bundle/ruby/2.7.0/gems/scout_apm-5.1.1/lib/scout_apm/instruments/redis.rb line 32 in block in call_with_scout_instruments
vendor/bundle/ruby/2.7.0/gems/scout_apm-5.1.1/lib/scout_apm/tracer.rb line 34 in instrument
vendor/bundle/ruby/2.7.0/gems/scout_apm-5.1.1/lib/scout_apm/tracer.rb line 44 in instrument
vendor/bundle/ruby/2.7.0/gems/scout_apm-5.1.1/lib/scout_apm/instruments/redis.rb line 31 in call_with_scout_instruments
vendor/bundle/ruby/2.7.0/gems/redis-4.6.0/lib/redis/client.rb line 257 in block in call_with_timeout
vendor/bundle/ruby/2.7.0/gems/redis-4.6.0/lib/redis/client.rb line 331 in with_socket_timeout
vendor/bundle/ruby/2.7.0/gems/redis-4.6.0/lib/redis/client.rb line 256 in call_with_timeout
vendor/bundle/ruby/2.7.0/gems/redis-4.6.0/lib/redis.rb line 269 in block in send_blocking_command
vendor/bundle/ruby/2.7.0/gems/redis-4.6.0/lib/redis.rb line 268 in synchronize
vendor/bundle/ruby/2.7.0/gems/redis-4.6.0/lib/redis.rb line 268 in send_blocking_command
vendor/bundle/ruby/2.7.0/gems/redis-4.6.0/lib/redis/commands/lists.rb line 183 in brpoplpush
vendor/bundle/ruby/2.7.0/gems/redis-namespace-1.8.2/lib/redis/namespace.rb line 476 in call_with_namespace
vendor/bundle/ruby/2.7.0/gems/redis-namespace-1.8.2/lib/redis/namespace.rb line 352 in block (2 levels) in <class:Namespace>
vendor/bundle/ruby/2.7.0/gems/sidekiq-pro-5.3.1/lib/sidekiq/pro/super_fetch.rb line 345 in block in strict
vendor/bundle/ruby/2.7.0/gems/sidekiq-6.4.2/lib/sidekiq.rb line 95 in block in redis
vendor/bundle/ruby/2.7.0/gems/connection_pool-2.2.5/lib/connection_pool.rb line 63 in block (2 levels) in with

@schneems

I'm unclear on what an EINPROGRESSWaitWritable even is or what causes it, so I found this SO answer: https://stackoverflow.com/questions/8277970/what-are-possible-reason-for-socket-error-einprogress-in-solaris.

You have a non-blocking socket and you are calling connect() in it. Since connect() needs the 3-way handshake to happen (so a network roundtrip), it either blocks waiting for the SYN-ACK in blocking sockets, or gives you some indication that it hasn't succeeded yet in non-blocking sockets. Normally, non-blocking sockets return EAGAIN/EWOULDBLOCK to tell you that they couldn't progress and you should try again: this is not exactly your case, connect() returns EAGAIN/EWOULDBLOCK when there are no free ephemeral ports to tell you that you should try again later; so there is another error for non-blocking connect: EINPROGRESS, which tells you that the operation is in progress and you should check its status later.

So it sounds like (based solely on this information) the operation is asynchronous and its status should be checked again later. In Ruby this seems to be related to a timeout: we wait for a connection, hit our timeout, and then bubble the exception up.

Right now it's unclear what exact circumstances cause this issue or what specific set of circumstances enables the failure. For example, when I debugged connection errors with Puma on Heroku, first I had to reproduce the behavior: https://blog.heroku.com/puma-4-hammering-out-h13s-a-debugging-story. In that case I used a fixture to move the failure mode from production to local: https://github.com/hunterloftis/heroku-node-errcodes/tree/master/h13.

In debugging I like to take a step back and think in terms of the scientific method. What are our theories as to why this happens, and how could we test those hypotheses?

Some things I'm curious about at a high level: What can be done to resolve the issue once it begins? After it occurs, is it a persistent issue or does it go away? One theory is that the machine making the request is put into a "bad state", possibly due to killing a thread, and that future requests will also fail in the same way until the machine is restarted or a process is killed.

We could test this by catching the failure, logging it, and then performing some action. For example, retry the same request; if it succeeds then the problem does not persist beyond the original request. We could also take another action such as killing a process so that its memory is then in a "clean" state.
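As a rough illustration of that kind of probe (not a fix), a sketch of catching, logging, and retrying once around a Sidekiq Redis call might look like this; the method name and the single retry are made up, just to learn whether the failure persists:

require "sidekiq"

# Hypothetical diagnostic wrapper: catch the connect failure, log it with some
# context, retry once, and record whether the retry succeeded. Purely to gather
# data, not a production fix.
def probe_redis(tag)
  attempts = 0
  begin
    attempts += 1
    Sidekiq.redis { |conn| conn.ping }
    Sidekiq.logger.info("#{tag}: ok after #{attempts} attempt(s)")
  rescue Redis::BaseConnectionError, IO::WaitWritable => e
    Sidekiq.logger.warn("#{tag}: #{e.class}: #{e.message} (attempt #{attempts})")
    retry if attempts < 2
    raise
  end
end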

but the event is very rare

This implies that the event does not persist. If that's the case, then I would be curious about the state of the overall system when this happens. Can we correlate the time of this happening with something else? Perhaps a cron job coincides with a large amount of work from a customer endpoint (for example). Collecting more information around when this is happening could be good, such as the number of connections on the data store, memory, load, etc.; maybe something stands out as an anomaly.

Another approach is to find an input that changes the output (even if it doesn't fix it). If this seems related to the timeout, and disabling the timeout removes the problem, does this mask the issue or is it related to the issue? Disabling the timeout clearly stops the error, because the client would wait forever instead of raising. The questions I have are: do those requests eventually finish, or would they wait forever until some other timeout occurs (such as a server being rotated every 24 hours)? If changing the timeout value can change the nature of our failure mode, is there a correlation with Redis 6? Does Redis 6 do something different in terms of performance when connecting that might mandate a longer default timeout value, or is this a coincidence? Sometimes problems that are caused by increased load aren't noticed until other changes are made. For instance, if a database is close to some limit, developers might not notice until they take a maintenance action and are inspecting database logs a bit closer. Then how could we test that? For example: is it feasible to run a Redis 5 DB for a bit to see if the error occurs within a reasonable timeframe?

Also consider that there may be multiple different failure modes that all appear the same but are different. As @JuanitoFatas pointed out, trying to connect to a non-existent Redis instance may sometimes give this error.

All of that to say: I think we need more theories of what failure mode is causing this error and associated steps to prove or disprove them.

@byroot
Collaborator

byroot commented Apr 29, 2022

I'm unclear on what an EINPROGRESSWaitWritable even is

From socket.c

/* :nodoc: */
static VALUE
sock_connect_nonblock(VALUE sock, VALUE addr, VALUE ex)
{
    VALUE rai;
    rb_io_t *fptr;
    int n;

    SockAddrStringValueWithAddrinfo(addr, rai);
    addr = rb_str_new4(addr);
    GetOpenFile(sock, fptr);
    rb_io_set_nonblock(fptr);
    n = connect(fptr->fd, (struct sockaddr*)RSTRING_PTR(addr), RSTRING_SOCKLEN(addr));
    if (n < 0) {
        int e = errno;
        if (e == EINPROGRESS) {
            if (ex == Qfalse) {
                return sym_wait_writable;
            }
            rb_readwrite_syserr_fail(RB_IO_WAIT_WRITABLE, e, "connect(2) would block");
        }
        if (e == EISCONN) {
            if (ex == Qfalse) {
                return INT2FIX(0);
            }
        }
        rsock_syserr_fail_raddrinfo_or_sockaddr(e, "connect(2)", addr, rai);
    }

    return INT2FIX(n);
}

Then later:

case EINPROGRESS:
    c = rb_eEINPROGRESSWaitWritable;
    break;

So this exception is raised when connect(2) fails with EINPROGRESS.

The connect(2) reference manual says: https://man7.org/linux/man-pages/man2/connect.2.html

EINPROGRESS
The socket is nonblocking and the connection cannot be completed immediately. (UNIX domain sockets failed with EAGAIN instead.) It is possible to select(2) or poll(2) for completion by selecting the socket for writing. After select(2) indicates writability, use getsockopt(2) to read the SO_ERROR option at level SOL_SOCKET to determine whether connect() completed successfully (SO_ERROR is zero) or unsuccessfully (SO_ERROR is one of the usual error codes listed here, explaining the reason for the failure).

Which is kinda expected as we explicitly rescue Errno::EINPROGRESS and wait when it is raised:

          begin
            sock.connect_nonblock(sockaddr)
          rescue Errno::EINPROGRESS
            raise TimeoutError unless sock.wait_writable(timeout)

            begin
              sock.connect_nonblock(sockaddr)
            rescue Errno::EISCONN
            end
          end

So it should work, but clearly it doesn't.
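As an aside, the getsockopt(SO_ERROR) step the man page describes is not something the snippet above does after wait_writable. A minimal Ruby sketch of that pattern (host, port, and the 5s timeout are placeholders) would look roughly like this:

require "socket"
require "io/wait"

# Sketch of the connect(2) + wait-for-writability + SO_ERROR pattern from the
# man page; "example.com" and 6379 are placeholders.
addr = Socket.pack_sockaddr_in(6379, "example.com")
sock = Socket.new(Socket::AF_INET, Socket::SOCK_STREAM, 0)

begin
  sock.connect_nonblock(addr)
rescue IO::WaitWritable
  raise "connect timed out" unless sock.wait_writable(5)

  # Once the socket is writable, SO_ERROR says whether the connect actually succeeded.
  err = sock.getsockopt(Socket::SOL_SOCKET, Socket::SO_ERROR).int
  raise SystemCallError.new("connect(2)", err) unless err.zero?
end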

What makes no sense to me are the reported backtraces:

vendor/ruby-2.7.6/lib/ruby/2.7.0/socket.rb line 1214 in __connect_nonblock
vendor/ruby-2.7.6/lib/ruby/2.7.0/socket.rb line 1214 in connect_nonblock
vendor/bundle/ruby/2.7.0/gems/redis-4.6.0/lib/redis/connection/ruby.rb line 158 in connect_addrinfo
vendor/bundle/ruby/2.7.0/gems/redis-4.6.0/lib/redis/connection/ruby.rb line 196 in block in connect

That line 196 is the first call to connect_nonblock, so if it indeed raised IO::EINPROGRESSWaitWritable, it should have entered the rescue Errno::EINPROGRESS because:

>> IO::EINPROGRESSWaitWritable < Errno::EINPROGRESS
=> true

So over a year later, this bug report is still at the WTF stage to me, unless somehow IO::EINPROGRESSWaitWritable < Errno::EINPROGRESS isn't true in some Rubies? But the backtrace suggests Ruby 2.7, and I just tested with it, so...

@bf4

bf4 commented Apr 29, 2022

@schneems I read through your response, and upon reflection I thought I might know what was going on at the time, so I went to check.

  • at 2022-04-28T15:36 I see Redis errors in our logs; Sidekiq Pro reliable push is falling back to local enqueuing
  • at 2022-04-28T15:46 I have an example of a crazy database failure: "total size of jsonb array elements exceeds the maximum of 268435455 bytes at character 760"
  • at 2022-04-28T15:46 Sidekiq SuperFetch recovered from a crash related to the input involved in the database failure

So, the timing is correlated with this nutso behavior:

Basically, among other things, our app tracks trucks picking a thing up at one place, dropping it off at another, driving back to the start, and repeating the cycle. We call the round-trip time the cycle time, and related to it we have a metric of tons per cycle.
We build some statistics comparing the planned cycles vs. the actual. We can estimate the total number of planned cycles as "total tons / tons per cycle".

In this (representative error) case:

  • the planned total haul was 2,700 tons.
  • No planned tons-per-cycle rate was set, and no pick-up/drop-off location was configured that we could use to estimate one from travel time, so there was no planned cycle rate. However, there was one load which had been hauled, for 0.2 tons (normal hauls are upwards of 20; all these numbers are unusual), so we ended up with a very bad 'actual' cycle rate of 0.2 tons/cycle to fill in for the lack of a planned cycle rate.
  • So, the total number of planned cycles was calculated as 2700.0 / 0.02 => 135,000.0 cycles.
  • That is a lot. But the app didn't consider this, and tried to build 135,000 cycles over an 8-hour period for this one ludicrous-speed-achieving truck.

It seems the cause was some combination of the dyno (Performance-M) running out of memory, Redis running out of memory, or the database barfing on our GB-sized INSERT, resulting in the worker crashing.

And maybe we haven't seen a recurrence since I added a maximum number of cycles of 500?
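For anyone curious, the cap amounts to something like this (the names and the estimate are made up for illustration; 500 is the limit mentioned above):

MAX_PLANNED_CYCLES = 500 # the cap added after this incident

# Hypothetical sketch of the guard: estimate planned cycles from tonnage, but
# never build more than the cap.
def planned_cycles(total_tons, tons_per_cycle)
  return 0 unless tons_per_cycle && tons_per_cycle.positive?

  [(total_tons / tons_per_cycle).ceil, MAX_PLANNED_CYCLES].min
end

planned_cycles(2700.0, 0.02) # => 500 instead of 135,000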

I believe my app crash scenario may explain why the error wasn't rescued in one place, though perhaps not why it was rescued in the other.

@the-spectator

the-spectator commented Aug 17, 2022

We are also seeing this issue in our staging environment. In case it helps, here are the backtrace and the configuration of our system:
ruby: 2.5.1, redis: 4.1.0, sidekiq: 5.2.5, rails: 5.2.3.

backtrace:

IO::EINPROGRESSWaitWritable: Operation in progress - connect(2) would block
  from socket.rb:1213:in `__connect_nonblock'
  from socket.rb:1213:in `connect_nonblock'
  from redis/connection/ruby.rb:180:in `connect_addrinfo'
  from redis/connection/ruby.rb:220:in `block in connect'
  from redis/connection/ruby.rb:218:in `each'
  from redis/connection/ruby.rb:218:in `each_with_index'
  from redis/connection/ruby.rb:218:in `connect'
  from redis/connection/ruby.rb:296:in `connect'
  from redis/client.rb:342:in `establish_connection'
  from redis/client.rb:104:in `block in connect'
  from redis/client.rb:299:in `with_reconnect'
  from redis/client.rb:103:in `connect'
  from redis/client.rb:372:in `ensure_connected'
  from redis/client.rb:224:in `block in process'
  from redis/client.rb:312:in `logging'
  from redis/client.rb:223:in `process'
  from redis/client.rb:123:in `call'
  from redis.rb:2617:in `block in _scan'
  from redis.rb:50:in `block in synchronize'
  from monitor.rb:226:in `mon_synchronize'
  from redis.rb:50:in `synchronize'
  from redis.rb:2616:in `_scan'
  from redis.rb:2754:in `sscan'
  from sidekiq/api.rb:11:in `block in sscan'
  from sidekiq/api.rb:10:in `loop'
  from sidekiq/api.rb:10:in `sscan'
  from sidekiq/api.rb:745:in `block in cleanup'
  from sidekiq.rb:97:in `block in redis'
  from connection_pool.rb:65:in `block (2 levels) in with'
  from connection_pool.rb:64:in `handle_interrupt'
  from connection_pool.rb:64:in `block in with'
  from connection_pool.rb:61:in `handle_interrupt'
  from connection_pool.rb:61:in `with'
  from sidekiq.rb:94:in `redis'
  from sidekiq/api.rb:744:in `cleanup'
  from sidekiq/api.rb:737:in `initialize'
  from sidekiq/scheduled.rb:155:in `new'
  from sidekiq/scheduled.rb:155:in `process_count'
  from sidekiq/scheduled.rb:120:in `random_poll_interval'
  from sidekiq/scheduled.rb:89:in `wait'
  from sidekiq/scheduled.rb:69:in `block in start'
  from sidekiq/util.rb:16:in `watchdog'
  from sidekiq/util.rb:25:in `block in safe_thread'
Redis::TimeoutError: Redis::TimeoutError
  from redis/connection/ruby.rb:183:in `rescue in connect_addrinfo'
  from redis/connection/ruby.rb:179:in `connect_addrinfo'
  from redis/connection/ruby.rb:220:in `block in connect'
  from redis/connection/ruby.rb:218:in `each'
  from redis/connection/ruby.rb:218:in `each_with_index'
  from redis/connection/ruby.rb:218:in `connect'
  from redis/connection/ruby.rb:296:in `connect'
  from redis/client.rb:342:in `establish_connection'
  from redis/client.rb:104:in `block in connect'
  from redis/client.rb:299:in `with_reconnect'
  from redis/client.rb:103:in `connect'
  from redis/client.rb:372:in `ensure_connected'
  from redis/client.rb:224:in `block in process'
  from redis/client.rb:312:in `logging'
  from redis/client.rb:223:in `process'
  from redis/client.rb:123:in `call'
  from redis.rb:2617:in `block in _scan'
  from redis.rb:50:in `block in synchronize'
  from monitor.rb:226:in `mon_synchronize'
  from redis.rb:50:in `synchronize'
  from redis.rb:2616:in `_scan'
  from redis.rb:2754:in `sscan'
  from sidekiq/api.rb:11:in `block in sscan'
  from sidekiq/api.rb:10:in `loop'
  from sidekiq/api.rb:10:in `sscan'
  from sidekiq/api.rb:745:in `block in cleanup'
  from sidekiq.rb:97:in `block in redis'
  from connection_pool.rb:65:in `block (2 levels) in with'
  from connection_pool.rb:64:in `handle_interrupt'
  from connection_pool.rb:64:in `block in with'
  from connection_pool.rb:61:in `handle_interrupt'
  from connection_pool.rb:61:in `with'
  from sidekiq.rb:94:in `redis'
  from sidekiq/api.rb:744:in `cleanup'
  from sidekiq/api.rb:737:in `initialize'
  from sidekiq/scheduled.rb:155:in `new'
  from sidekiq/scheduled.rb:155:in `process_count'
  from sidekiq/scheduled.rb:120:in `random_poll_interval'
  from sidekiq/scheduled.rb:89:in `wait'
  from sidekiq/scheduled.rb:69:in `block in start'
  from sidekiq/util.rb:16:in `watchdog'
  from sidekiq/util.rb:25:in `block in safe_thread'
Redis::CannotConnectError: Error connecting to Redis on app-redis:80 (Redis::TimeoutError)
  from redis/client.rb:353:in `rescue in establish_connection'
  from redis/client.rb:336:in `establish_connection'
  from redis/client.rb:104:in `block in connect'
  from redis/client.rb:299:in `with_reconnect'
  from redis/client.rb:103:in `connect'
  from redis/client.rb:372:in `ensure_connected'
  from redis/client.rb:224:in `block in process'
  from redis/client.rb:312:in `logging'
  from redis/client.rb:223:in `process'
  from redis/client.rb:123:in `call'
  from redis.rb:2617:in `block in _scan'
  from redis.rb:50:in `block in synchronize'
  from monitor.rb:226:in `mon_synchronize'
  from redis.rb:50:in `synchronize'
  from redis.rb:2616:in `_scan'
  from redis.rb:2754:in `sscan'
  from sidekiq/api.rb:11:in `block in sscan'
  from sidekiq/api.rb:10:in `loop'
  from sidekiq/api.rb:10:in `sscan'
  from sidekiq/api.rb:745:in `block in cleanup'
  from sidekiq.rb:97:in `block in redis'
  from connection_pool.rb:65:in `block (2 levels) in with'
  from connection_pool.rb:64:in `handle_interrupt'
  from connection_pool.rb:64:in `block in with'
  from connection_pool.rb:61:in `handle_interrupt'
  from connection_pool.rb:61:in `with'
  from sidekiq.rb:94:in `redis'
  from sidekiq/api.rb:744:in `cleanup'
  from sidekiq/api.rb:737:in `initialize'
  from sidekiq/scheduled.rb:155:in `new'
  from sidekiq/scheduled.rb:155:in `process_count'
  from sidekiq/scheduled.rb:120:in `random_poll_interval'
  from sidekiq/scheduled.rb:89:in `wait'
  from sidekiq/scheduled.rb:69:in `block in start'
  from sidekiq/util.rb:16:in `watchdog'
  from sidekiq/util.rb:25:in `block in safe_thread'

@khaledm1990

We are facing a similar issue too.
ruby: 3.0.1, redis: 4.7.1, sidekiq: 6.5.4, rails: 7.0.3.1

IO::EINPROGRESSWaitWritable: Operation now in progress - connect(2) would block
File "/usr/local/lib/ruby/3.0.0/socket.rb" line 1214 in __connect_nonblock
File "/usr/local/lib/ruby/3.0.0/socket.rb" line 1214 in connect_nonblock
File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/connection/ruby.rb" line 158 in connect_addrinfo
File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/connection/ruby.rb" line 196 in block in connect
File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/connection/ruby.rb" line 194 in each
File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/connection/ruby.rb" line 194 in each_with_index
File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/connection/ruby.rb" line 194 in connect
File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/connection/ruby.rb" line 308 in connect
File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 385 in establish_connection

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 115 in block in connect

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 344 in with_reconnect

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 114 in connect

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/prepend.rb" line 20 in block in connect

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/instrumentation.rb" line 30 in block in connect_with_tracing

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/instrumentation.rb" line 45 in block in with_tracing

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/tracer.rb" line 351 in capture_segment_error

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/instrumentation.rb" line 45 in with_tracing

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/instrumentation.rb" line 30 in connect_with_tracing

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/prepend.rb" line 20 in connect

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 417 in ensure_connected

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 269 in block in process

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 356 in logging

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 268 in process

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 161 in call

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/prepend.rb" line 12 in block in call

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/instrumentation.rb" line 19 in block in call_with_tracing

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/instrumentation.rb" line 45 in block in with_tracing

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/tracer.rb" line 351 in capture_segment_error

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/instrumentation.rb" line 45 in with_tracing

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/instrumentation.rb" line 19 in call_with_tracing

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/prepend.rb" line 12 in call

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 257 in block in call_with_timeout

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 331 in with_socket_timeout

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 256 in call_with_timeout

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis.rb" line 275 in block in send_blocking_command

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis.rb" line 274 in synchronize

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis.rb" line 274 in send_blocking_command

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/commands/lists.rb" line 270 in _bpop

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/commands/lists.rb" line 167 in brpop

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq/fetch.rb" line 49 in block in retrieve_work

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq.rb" line 164 in block in redis

File "/usr/local/bundle/gems/connection_pool-2.2.5/lib/connection_pool.rb" line 63 in block (2 levels) in with

File "/usr/local/bundle/gems/connection_pool-2.2.5/lib/connection_pool.rb" line 62 in handle_interrupt

File "/usr/local/bundle/gems/connection_pool-2.2.5/lib/connection_pool.rb" line 62 in block in with

File "/usr/local/bundle/gems/connection_pool-2.2.5/lib/connection_pool.rb" line 59 in handle_interrupt

File "/usr/local/bundle/gems/connection_pool-2.2.5/lib/connection_pool.rb" line 59 in with

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq.rb" line 161 in redis

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq/component.rb" line 26 in redis

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq/fetch.rb" line 49 in retrieve_work

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq/processor.rb" line 83 in get_one

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq/processor.rb" line 95 in fetch

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq/processor.rb" line 77 in process_one

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq/processor.rb" line 68 in run

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq/component.rb" line 8 in watchdog

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq/component.rb" line 17 in block in safe_thread
Redis::TimeoutError: Redis::TimeoutErr

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/connection/ruby.rb" line 160 in rescue in connect_addrinfo
File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/connection/ruby.rb" line 157 in connect_addrinfo
File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/connection/ruby.rb" line 196 in block in connect
File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/connection/ruby.rb" line 194 in each
File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/connection/ruby.rb" line 194 in each_with_index
File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/connection/ruby.rb" line 194 in connect
File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/connection/ruby.rb" line 308 in connect
File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 385 in establish_connection
File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 115 in block in connect

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 344 in with_reconnect

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 114 in connect

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/prepend.rb" line 20 in block in connect

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/instrumentation.rb" line 30 in block in connect_with_tracing

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/instrumentation.rb" line 45 in block in with_tracing

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/tracer.rb" line 351 in capture_segment_error

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/instrumentation.rb" line 45 in with_tracing

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/instrumentation.rb" line 30 in connect_with_tracing

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/prepend.rb" line 20 in connect

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 417 in ensure_connected

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 269 in block in process

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 356 in logging

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 268 in process

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 161 in call

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/prepend.rb" line 12 in block in call

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/instrumentation.rb" line 19 in block in call_with_tracing

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/instrumentation.rb" line 45 in block in with_tracing

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/tracer.rb" line 351 in capture_segment_error

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/instrumentation.rb" line 45 in with_tracing

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/instrumentation.rb" line 19 in call_with_tracing

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/prepend.rb" line 12 in call

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 257 in block in call_with_timeout

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 331 in with_socket_timeout

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 256 in call_with_timeout

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis.rb" line 275 in block in send_blocking_command

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis.rb" line 274 in synchronize

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis.rb" line 274 in send_blocking_command

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/commands/lists.rb" line 270 in _bpop

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/commands/lists.rb" line 167 in brpop

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq/fetch.rb" line 49 in block in retrieve_work

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq.rb" line 164 in block in redis

File "/usr/local/bundle/gems/connection_pool-2.2.5/lib/connection_pool.rb" line 63 in block (2 levels) in with

File "/usr/local/bundle/gems/connection_pool-2.2.5/lib/connection_pool.rb" line 62 in handle_interrupt

File "/usr/local/bundle/gems/connection_pool-2.2.5/lib/connection_pool.rb" line 62 in block in with

File "/usr/local/bundle/gems/connection_pool-2.2.5/lib/connection_pool.rb" line 59 in handle_interrupt

File "/usr/local/bundle/gems/connection_pool-2.2.5/lib/connection_pool.rb" line 59 in with

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq.rb" line 161 in redis

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq/component.rb" line 26 in redis

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq/fetch.rb" line 49 in retrieve_work

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq/processor.rb" line 83 in get_one

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq/processor.rb" line 95 in fetch

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq/processor.rb" line 77 in process_one

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq/processor.rb" line 68 in run

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq/component.rb" line 8 in watchdog

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq/component.rb" line 17 in block in safe_thread
Redis::CannotConnectError: Error connecting to Redis on shared-prod-rd.vcu6d2.ng.0001.euw1.cache.azonaws.com:6379 (Redis::TimeoutError)

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 398 in rescue in establish_connection
File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 379 in establish_connection
File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 115 in block in connect
File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 344 in with_reconnect
File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 114 in connect
File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/prepend.rb" line 20 in block in connect
File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/instrumentation.rb" line 30 in block in connect_with_tracing
File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/instrumentation.rb" line 45 in block in with_tracing
File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/tracer.rb" line 351 in capture_segment_error

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/instrumentation.rb" line 45 in with_tracing

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/instrumentation.rb" line 30 in connect_with_tracing

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/prepend.rb" line 20 in connect

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 417 in ensure_connected

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 269 in block in process

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 356 in logging

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 268 in process

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 161 in call

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/prepend.rb" line 12 in block in call

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/instrumentation.rb" line 19 in block in call_with_tracing

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/instrumentation.rb" line 45 in block in with_tracing

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/tracer.rb" line 351 in capture_segment_error

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/instrumentation.rb" line 45 in with_tracing

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/instrumentation.rb" line 19 in call_with_tracing

File "/usr/local/bundle/gems/newrelic_rpm-8.9.0/lib/new_relic/agent/instrumentation/redis/prepend.rb" line 12 in call

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 257 in block in call_with_timeout

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 331 in with_socket_timeout

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/client.rb" line 256 in call_with_timeout

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis.rb" line 275 in block in send_blocking_command

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis.rb" line 274 in synchronize

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis.rb" line 274 in send_blocking_command

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/commands/lists.rb" line 270 in _bpop

File "/usr/local/bundle/gems/redis-4.7.1/lib/redis/commands/lists.rb" line 167 in brpop

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq/fetch.rb" line 49 in block in retrieve_work

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq.rb" line 164 in block in redis

File "/usr/local/bundle/gems/connection_pool-2.2.5/lib/connection_pool.rb" line 63 in block (2 levels) in with

File "/usr/local/bundle/gems/connection_pool-2.2.5/lib/connection_pool.rb" line 62 in handle_interrupt

File "/usr/local/bundle/gems/connection_pool-2.2.5/lib/connection_pool.rb" line 62 in block in with

File "/usr/local/bundle/gems/connection_pool-2.2.5/lib/connection_pool.rb" line 59 in handle_interrupt

File "/usr/local/bundle/gems/connection_pool-2.2.5/lib/connection_pool.rb" line 59 in with

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq.rb" line 161 in redis

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq/component.rb" line 26 in redis

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq/fetch.rb" line 49 in retrieve_work

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq/processor.rb" line 83 in get_one

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq/processor.rb" line 95 in fetch

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq/processor.rb" line 77 in process_one

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq/processor.rb" line 68 in run

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq/component.rb" line 8 in watchdog

File "/usr/local/bundle/gems/sidekiq-6.5.4/lib/sidekiq/component.rb" line 17 in block in safe_thre
