[Bug]: Goroutine Leaks in NewDockerCompose #2008
Comments
Thanks @mhogara for raising this issue. Would you be interested in contributing a fix?

👍🏻 If I have the time, I'll start work on a fix @mdelapenya - will update this issue if I make any progress or have any questions.
hey @mhogara I'm able to reproduce this issue by adding the goleak dependency to a test in the compose module (not in `LocalDockerCompose`, which is deprecated and will eventually be removed):

```go
func TestDockerComposeAPIWithVolume(t *testing.T) {
	defer goleak.VerifyNone(t)

	path := RenderComposeWithVolume(t)

	compose, err := NewDockerCompose(path)
	require.NoError(t, err, "NewDockerCompose()")
	t.Cleanup(func() {
		require.NoError(t, compose.Down(context.Background(), RemoveOrphans(true), RemoveImagesLocal), "compose.Down()")
	})

	ctx, cancel := context.WithCancel(context.Background())
	t.Cleanup(cancel)

	err = compose.Up(ctx, Wait(true))
	require.NoError(t, err, "compose.Up()")
}
```

with the result you described:
From that report, I can only say that the leak comes from another package, not from testcontainers-go. I've even run all the tests in the module with goleak enabled in `TestMain`:

```go
func TestMain(m *testing.M) {
	defer goleak.VerifyTestMain(m)
}
```

The goleak report for the entire execution shows that no testcontainers-go code is in use:
@mhogara if you agree, we should close this one as my guess is that it's not caused by us. Wdyt?
Hi @mdelapenya, I'd prefer to keep this issue open as long as the goroutine leak is still occurring in the code and we have no solid solution; this leak requires a workaround/change in my code, so having it closed would make it harder for me to track when, and for how long, I need that workaround to be present.

I would also be interested in which package exactly this leak is coming from 🤔 - I'm not sure how much investigation that would take. If we can pinpoint the package that causes this, we should open an issue there so they can track it and investigate further.

I was also thinking over your hypothesis about an HTTP connection causing the leak and found this SO post: https://stackoverflow.com/questions/75026700/goroutine-leaks-when-trying-to-readallresponse-body-where-response-is-retu. That feels promising; I'm not sure if you've seen it. After thinking it through, this could stop the leak if it's coming from the polling for the wait strategy (assuming there's an internal client?). I'd be happy to look into this possible solution, but it might take a while for me to have enough time to get to it.
Testcontainers version
0.26.0
Using the latest Testcontainers version?
Yes
Host OS
Linux
Host arch
x86_64
Go version
1.21.3
Docker version
Docker info
What happened?
This is somewhat of a follow-up to #2007 (possibly the same issue, but I'll let the team here reorganize as needed). I did some searching in the current issues and could not find anything on goroutine leaks.
Problem at Hand
Running a simple test that spins up a `NewDockerCompose` from `testcontainers-go/modules/compose` leaks 2 goroutines (as caught by `goleak.VerifyNone(t)`). There are 2 leaks with a wait strategy; without a wait strategy, 4 goroutines are leaking. The container or Docker Compose file does not affect anything.

Interestingly enough, these problems do not happen if you use the deprecated `NewLocalDockerCompose`.

I'll include reproduction code below, and under "Relevant log output" I'll put the two goroutine leaks you get when you have a wait strategy.
What I've Observed/Debugging
It seems like 2 of the goroutines come from the creation of the Docker Compose stack and another 2 come from some sort of waiting that happens afterwards. From debugging, it seems like some of those leaks could come from calls to `NewClientHTTPS` or `NewDockerClientWithOpts`.

Let me know if I can share anything else to help.
Relevant log output
Additional information
Here's some code that causes the 2 goroutine leaks (the Docker Compose file/wait strategy can be anything; I'm using a Redis container):
To see the case where there are 4 leaking goroutines, you just need to comment out the "WaitForService" line.
Here's the code that has no goroutine leaks: