upstream memory leak #373
Comments
@lluu131, hello and thanks for the thorough report. Unfortunately, we can't reproduce the leak. It would really help us to troubleshoot this issue if you could collect a goroutine profile for us. To do that, restart the proxy with profiling enabled and run:

curl "http://127.0.0.1:6060/debug/pprof/goroutine?debug=1" > profile.txt

Or just open the "http://127.0.0.1:6060/debug/pprof/goroutine?debug=1" URL in your web browser. Note that the profiles can only be accessed from the same host machine. You can send the resulting profile to our devteam@adguard.com.
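For context, the endpoint queried above is Go's standard net/http/pprof handler. A minimal sketch of how a Go service exposes it (dnsproxy's actual wiring may differ, and there it sits behind a command-line option):

```go
package main

import (
	"log"
	"net/http"
	_ "net/http/pprof" // registers the /debug/pprof/* handlers on the default mux
)

func main() {
	// Serve the profiling endpoints on localhost only, matching the URL in
	// the comment above. The goroutine profile is then available at
	// http://127.0.0.1:6060/debug/pprof/goroutine?debug=1.
	log.Println(http.ListenAndServe("127.0.0.1:6060", nil))
}
```

Binding to 127.0.0.1 is also why the profiles can only be fetched from the same host machine.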
@EugeneOne1 Profile.txt has been sent by e-mail.
Updates #373.

Squashed commit of the following:

commit 0632b4f
Author: Eugene Burkov <E.Burkov@AdGuard.COM>
Date: Tue Jan 23 16:21:41 2024 +0300

    upstream: imp code, logging

commit cea34d5
Author: Eugene Burkov <E.Burkov@AdGuard.COM>
Date: Tue Jan 23 15:50:53 2024 +0300

    upstream: use mutex. imp logging
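For readers following along, the "use mutex" change presumably serializes access to the shared upstream connection so that concurrent queries can't race to dial it. A hypothetical sketch of that pattern (the names quicUpstream, getConn, and resetConn are illustrative, not dnsproxy's actual code):

```go
package upstream

import (
	"context"
	"net"
	"sync"
)

// quicUpstream is a hypothetical upstream that lazily (re)establishes a
// single shared connection; the mutex keeps concurrent queries from racing
// to dial and from leaking extra connections.
type quicUpstream struct {
	mu   sync.Mutex
	conn net.Conn // nil until the first successful dial
	dial func(ctx context.Context) (net.Conn, error)
}

func (u *quicUpstream) getConn(ctx context.Context) (net.Conn, error) {
	u.mu.Lock()
	defer u.mu.Unlock()

	if u.conn != nil {
		return u.conn, nil
	}

	conn, err := u.dial(ctx)
	if err != nil {
		return nil, err
	}
	u.conn = conn

	return conn, nil
}

// resetConn drops a broken connection so that the next query redials
// instead of piling retries onto a dead one.
func (u *quicUpstream) resetConn() {
	u.mu.Lock()
	defer u.mu.Unlock()

	if u.conn != nil {
		_ = u.conn.Close()
		u.conn = nil
	}
}
```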
@lluu131, hello again. Thank you for your help; the profile clarified the issue for us. We've pushed the patch (see the commit above). If the issue persists, would you mind collecting the profile again? We'd also like to take a look at the verbose log.
Already done, with both client and server updated. I noticed from the verbose log that the client requests the root DNS servers every second; is this normal?
Tested for a few hours: memory increases during QUIC upstream interruptions and stops increasing after the upstream resumes, but it is not freed. This is some improvement over the previous constant increase, but there is still a problem. The relevant logs were sent via email.
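One caveat when reading those numbers: the Go runtime returns freed heap memory to the OS lazily, so a process's RSS can stay high even after a leak has stopped growing. A small illustrative sketch (not part of dnsproxy) for separating heap actually in use from memory merely held from the OS:

```go
package main

import (
	"fmt"
	"runtime"
	"runtime/debug"
)

func main() {
	var m runtime.MemStats
	runtime.ReadMemStats(&m)

	// HeapInuse is what live objects occupy; HeapIdle minus HeapReleased is
	// memory the runtime still holds from the OS but isn't using.
	fmt.Printf("heap in use:   %d MiB\n", m.HeapInuse>>20)
	fmt.Printf("held but idle: %d MiB\n", (m.HeapIdle-m.HeapReleased)>>20)

	// Forces a garbage collection and asks the runtime to return as much
	// memory to the OS as possible; useful for checking whether a high RSS
	// is a real leak or just lazily released memory.
	debug.FreeOSMemory()
}
```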
@lluu131, we've received the data. Thank you for your help.
@lluu131, we've been investigating some unusual concurrency patterns used in the DNS-over-QUIC code and found that the dependency responsible for handling the QUIC protocol probably contains the bug (quic-go/quic-go#4303). In any case, we should come up with a workaround in the meantime.
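The quic-go report concerns goroutines that never exit. As a general illustration of this class of bug (not the actual quic-go code): a goroutine sending on an unbuffered channel leaks forever if the receiver has already given up, which is typically fixed by buffering the channel so the send always completes.

```go
package main

import (
	"context"
	"errors"
	"time"
)

// leakyQuery leaks: if the caller times out first, nobody ever receives
// from ch, and the inner goroutine blocks on the send forever.
func leakyQuery(ctx context.Context, do func() error) error {
	ch := make(chan error) // unbuffered: the send blocks until received
	go func() { ch <- do() }()

	select {
	case err := <-ch:
		return err
	case <-ctx.Done():
		return ctx.Err() // the goroutine above is now stuck for good
	}
}

// fixedQuery gives the channel a buffer of one, so the send always
// completes and the goroutine can exit even after a timeout.
func fixedQuery(ctx context.Context, do func() error) error {
	ch := make(chan error, 1)
	go func() { ch <- do() }()

	select {
	case err := <-ch:
		return err
	case <-ctx.Done():
		return ctx.Err()
	}
}

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), time.Millisecond)
	defer cancel()

	slow := func() error { time.Sleep(time.Second); return errors.New("late") }
	_ = leakyQuery(ctx, slow) // one goroutine is now leaked
	_ = fixedQuery(ctx, slow) // this one exits once slow() returns
}
```

Under repeated upstream errors, each leaked goroutine also pins its stack and captured buffers, which is consistent with memory growing only while the upstream is unreachable.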
It cost 10 GB of memory after running for 66 days. This machine only runs my DNS servers. The config is:
I've observed a memory leak issue in my home environment. I'm using the Docker version of adguard/dnsproxy.
Update: there are many query errors in my log. It seems that when an upstream query error occurs (for example, when the network is temporarily unavailable), memory keeps increasing until the process runs out of memory.
[Screenshot: QUIC upstream memory usage]
[Screenshot: UDP upstream memory usage]
With the same configuration, the memory footprint of the QUIC upstream is very high and constantly increasing, while the UDP upstream's stays very low; both run without caching.