
Memory leak with Twisted 24.3.0 #12120

Open
vanand123 opened this issue Mar 21, 2024 · 9 comments

@vanand123

Describe the incorrect behavior you saw
My REST application's memory keeps increasing indefinitely with Twisted 24.3.0.

Describe how to cause this behavior

What did you do to get it to happen?
Under REST load, my application's memory keeps increasing.

Does it happen every time you follow these steps, sometimes, or only one time?
This happens every time.
After further investigation, I discovered that my application memory was fine with Twisted 21.2.0.
This issue started occurring with Twisted 21.7.0rc1 onwards.

Describe the correct behavior you'd like to see
Under REST load, the application's memory should not increase indefinitely.
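For context, this is roughly how the growth can be tracked from inside the process — a minimal stdlib sketch, not the application's actual monitoring code:

```python
import resource

def rss_kb() -> int:
    """Return this process's peak resident set size.

    ru_maxrss is reported in kilobytes on Linux (bytes on macOS).
    """
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

# Sample before and after a batch of requests; a value that keeps
# growing across many batches suggests a leak rather than warm-up.
before = rss_kb()
# ... drive REST load against the application here ...
after = rss_kb()
print(f"peak RSS grew by {after - before} kB")
```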

Testing environment

~ # uname -a
Linux VMX157-009 6.7.4 #1 SMP PREEMPT_DYNAMIC Wed Mar 20 11:06:54 GMT 2024 x86_64 GNU/Linux
~ # dpkg -l | grep twisted
ii  python-twisted           24.3.0-300000                               amd64        python-twisted
~ #
@vanand123 vanand123 added the bug label Mar 21, 2024
@adiroiban
Member

Thanks, Vivek, for the report.

Can you please share the code of your application that is triggering the memory leak?

A short, self contained example would help a lot.

Regards

@glyph
Member

glyph commented Mar 27, 2024

It's quite possible that this is a bug in Twisted but this is not enough information to act on it; it's possible that it's just your application, or perhaps a patch that Debian applied, given that it appears to be installed with dpkg rather than pip. We'd love to fix it, so please do resubmit if you can narrow it down to a reproducer!

@glyph glyph closed this as completed Mar 27, 2024
@vanand123
Author

It would be a bit difficult to provide a self-contained example, as my application code is tightly coupled and extracting one would take a lot of effort.
I tried to bisect the commits instead. My observations are as follows:

  1. pointed twisted branch to 9c192ca commit : No issue observed
  2. pointed twisted branch to 64506ce commit : Memory leak observed

Let me know if this would help.

@adiroiban
Member

Is this commit working for you?

df9d19c

If you can pinpoint the commit in trunk that introduced the error, that can help.

From what I can see, 64506ce is just a mypy typing change.

@glyph
Member

glyph commented Apr 17, 2024

From what I can see, 64506ce is just a mypy typing change.

It's possible that the Deferred debugging / Failure cleaning logic was unintentionally changed subtly, because that can depend intimately on the details of garbage collection, which is hard to have a test for. That might be a place to investigate in the application being discussed, i.e., does it have a bunch of circular and/or uncollectable references which might be generating cycles in the face of some subtle change there?

Another thing to check in this general area of hypotheses is: "is there anything in gc.garbage?" This is unlikely, but if you are using an old custom C extension it might be the issue.
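A generic sketch of both checks — hidden reference cycles and what ends up in gc.garbage (not Twisted-specific; the Node class is just an illustrative cycle, and gc.DEBUG_SAVEALL makes collected cycle members visible instead of freeing them):

```python
import gc

# DEBUG_SAVEALL makes the collector append every unreachable object it
# finds to gc.garbage instead of freeing it, so cycles become visible.
gc.set_debug(gc.DEBUG_SAVEALL)

class Node:
    def __init__(self):
        self.ref = None

# Create a reference cycle, then drop the only external references.
a, b = Node(), Node()
a.ref, b.ref = b, a
del a, b

unreachable = gc.collect()
print(f"collector found {unreachable} unreachable objects")
print(f"gc.garbage holds {len(gc.garbage)} saved objects")

# Inspect which types are caught up in cycles, then clean up.
print(sorted({type(o).__name__ for o in gc.garbage}))
gc.set_debug(0)
gc.garbage.clear()
```

If application-level types keep showing up there across collections, those cycles are worth chasing before blaming the library.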

@vanand123
Author

vanand123 commented Apr 19, 2024

Thanks @adiroiban for pointing out df9d19c merge commit.

I pointed trunk to 64506ce (where I observed the memory leak) and then reverted the changes from df9d19c; with that revert, I no longer see the memory leak.
So it seems some change in df9d19c is causing the leak.
I hope this helps.

@adiroiban
Member

Perfect. So the regression was introduced in df9d19c.

I think that we have a better starting point for troubleshooting :)

@adiroiban adiroiban reopened this Apr 19, 2024
@vanand123
Author

Is there any update on this?

-Thanks

@glyph
Member

glyph commented May 6, 2024

@vanand123 If anyone has updates, they will surely post them here.

In the meanwhile, a minimal reproducer would go a long way to make this feasible to investigate more easily.
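A reproducer skeleton along these lines might be a starting point; leaky_operation is a hypothetical placeholder to be swapped for the suspected code path (e.g. creating and firing Deferreds), and here deliberately retains data so the harness shows what a positive result looks like:

```python
import tracemalloc

def leaky_operation(sink: list) -> None:
    # Placeholder: replace with the suspected code path. Appending to
    # sink simulates a leak by retaining memory on every iteration.
    sink.append(bytearray(1024))

def measure(iterations: int = 1000) -> int:
    """Return net bytes still allocated after `iterations` runs."""
    sink: list = []
    tracemalloc.start()
    before, _ = tracemalloc.get_traced_memory()
    for _ in range(iterations):
        leaky_operation(sink)
    after, _ = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return after - before

growth = measure()
print(f"net growth after 1000 iterations: {growth} bytes")
```

If net growth stays roughly flat with one Twisted commit and climbs with the next, that plus the harness is a reproducer we can act on.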
