Here is the full client.py code used to reproduce the issue. (The other files are really unremarkable.)
```python
import asyncio
import logging

import grpc.aio

import echo_pb2
import echo_pb2_grpc


class Interceptor(grpc.aio.StreamStreamClientInterceptor):

    async def intercept_stream_stream(
            self, continuation, call_details, request_iterator):
        return await continuation(call_details, request_iterator)


async def main():
    # If you remove the interceptor, there does not appear to be any leak
    channel = grpc.aio.insecure_channel('[::]:50051', interceptors=[Interceptor()])
    stream = echo_pb2_grpc.EchoServiceStub(channel).Echo()
    # start tasks to send / receive on the stream
    tasks = [asyncio.create_task(send(stream)),
             asyncio.create_task(recv(stream))]
    # Add this task to log the top objects in memory using pympler library.
    # Demonstrates which objects are not being collected.
    # tasks.append(asyncio.create_task(log_object_summary(interval=30.0)))
    await asyncio.gather(*tasks)


async def send(stream: grpc.aio.StreamStreamCall):
    await stream.wait_for_connection()
    for n in range(0, 1_000_000):
        await asyncio.sleep(0.001)
        await stream.write(echo_pb2.EchoRequest(
            message=f"message: {n}"
        ))
    await stream.done_writing()


async def recv(stream: grpc.aio.StreamStreamCall):
    await stream.wait_for_connection()
    async for response in stream:
        pass


async def log_object_summary(interval: float):
    from pympler import muppy, summary
    while True:
        await asyncio.sleep(interval)
        lines = summary.format_(summary.summarize(muppy.get_objects()), limit=20)
        logging.info('top objects:\n%s', '\n'.join(lines))


if __name__ == '__main__':
    logging.basicConfig(level='INFO')
    asyncio.run(main())
```
Fix commits referenced from this issue:
* Don't leave pending tasks on the asyncio queue: the results of these pending tasks are not needed, and leaving them on the queue grows the queue until the call completes. This fix slows the growth of memory in the test example.
* Address 'leaking' Futures from cygrpc: cancelling unneeded Tasks is not sufficient, as it leaves behind cancelled Futures in the cygrpc layer, which still occupy memory. Instead, avoid creating unneeded tasks in the first place.
* Address review comments: ignore unused return values; fix formatting.
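The first two fixes can be sketched with plain asyncio, independent of the gRPC internals. Here `bookkeeping` is a hypothetical stand-in for per-message work whose result nobody consumes; the point is the difference between scheduling a Task per message and awaiting the coroutine inline:

```python
import asyncio


async def bookkeeping(results):
    # hypothetical stand-in for per-message work whose result is never used
    results.append(None)


async def task_per_message(n):
    # Schedule a Task per message and hold the handles until the "call" ends.
    # This is the pattern the fix removes: n Task objects pile up on the loop.
    results, pending = [], []
    for _ in range(n):
        pending.append(asyncio.ensure_future(bookkeeping(results)))
    # Nothing has run yet, so every one of the n Tasks is still alive.
    backlog = sum(1 for t in pending if not t.done())
    await asyncio.gather(*pending)
    return backlog


async def await_inline(n):
    # Await the coroutine directly: no Task or Future object is ever created,
    # so nothing accumulates for the duration of the call.
    results = []
    for _ in range(n):
        await bookkeeping(results)
    # Only the currently running task exists.
    return len(asyncio.all_tasks())
```

Cancelling the unneeded Tasks would not be enough here either: the cancelled objects still exist until their last reference is dropped, which mirrors the cancelled-Futures-in-cygrpc observation above.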
What version of gRPC and what language are you using?
Language: Python
gRPC:
grpcio==1.35.0
What operating system (Linux, Windows,...) and version?
Linux (Fedora 32)
What runtime / compiler are you using (e.g. Python version or version of gcc)?
Python 3.8.7
What did you do?
In the aio implementation, I have observed that long-lived stream-stream RPCs leak memory -- but only when an interceptor is also used. Example code is posted to this repo: https://github.com/e-heller/grpc-python-aio-memory-leak
[Note: I would have preferred to use a canned example like route_guide.proto here, but the stream-stream RouteChat RPC is just not conducive to this example. Instead, I defined an "EchoService" with a single simple stream-stream RPC.]
The example uses this no-op interceptor (the Interceptor class in the full client.py listed above).
How to reproduce?
1. Compile the echo.proto file
2. Start the server.py
3. Run the client.py
Observe that the process memory RSS/VSZ grows unbounded.
If you like, inject some code into the client to observe what objects are accumulating.
I used the pympler lib in the example code (see the log_object_summary task in client.py above).
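To watch the process memory from inside the client without an external tool, a stdlib-only helper might look like the following (assumes Linux, where `ru_maxrss` is reported in KiB; this helper is not part of the original example):

```python
import resource


def peak_rss_kib() -> int:
    # Peak resident set size of this process; on Linux, ru_maxrss is in KiB.
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
```

Logging this value periodically alongside the pympler summary makes the unbounded RSS growth visible from the client's own output.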
Observe that if you remove the interceptor and run the client.py, the process memory RSS/VSZ remains constant. Using pympler as above does not show large numbers of uncollected objects.
What did you expect to see?
No memory leaks!
What did you see instead?
Memory leaks!
Specifically, you will observe a buildup of these object types, with a clear correlation between them.
These objects are not being garbage collected, indicating some kind of leak in the grpc library. The process RSS/VSZ grows without bound as well.
One way to see the uncollected objects building up in memory is to use a library like pympler and print out an object summary periodically. For example, I injected this as a Task in my client code (the log_object_summary coroutine in client.py above). After a short while, it will start logging reports showing the accumulating object types.
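If you would rather not add a dependency, a rough stdlib-only approximation of the pympler summary (using `gc` instead; not part of the original example) could be:

```python
import asyncio
import gc
from collections import Counter


def top_object_types(limit: int = 20):
    # Tally live, GC-tracked objects by type name -- a rough stdlib
    # stand-in for pympler's object summary.
    counts = Counter(type(o).__name__ for o in gc.get_objects())
    return counts.most_common(limit)


async def log_top_objects(interval: float, cycles: int) -> None:
    # Print a report every `interval` seconds, like the pympler task above.
    for _ in range(cycles):
        await asyncio.sleep(interval)
        for name, count in top_object_types():
            print(f'{name}: {count}')
```

Watching the per-type counts across reports shows which types keep growing while the RPC is in flight.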
Note: once the stream-stream call terminates, it appears that all of these objects are finally collected. So the leak persists only for the duration of the RPC.
Anything else we should know about your project / environment?
All files are also in the attached ZIP:
grpc-python-aio-memory-leak.zip