Scheduler memory leak on v7 #6624

Closed
andrevmatos opened this issue Oct 1, 2021 · 5 comments

@andrevmatos

Bug Report

Current Behavior
While investigating redux-observable/redux-observable#754, the root cause appears to be in rxjs@7's scheduler code. Whenever we use an inner observable (via mergeMap) whose outer observable is scheduled or piped through observeOn(queueScheduler), it leaks thousands of QueueActions retained by OperatorSubscriber._teardowns. These hold potentially heavy contexts in memory and aren't freed even after that observable chain has completed and the application keeps no references to those values. In our real-world redux-observable app, every emitted action carries its context with it, including redux's state, quickly filling up all available memory.

This also seems to affect asyncScheduler.
This bug does NOT happen on rxjs@6 with the same code.

This bug looks like #6458, but unlike that one, we create only a few thousand values in memory and give the queue the opportunity to flush; even though everything processes normally, the *Actions leak and are never freed. It also happens with lazy and async generators instead of range. The problem is not the peak memory usage while processing the inner observable, but the garbage that accumulates and is never collected after the inner observable completes.
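In short, the problematic shape boils down to something like this (a rough sketch, not the exact code I tested; the full repro is under Reproduction below):

import { queueScheduler, range, Subject } from "rxjs";
import { mergeMap, observeOn } from "rxjs/operators";

// Outer subject whose emissions fan out into an inner async mergeMap, with
// delivery re-scheduled on queueScheduler, mirroring the setup described above.
const start$ = new Subject();

start$
  .pipe(
    mergeMap(() => range(0, 50e3).pipe(mergeMap(async (i) => i))),
    // each re-scheduled emission becomes a QueueAction; on rxjs@7.3.0 these stay
    // retained via OperatorSubscriber._teardowns even after completion
    observeOn(queueScheduler)
  )
  .subscribe();

start$.next(true);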

Expected behavior
The scheduled QueueAction objects and their contexts should be freed after the observable completes. Memory usage after __end is expected to be below 10 MB, but e.g. on Node 14 it ends up above 40 MB.

Reproduction

Note: I had some trouble taking a heap snapshot in Chrome with the repro code. A workaround is to paste this code into an index.mjs file in a Node.js 14.x project containing only rxjs@^7.3.0 as a dependency, run node --inspect-brk index.mjs, and connect chrome://inspect to it. After the code ends, the last setTimeout keeps the process open, and a heap snapshot will show 50k+ QueueActions lying in memory and never getting cleared.

import { from, of, queueScheduler, range, scheduled, Subject } from "rxjs";
import { map, mergeMap, observeOn, subscribeOn } from "rxjs/operators";

function rootEpic(start$) {
  return start$.pipe(
    mergeMap(() =>
      range(0, 50e3).pipe(
        mergeMap(async (c) => "".padStart(c, "a"))
        // tap((a) => console.info('__a', a)),
      )
    )
  );
}

function main() {
  const startSub = new Subject();
  const result$ = of(rootEpic).pipe(
    map((epic) => epic(startSub)),
    // # option 1: redux-observable-like/observeOn
    mergeMap((output$) =>
      from(output$).pipe(subscribeOn(queueScheduler), observeOn(queueScheduler))
    )
    // # option 2: scheduled, bug also present
    // mergeMap((output$) => scheduled(output$, queueScheduler)),
    // # option 3: no explicit scheduler, bug doesn't happen
    // mergeMap((output$) => output$),
  );

  let lastStr = "";
  result$.subscribe((s) => {
    lastStr = s;
  });
  startSub.next(true);

  setTimeout(() => console.info("__end", lastStr.length), 0);
  setTimeout(() => console.info("__exit"), 3600e3);
}

main();
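As an alternative to the heap snapshot, heap usage can also be checked programmatically (a rough sketch, not part of the repro I tested; logHeap is a made-up helper, and it assumes the script is started with node --expose-gc index.mjs so that global.gc is available):

// Hypothetical helper to log heap usage after a forced garbage collection, so
// that only objects still retained (e.g. leaked QueueActions) are counted.
function logHeap(label) {
  if (global.gc) global.gc(); // available only when Node runs with --expose-gc
  const mb = process.memoryUsage().heapUsed / 1024 / 1024;
  console.info(label, `${mb.toFixed(1)} MB heap used`);
}

// For example, wired into the __end timeout inside main() above:
// setTimeout(() => { console.info("__end", lastStr.length); logHeap("__heap"); }, 0);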

Environment

  • Runtime: Node.js 14.x, Chrome v94 (although I couldn't take a heap snapshot with it)
  • RxJS version: 7.3.0

Possible Solution

Additional context/Screenshots
[Screenshot: Screenshot_20210930_152955]

@kwonoj
Member

kwonoj commented Oct 1, 2021

We fixed this via #6559 and #6562; I found the leak regression for exactly this reason (redux-observable). We still have to publish a new version, though.

@benlesh I recall we discussed a patch version release for this; maybe it's worth doing?

@andrevmatos
Author

Whoosh, I spent 3 days on your issue tracker searching for any reference to an issue similar to mine; I can't believe I didn't see those PRs 🤦
Thank you very much for already having looked into it. A patch release for this would be very much appreciated!

@kwonoj
Member

kwonoj commented Oct 1, 2021

Yes, sorry, I forgot to file an issue and headed directly over to the PR. My bad.

@benlesh
Member

benlesh commented Oct 1, 2021

@andrevmatos I've just published these fixes as 7.3.1. Give that a try.

@andrevmatos
Author

It worked flawlessly! Base memory for the repro above on Node 14, a few seconds after the end, went back down to 4.7 MB!
Thank you very much @benlesh and @kwonoj for the quick answer and your readiness to help with this. You guys rock!
