Potential OOM error #338

Closed
canonic-epicure opened this issue Dec 22, 2021 · 3 comments · Fixed by #469

@canonic-epicure

I just noticed that c8 will probably suffer from the same OOM issue as nyc: istanbuljs/nyc#1263

It is caused by the fact that c8 loads all reports into memory first and then merges them. Instead, it should load and merge them one by one (or in batches up to a certain limit).
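
A minimal sketch of that idea (my illustration, not c8's actual code), assuming @bcoe/v8-coverage's mergeProcessCovs can be applied pairwise to fold each report into a running result:

  const { readdirSync, readFileSync } = require('fs')
  const { resolve } = require('path')
  const { mergeProcessCovs } = require('@bcoe/v8-coverage')

  // merge coverage reports one file at a time, so only the running result
  // and the current report are held in memory
  function mergeReportsIncrementally (tempDirectory) {
    let merged
    for (const file of readdirSync(tempDirectory)) {
      const processCov = JSON.parse(readFileSync(resolve(tempDirectory, file), 'utf8'))
      // a real implementation would also skip non-coverage files and
      // normalize entries the way c8's report.js does
      merged = merged === undefined ? processCov : mergeProcessCovs([merged, processCov])
    }
    return merged
  }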

jan-molak added a commit to serenity-js/serenity-js that referenced this issue May 2, 2023
It seems like C8 is suffering from the same memory issue NYC used to suffer from

Related tickets: bcoe/c8#338 istanbuljs/nyc#1263
@jan-molak commented May 2, 2023

I can reproduce this OOM error in an open-source project.

It seems the issue is caused by the number (464) and size (0.5–8 MB each) of tmp files required to generate the coverage report, combined with the fact that C8 tries to load them all into memory before processing (as per @canonic-epicure's point above).

I'm happy to share the output needed to reproduce the problem; it's ~300 MB, so I can't attach it to the ticket.

> c8 report --reporter=html --reporter=lcov --reporter=text-summary --temp-directory=./target/coverage/tmp --report-dir=./target/coverage


<--- Last few GCs --->

[1912:0x733f090]    31169 ms: Scavenge 2015.2 (2063.6) -> 2014.3 (2062.9) MB, 14.1 / 0.0 ms  (average mu = 0.314, current mu = 0.304) allocation failure; 
[1912:0x733f090]    31539 ms: Scavenge (reduce) 2030.6 (2074.5) -> 2031.0 (2070.2) MB, 12.3 / 0.0 ms  (average mu = 0.314, current mu = 0.304) allocation failure; 
[1912:0x733f090]    31569 ms: Scavenge (reduce) 2033.6 (2070.4) -> 2033.2 (2071.9) MB, 18.9 / 0.0 ms  (average mu = 0.314, current mu = 0.304) allocation failure; 


<--- JS stacktrace --->

FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
 1: 0xb7a940 node::Abort() [node]
 2: 0xa8e823  [node]
 3: 0xd5c940 v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [node]
 4: 0xd5cce7 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [node]
 5: 0xf3a3e5  [node]
 6: 0xf3b2e8 v8::internal::Heap::RecomputeLimits(v8::internal::GarbageCollector) [node]
 7: 0xf4b7f3  [node]
 8: 0xf4c668 v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [node]
 9: 0xf26fce v8::internal::HeapAllocator::AllocateRawWithLightRetrySlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [node]
10: 0xf28397 v8::internal::HeapAllocator::AllocateRawWithRetryOrFailSlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [node]
11: 0xf08d92 v8::internal::Factory::AllocateRawWithAllocationSite(v8::internal::Handle<v8::internal::Map>, v8::internal::AllocationType, v8::internal::Handle<v8::internal::AllocationSite>) [node]
12: 0xf135ec v8::internal::Factory::NewJSObjectFromMap(v8::internal::Handle<v8::internal::Map>, v8::internal::AllocationType, v8::internal::Handle<v8::internal::AllocationSite>) [node]
13: 0xf13c65 v8::internal::Factory::NewJSArrayWithUnverifiedElements(v8::internal::Handle<v8::internal::FixedArrayBase>, v8::internal::ElementsKind, int, v8::internal::AllocationType) [node]
14: 0xf13ed2 v8::internal::Factory::NewJSArray(v8::internal::ElementsKind, int, int, v8::internal::ArrayStorageAllocationMode, v8::internal::AllocationType) [node]
15: 0x103c6b9 v8::internal::JsonParser<unsigned short>::BuildJsonArray(v8::internal::JsonParser<unsigned short>::JsonContinuation const&, v8::base::SmallVector<v8::internal::Handle<v8::internal::Object>, 16ul, std::allocator<v8::internal::Handle<v8::internal::Object> > > const&) [node]
16: 0x104427e v8::internal::JsonParser<unsigned short>::ParseJsonValue() [node]
17: 0x104515f v8::internal::JsonParser<unsigned short>::ParseJson() [node]
18: 0xde2523 v8::internal::Builtin_JsonParse(int, unsigned long*, v8::internal::Isolate*) [node]
19: 0x16fb7b9  [node]
Aborted (core dumped)
make: *** [Makefile:76: report] Error 134

I believe the large number of tmp files is due to C8 producing one tmp file per process. I have over a dozen components (Node modules) in my project, many of which run integration tests using Mocha with the --parallel flag. Those tests spin off child processes running the system under test.
The number of resulting tmp files looks reasonable given the project size, and proportional to the number of processes I'd expect to see.

The report aggregation command (c8 report) shown in the listing above works fine on a machine with 4 GB allocated to the Node.js process, but fails on GitHub Actions running with default settings. Configuring Node to use a 4 GB heap works around the problem on GitHub Actions:

node --max-old-space-size=4096 node_modules/.bin/c8 report --reporter=html --reporter=lcov --reporter=text-summary --temp-directory=./target/coverage/tmp --report-dir=./target/coverage
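
For CI, the same limit can also be set via Node's NODE_OPTIONS environment variable instead of invoking node explicitly; an equivalent invocation (shown here only for illustration) would be:

NODE_OPTIONS=--max-old-space-size=4096 npx c8 report --reporter=html --reporter=lcov --reporter=text-summary --temp-directory=./target/coverage/tmp --report-dir=./target/coverage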

Proposed solution

@bcoe - would it make sense to make the _loadReports method:

c8/lib/report.js

Lines 246 to 259 in 2f36fe9

  _loadReports () {
    const reports = []
    for (const file of readdirSync(this.tempDirectory)) {
      try {
        reports.push(JSON.parse(readFileSync(
          resolve(this.tempDirectory, file),
          'utf8'
        )))
      } catch (err) {
        debuglog(`${err.stack}`)
      }
    }
    return reports
  }

cache each entry using the same strategy _getMergedProcessCov uses?

c8/lib/report.js

Lines 177 to 188 in 2f36fe9

  _getMergedProcessCov () {
    const { mergeProcessCovs } = require('@bcoe/v8-coverage')
    const v8ProcessCovs = []
    const fileIndex = new Set() // Set<string>
    for (const v8ProcessCov of this._loadReports()) {
      if (this._isCoverageObject(v8ProcessCov)) {
        if (v8ProcessCov['source-map-cache']) {
          Object.assign(this.sourceMapCache, this._normalizeSourceMapCache(v8ProcessCov['source-map-cache']))
        }
        v8ProcessCovs.push(this._normalizeProcessCov(v8ProcessCov, fileIndex))
      }
    }

So something like this, perhaps:

  _loadReports () {
    const v8ProcessCovs = []
    // fileIndex would need to move here from _getMergedProcessCov
    const fileIndex = new Set() // Set<string>

    try {
      for (const file of readdirSync(this.tempDirectory)) {
        // parse and normalise one report at a time, so the raw JSON can be
        // garbage-collected before the next file is read
        const v8ProcessCov = JSON.parse(readFileSync(
          resolve(this.tempDirectory, file),
          'utf8'
        ))

        if (this._isCoverageObject(v8ProcessCov)) {
          if (v8ProcessCov['source-map-cache']) {
            Object.assign(this.sourceMapCache, this._normalizeSourceMapCache(v8ProcessCov['source-map-cache']))
          }
          v8ProcessCovs.push(this._normalizeProcessCov(v8ProcessCov, fileIndex))
        }
      }

      return v8ProcessCovs
    } catch (error) {
      debuglog(`${error.stack}`)
      throw error
    }
  }

I'm happy to propose a PR if you agree with the approach, @bcoe?

@bizob2828 (Contributor)

@jan-molak do you think you can test my PR #469? We were having the same issue, and after applying my fix we no longer are.

jan-molak added a commit to serenity-js/serenity-js that referenced this issue May 5, 2023
It seems like C8 is suffering from the same memory issue NYC used to suffer from

Related tickets: bcoe/c8#338 istanbuljs/nyc#1263
@bcoe closed this as completed in #469 May 26, 2023
jan-molak added a commit to serenity-js/serenity-js that referenced this issue Aug 1, 2023
…by c8

enabling the --merge-async flag helps to avoid c8 running out of memory, see bcoe/c8#338
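
The --merge-async flag mentioned above is a c8 report command-line option; reusing the paths from the earlier example, an invocation enabling it might look like this (illustrative only):

c8 report --merge-async --reporter=html --reporter=lcov --reporter=text-summary --temp-directory=./target/coverage/tmp --report-dir=./target/coverage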
@jan-molak

@bizob2828 - apologies for not getting back to you sooner, and thanks for your work on #469.
Yes, I can confirm it resolves the issue for Serenity/JS 👍🏻

Abe27342 added a commit to microsoft/FluidFramework that referenced this issue Dec 19, 2023
…18910)

## Description

This should resolve the intermittent OOM issues we've been seeing on CI.
The cause appears to have been [this issue](bcoe/c8#338): by default, c8
attempts to merge coverage reports by loading them all into memory at the
same time. Our root c8 config includes many files, so this can trigger
OOM issues.

I've updated the repo to c8@8.0.1, since we needed to bump the c8
dependency anyway to use the merge-async option, and the only breaking
change between 7 and 8 was dropping Node 10 support.

---------

Co-authored-by: Abram Sanderson <absander@microsoft.com>