
[Bug]: Module Import Stack Trace sometimes broken #13248

Closed
smcenlly opened this issue Sep 12, 2022 · 9 comments · Fixed by #13245

@smcenlly

Version

29.0.3

Steps to reproduce

  1. Clone this repo: git clone https://github.com/smcenlly/jest-stack-trace-issue
  2. Install dependencies npm install
  3. From the CLI, run tests: npm run test
  4. See error and stack trace initially reported:
 FAIL  ./a.spec.js
  ✕ should correctly report stack trace (8 ms)

  ● should correctly report stack trace

    Cannot find module './my-missing-import' from 'a.spec.js'

      1 | it('should correctly report stack trace', () => {
    > 2 |   require('./my-missing-import');
        |   ^
      3 | });

      at Resolver._throwModNotFoundError (node_modules/jest-resolve/build/resolver.js:487:11)
      at Object.require (a.spec.js:2:3)

Test Suites: 1 failed, 1 total
Tests:       1 failed, 1 total
Snapshots:   0 total
Time:        0.205 s, estimated 1 s
Ran all test suites.

Watch Usage: Press w to show more.
  5. Add spaces at the end of line 2 (the line with the missing import). After adding a number of spaces, the missing import is no longer reported correctly and the output degrades to the one shown below:
 FAIL  ./a.spec.js
  ✕ should correctly report stack trace (1 ms)

  ● should correctly report stack trace

    Cannot find module './my-missing-import' from 'a.spec.js'



Test Suites: 1 failed, 1 total
Tests:       1 failed, 1 total
Snapshots:   0 total
Time:        0.136 s, estimated 1 s
Ran all test suites.

Watch Usage: Press w to show more.
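
For reference, the spec file in the repro repo boils down to the following (reconstructed from the code frame in the first output above):

// a.spec.js
// Requiring a module that does not exist produces the
// "Cannot find module" error shown in the outputs above.
it('should correctly report stack trace', () => {
  require('./my-missing-import');
});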

Expected behavior

The code frame and stack trace should always be reported, as in the first output above (step 4).

Actual behavior

After adding trailing whitespace, the code frame and stack trace are dropped entirely, as in the second output above (step 5).

Additional context

We have some automated integration tests as part of our product (https://wallabyjs.com) that test stack trace mapping for jest, and they identified this regression in the latest version of jest. I believe it is a result of @connectdotz's commit dbda13f.

Environment

  System:
    OS: Linux 5.15 Ubuntu 22.04.1 LTS 22.04.1 LTS (Jammy Jellyfish)
    CPU: (12) x64 Intel(R) Core(TM) i7-8700K CPU @ 3.70GHz
  Binaries:
    Node: 16.15.1 - ~/.nvm/versions/node/v16.15.1/bin/node
    Yarn: 1.22.19 - ~/.nvm/versions/node/v16.15.1/bin/yarn
    npm: 8.11.0 - ~/.nvm/versions/node/v16.15.1/bin/npm
  npmPackages:
    jest: ^29.0.3 => 29.0.3

and

  System:
    OS: macOS 12.5.1
    CPU: (8) arm64 Apple M1
  Binaries:
    Node: 16.15.1 - ~/.nvm/versions/node/v16.15.1/bin/node
    Yarn: 1.22.19 - ~/.nvm/versions/node/v16.15.1/bin/yarn
    npm: 8.11.0 - ~/.nvm/versions/node/v16.15.1/bin/npm
  npmPackages:
    jest: ^29.0.3 => 29.0.3
@SimenB
Member

SimenB commented Sep 12, 2022

Could you put together a repository showing the regression?

EDIT: Never mind, I'm blind - missed the link as it was in backticks 🙈

@SimenB
Member

SimenB commented Sep 12, 2022

This only happens with @babel/core@7.19.0 - forcing it to be @babel/core@7.18.13 makes it work properly.

Similarly, just adding transform: {} to the config works, as it disables babel-jest. See #13245 which shows kinda the same error in this repo.

So I'm guessing something about how @babel/core@7.19.0 produces sourcemaps is broken.
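
For reference, the transform: {} workaround is just this in the Jest config (a minimal sketch, assuming a standalone jest.config.js; the repro repo may configure Jest elsewhere):

// jest.config.js
// An empty transform map disables the default babel-jest transform,
// taking Babel out of the picture entirely.
module.exports = {
  transform: {},
};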


To test, apply this diff

diff --git i/package.json w/package.json
index 6a6a596..f5615a2 100644
--- i/package.json
+++ w/package.json
@@ -11,5 +11,8 @@
   "license": "ISC",
   "dependencies": {
     "jest": "^29.0.3"
+  },
+  "resolutions": {
+    "@babel/core": "7.18.13"
   }
 }

(I wasn't able to get overrides to work due to some peer dep stuff, so this is using Yarn instead of npm)

@smcenlly
Author

Thanks for the fast reply. I'll take a look at the babel source code and issues tomorrow and see if they have a report for this. A fresh start seems to work, so I'm guessing it may be due to some internal babel caching.

@SimenB
Member

SimenB commented Sep 12, 2022

Running Jest with --no-cache fails every time

@smcenlly
Author

I believe I've found the cause of the problem.

As you suspected in #13245, the relatively recent rewrite-stack-trace.ts is involved.

In rewrite-stack-trace.ts, Error.stackTraceLimit is adjusted inside a singleton setup function, which runs just once per process.

What's happening is as follows:

  1. Jest starts and sets Error.stackTraceLimit to 100.
  2. At some point Babel's rewrite code runs and increases the limit by a delta: Error.stackTraceLimit += STACK_TRACE_LIMIT_DELTA; (where const STACK_TRACE_LIMIT_DELTA = 100;).
  3. When errors are reported on the first run, the trace is truncated to newTrace.slice(0, Error.stackTraceLimit - STACK_TRACE_LIMIT_DELTA) (see here), which works as expected.
  4. When a process is re-used, Jest re-initializes Error.stackTraceLimit to 100.
  5. Because rewrite-stack-trace.ts operates as a singleton, it does not expect Error.stackTraceLimit to have been reset, and it completely truncates subsequent stack traces: Error.stackTraceLimit is now 100 and STACK_TRACE_LIMIT_DELTA is 100, so newTrace.slice(0, Error.stackTraceLimit - STACK_TRACE_LIMIT_DELTA) becomes newTrace.slice(0, 100 - 100) (see the sketch after this list).
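
A minimal sketch of that interaction, using the constants from the steps above (illustrative only, not the actual Jest or Babel source):

// 1. Jest initializes the limit in the worker process.
Error.stackTraceLimit = 100;

// 2. Babel's rewrite-stack-trace.ts bumps it once per process.
const STACK_TRACE_LIMIT_DELTA = 100;
Error.stackTraceLimit += STACK_TRACE_LIMIT_DELTA; // now 200

// 3. On the first run, trimming the extra frames works as expected:
//    newTrace.slice(0, Error.stackTraceLimit - STACK_TRACE_LIMIT_DELTA)
//    => slice(0, 200 - 100) => up to 100 frames survive.

// 4. The worker is re-used and Jest resets the limit.
Error.stackTraceLimit = 100;

// 5. The singleton never re-applies the delta, so the same slice becomes
//    slice(0, 100 - 100) => slice(0, 0) => an empty stack trace.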

You could probably work around the problem with a change to how jest sets/updates Error.stackTraceLimit, but I understand if you wouldn't want to do this.
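
For illustration, such a change could be as small as never lowering the limit once it has been raised (hypothetical, not what Jest currently does):

// Hypothetical: raise Error.stackTraceLimit to at least 100 without
// clobbering the delta Babel may already have applied.
Error.stackTraceLimit = Math.max(Error.stackTraceLimit, 100);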

Hopefully this gives you and the babel team enough information to identify how/where to fix the problem.

@SimenB
Member

SimenB commented Sep 13, 2022

Ah, that makes sense, thanks for digging! Maybe open an issue in the Babel repo about it? I don't think what Jest does is unreasonable.

@smcenlly
Author

Sure thing, will do it tomorrow (my time). Will link and share the issue here after I create it.

@smcenlly
Author

Have raised the issue with the babel team (link above).

@SimenB SimenB linked a pull request Sep 14, 2022 that will close this issue
@github-actions

This issue has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
Please note this issue tracker is not a help forum. We recommend using StackOverflow or our discord channel for questions.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Oct 15, 2022