
Chore: Increase Mocha timeout for copying fixtures #13768

Merged · 1 commit merged into master on Oct 20, 2020
Conversation

@btmills (Member) commented Oct 18, 2020

Prerequisites checklist

What is the purpose of this pull request? (put an "X" next to an item)

[ ] Documentation update
[ ] Bug fix (template)
[ ] New rule (template)
[ ] Changes an existing rule (template)
[ ] Add autofixing to a rule
[ ] Add a CLI option
[ ] Add something to the core
[x] Other, please explain: Fix GitHub Actions test flakiness

What changes did you make? (Give an overview)

For the last several weeks, our CI jobs have been flaky due to test hook timeouts on the Windows and macOS runners. I traced most of the failures to four before hooks that copy the test fixtures to a temporary directory. They usually take ~2.5s, but that occasionally spikes to tens of seconds: in the two times I was able to repro the timeouts, the slowest copy jobs were 25 and 32 seconds.

Is there anything you'd like reviewers to focus on?

  • Rather than increasing the global test timeout, I instead bumped it just for the four hooks that were causing problems (see the sketch after this list). Is the narrower approach acceptable?
  • The previous timeout was 10 seconds. Is 60 seconds for these four hooks too much? In 8 samples (2 successful repros × 4 hooks), the longest I saw was 32 seconds.
  • Since these hooks just call out to the shell, I didn't try any optimization that might speed up these operations. Is there something easy I missed? My guess is that resource contention on shared runners is causing slow filesystem performance, but I haven't proven that's the root cause.
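For context, here's a minimal sketch of how a longer timeout can be scoped to a single Mocha `before` hook via `this.timeout()`. The suite name, fixture paths, and shelljs-based copy below are illustrative assumptions, not the PR's exact code:

```js
// Sketch: scoping a longer Mocha timeout to one fixture-copying hook.
// The hook body (paths, shelljs usage) is an assumption for illustration.
const path = require("path");
const os = require("os");
const sh = require("shelljs");

const fixtureDir = path.join(os.tmpdir(), "eslint/fixtures");

describe("suite that reads fixtures from disk", () => {

    // Use a regular function (not an arrow function) so `this` is Mocha's context.
    before(function() {

        // Copying the fixture tree can occasionally take tens of seconds on
        // shared CI runners, so give this hook (and only this hook) 60s
        // instead of the global 10s timeout.
        this.timeout(60 * 1000);

        sh.mkdir("-p", fixtureDir);
        sh.cp("-r", "./tests/fixtures/.", fixtureDir);
    });

    after(() => {
        sh.rm("-r", fixtureDir);
    });
});
```

Because `this.timeout()` is called inside the hook, the global 10-second default still applies to everything else, so a genuine performance regression elsewhere would still fail fast.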

@btmills added the accepted (there is consensus among the team that this change meets the criteria for inclusion) and chore (this change is not user-facing) labels on Oct 18, 2020
@mdjermanovic (Member) commented:

  • Rather than increasing the global test timeout, I instead bumped it just for these four hooks that were causing problems. Is the narrower approach acceptable?

👍 I think this is a good idea since it increases the timeout only for operations that don't involve ESLint in any way. Increasing the global test timeout could hide performance issues with ESLint itself.

@mdjermanovic (Member) left a comment:

LGTM, thanks!

@mdjermanovic merged commit 84fd591 into master on Oct 20, 2020
@mdjermanovic deleted the macos-timeouts branch on Oct 20, 2020
@eslint-github-bot locked and limited conversation to collaborators on Apr 19, 2021
@eslint-github-bot added the archived due to age label on Apr 19, 2021