Marks failing commands as having succeeded if they have too much output (SIGPIPE) #76

Closed
LukeShu opened this issue Jun 14, 2022 · 2 comments
LukeShu commented Jun 14, 2022

Describe the bug
If a command produces too much output, the step gets marked as passing even if the command fails.

I measured "too much" as ≥1026 KiB, but I wouldn't be surprised if it's timing-related and varies at any value much over 1024 KiB.

I've created a reproducer over at https://github.com/LukeShu/gha-test where each step runs with always(), dumps an increasing amount of output to stdout, and then exits with a failure. As seen in the screenshot below, everything ≤1 MiB correctly fails, but everything ≥1⅛ MiB starts passing.

This can be replicated locally, as well:

(All you need to know about my gha-test reproducer is that `make -C ../gha-test bytes-1152` spits out 1152 KiB of ≤80-character lines on stdout and then exits with code 1.)

$ INPUT_CONTINUE_ON_ERROR=false INPUT_MAX_ATTEMPTS=1 INPUT_TIMEOUT_MINUTES=5 INPUT_COMMAND='make -C ../gha-test bytes-1152' node ./dist/index.js

9:  648 aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
a:  729 aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
b:  810 aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
c:  891 aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
d:  972 aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
::debug::Code: null
::debug::Signal: SIGPIPE
Command completed after 1 attempt(s).

::set-output name=exit_code::0
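
My guess at the mechanism (just a sketch on my part, not the action's actual source): if only a positive numeric exit code is treated as a failure, then a child that was killed by a signal reports a null code and slips through as a success, which would explain the exit_code::0 above.

```ts
// Hypothetical illustration of the suspected failure mode, not the action's code:
// a child killed by a signal has code === null, so a "code > 0 means failure"
// check reads it as success.
function didSucceed(code: number | null, signal: NodeJS.Signals | null): boolean {
  return !(code && code > 0);
}

didSucceed(1, null);          // false -- a normal failure is caught
didSucceed(null, 'SIGPIPE');  // true  -- a signal-killed child counts as "success"
```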

I originally encountered this with v2.4.0, but the local reproduction above is with the latest version (v2.7.0).

Expected behavior
I expected that if my tests fail, then my CI will fail.

Screenshots
[screenshot: 2022-06-14-164123_261x599_scrot]

Logs
Debug logging was enabled; the raw output of this action is shown in the local reproduction above.

LukeShu changed the title from "Marks failing commands as having succeeded if they have too much output" to "Marks failing commands as having succeeded if they have too much output (SIGPIPE)" on Jun 14, 2022
LukeShu added a commit to emissary-ingress/emissary that referenced this issue Jun 14, 2022
Besides test retries being bad (they're there for the case that the test
intermittently flakes, but mask the case that the code-under-test only
intermittently works), this is important in order to remove the
`nick-invision/retry@v2.4.0` GitHub action.  It has a bug where if there
is too much output it marks the test as passing even if it fails.  That is
very bad, and we cannot permit that type of failure mode in our CI.

nick-fields/retry#76

Signed-off-by: Luke Shumaker <lukeshu@datawire.io>
bathina2 pushed a commit to bathina2/emissary that referenced this issue Jul 1, 2022
(Same commit message as above, with an additional Signed-off-by: Sirish Bathina <sirish@kasten.io>.)

tstackhouse commented Aug 3, 2022

After doing some research on this, it looks like it could be connected to the maxBuffer option of child_process.exec(), which defaults to 1024 * 1024 bytes; see the Node.js docs: https://nodejs.org/api/child_process.html#child_processexeccommand-options-callback
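
For illustration, a minimal sketch (assuming a plain child_process.exec call, which may not be exactly what the action does) of how that 1 MiB default interacts with the exit status:

```ts
import { exec } from 'child_process';

// maxBuffer defaults to 1024 * 1024 bytes; if the combined output exceeds it,
// Node terminates the child instead of letting it exit on its own.
const child = exec('make -C ../gha-test bytes-1152', { maxBuffer: 1024 * 1024 });

child.on('exit', (code, signal) => {
  // With >1 MiB of output this reports code=null plus a signal (SIGPIPE in the
  // debug log above) rather than the command's real exit code of 1.
  console.log(`code=${code} signal=${signal}`);
});
```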

Doing a bit more research: is there a reason not to use child_process.spawn, which streams output to the parent process rather than buffering it? See this Stack Overflow answer: https://stackoverflow.com/a/65248598. A rough sketch follows.
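
A minimal sketch of what that could look like (hypothetical, reusing the reproducer command from above):

```ts
import { spawn } from 'child_process';

// Stream the child's output to the parent instead of buffering it, so there is
// no maxBuffer to overflow and the real exit status is preserved.
const child = spawn('make', ['-C', '../gha-test', 'bytes-1152']);

child.stdout.pipe(process.stdout);
child.stderr.pipe(process.stderr);

child.on('close', (code, signal) => {
  if (signal !== null) {
    console.error(`command was killed by ${signal}`);
    process.exitCode = 1;
  } else {
    process.exitCode = code ?? 1;
  }
});
```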

nick-fields added four commits that referenced this issue Aug 4, 2022
nick-fields (Owner) commented

Thanks for this report. I just published v2.8.0, which resolves this. Thanks to your repo, I was able to reproduce the issue before the fix and then confirm it was fixed afterwards by testing with up to 1000 MiB of output.
