Description
Describe the bug
If a command has too much output, then even though the command might fail, the step gets marked as passing.
I measured "too much" as ≥1026 KiB, but I wouldn't be surprised if it's timing-related and varies at any value much over 1024 KiB.
I've created a reproducer over at https://github.com/LukeShu/gha-test where each step always() runs, dumps an increasing amount of output to stdout, and then exits with a failure. As seen in the screenshot below, everything <=1MiB correctly fails, but everything >=1⅛MiB starts passing.
This can be replicated locally, as well:
(All you need to know about my gha-test reproducer is that make -C ../gha-test bytes-1152 spits out 1152 KiB of <=80-character lines on stdout, and then exits with code 1.)
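For illustration, a rough Node stand-in for that target (an approximation, not the actual Makefile recipe in gha-test) could look like:

```ts
// approx-bytes-1152.ts — approximates the gha-test "bytes-1152" target:
// print 1152 KiB of short lines on stdout, then fail with exit code 1.
// This is an illustrative stand-in, not the reproducer's actual recipe.
import { writeSync } from "fs";

const line = "a".repeat(79) + "\n"; // <=80-character lines
let written = 0;
while (written < 1152 * 1024) {
  writeSync(1, line); // fd 1 = stdout; synchronous, so nothing is dropped on exit
  written += line.length;
}
process.exit(1); // the command itself fails
```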
$ INPUT_CONTINUE_ON_ERROR=false INPUT_MAX_ATTEMPTS=1 INPUT_TIMEOUT_MINUTES=5 INPUT_COMMAND='make -C ../gha-test bytes-1152' node ./dist/index.js
…
9: 648 aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
a: 729 aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
b: 810 aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
c: 891 aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
d: 972 aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
::debug::Code: null
::debug::Signal: SIGPIPE
Command completed after 1 attempt(s).
::set-output name=exit_code::0
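A similar shape (a null exit code plus a signal instead of the command's real failure code) can be reproduced with plain child_process.exec() once a command's output exceeds what exec() will buffer; a minimal sketch, assuming that mechanism, and not the action's own code:

```ts
// Minimal sketch: a command that fails with exit code 1, but first prints
// more output than exec() will buffer (default cap is 1024 * 1024 bytes).
import { exec } from "child_process";

const child = exec(
  "yes aaaa | head -c 1179648; exit 1", // ~1152 KiB, then a deliberate failure
  { maxBuffer: 1024 * 1024 },           // exec()'s documented default cap
  (error) => {
    // error is non-null here, typically "stdout maxBuffer length exceeded"
    console.log("callback error:", error && error.message);
  }
);

child.on("exit", (code, signal) => {
  // Once the buffer limit is hit, Node kills the child, so this reports
  // code=null plus a signal (e.g. SIGTERM or SIGPIPE) rather than code=1.
  // Any success check that only tests for a non-zero code will then pass.
  console.log("code:", code, "signal:", signal);
});
```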
I originally encountered this with v2.4.0, but the local reproduction above is with the latest version (v2.7.0).
Expected behavior
I expected that if my tests fail, then my CI will fail.
Logs
Enable debug logging then attach the raw logs (specifically the raw output of this action).
Activity
Changed the title from "Marks failing commands as having succeeded if they have too much output" to "Marks failing commands as having succeeded if they have too much output (SIGPIPE)"
.github: Don't retry tests
tstackhouse commented on Aug 3, 2022
Doing some research on this, it looks like this could be connected with the maxBuffer option of child_process.exec(), which defaults to 1024 * 1024 bytes; see the NodeJS docs: https://nodejs.org/api/child_process.html#child_processexeccommand-options-callback

Doing a bit more research, is there a reason to not use child_process.spawn, which streams output to the parent process rather than buffering it? See this S/O thread: https://stackoverflow.com/a/65248598
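A minimal sketch of the spawn-based alternative suggested here (illustrative only; the command string and options are assumptions, not the action's implementation):

```ts
// spawn() streams the child's output to the parent instead of buffering it,
// so there is no maxBuffer cap to hit and the command's real exit code
// survives arbitrarily large output.
import { spawn } from "child_process";

const child = spawn("make -C ../gha-test bytes-1152", {
  shell: true,      // run the command string through a shell, as exec() does
  stdio: "inherit", // stream stdout/stderr straight through to the parent
});

child.on("exit", (code, signal) => {
  // With streaming there is no output limit, so this reports the command's
  // own exit code (1 for the reproducer target) instead of null.
  console.log("code:", code, "signal:", signal);
});
```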
test: try to repro #76
nick-fields commented on Aug 4, 2022
Thanks for this report. I just published v2.8.0 that will resolve this. Thanks to your repo, I was able to reproduce the issue before the fix and then confirm it was fixed after by testing with up to 1000 MiB of output.