
[bug] EMFILE: too many open files #485

Open
konradpabjan opened this issue Dec 19, 2023 · 10 comments · May be fixed by actions/toolkit#1723
Labels
bug Something isn't working

Comments

@konradpabjan
Collaborator

konradpabjan commented Dec 19, 2023

What happened?

Stumbled on a failure that happened with v4. Apparently this doesn't happen with v3.


Example run can be found here: https://github.com/spacetelescope/jdaviz/actions/runs/7266578163/job/19798598137

What did you expect to happen?

Successful artifact upload

How can we reproduce it?

Not sure, but I think the key here is the large number of files. The logs show 8824 files, which is quite a bit. Maybe some read stream isn't being closed somewhere 🤔
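(For context, a minimal sketch of the suspected failure mode, not taken from the action's source: Node's fs.createReadStream opens a file descriptor as soon as the stream is constructed, so building one stream per file for thousands of files can exhaust the per-process open-file limit before any data is consumed, surfacing as an unhandled 'error' event on a ReadStream like in the traces below.)

// emfile_sketch.js — hypothetical illustration only, not upload-artifact code.
// Each fs.createReadStream() call opens a file descriptor right away, before
// any data is read; with enough files this trips the EMFILE limit.
const { createReadStream, readdirSync } = require('node:fs');
const { join } = require('node:path');

const dir = process.argv[2] || '.';
const files = readdirSync(dir, { withFileTypes: true })
  .filter((entry) => entry.isFile())
  .map((entry) => join(dir, entry.name));

// One descriptor per file, all held open at once:
const streams = files.map((path) => createReadStream(path));
console.log(`created ${streams.length} read streams`);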

Anything else we need to know?

Nope

What version of the action are you using?

v4

What are your runner environments?

windows

Are you on GitHub Enterprise Server? If so, what version?

No

konradpabjan added the bug label on Dec 19, 2023
@zaikunzhang

Also seen in my workflow.

Run actions/upload-artifact@v4.0.0
With the provided path, there will be 73166 files uploaded
Artifact name is valid!
Root directory input is valid!
Beginning upload of artifact content to blob storage
node:events:492
      throw er; // Unhandled 'error' event
      ^

Error: EMFILE: too many open files, open '/home/runner/work/prima/prima/prima_prima_quadruple.1_20.ubln.single.231227_1211_start/HS63'
Emitted 'error' event on ReadStream instance at:
    at emitErrorNT (node:internal/streams/destroy:151:8)
    at emitErrorCloseNT (node:internal/streams/destroy:116:3)
    at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
  errno: -24,
  code: 'EMFILE',
  syscall: 'open',
  path: '/home/runner/work/prima/prima/prima_prima_quadruple.1_20.ubln.single.231227_1211_start/HS63'
}

Node.js v20.8.1

@robherley
Contributor

We can possibly fix with the file method in archiver:

Appends a file given its filepath using a lazystream wrapper to prevent issues with open file limits.

Right now we are just using the append operation and creating a read stream per file.
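A rough sketch of the two archiver calls being contrasted; the file list and entry-name fields here are illustrative, not the toolkit's actual upload code:

// sketch.js — illustrative only; filesToUpload and its fields are hypothetical.
const { createReadStream, createWriteStream } = require('node:fs');
const archiver = require('archiver');

const filesToUpload = [
  // { sourcePath: '/abs/path/on/disk', destinationPath: 'path/in/zip' }, ...
];

const archive = archiver('zip', { zlib: { level: 6 } });
archive.pipe(createWriteStream('artifact.zip'));

for (const file of filesToUpload) {
  // Current approach: one ReadStream per file, each opening a descriptor
  // up front — with tens of thousands of files this can hit EMFILE.
  // archive.append(createReadStream(file.sourcePath), { name: file.destinationPath });

  // Suggested approach: archiver's file() wraps the path in lazystream,
  // so the descriptor is only opened when the entry is actually written.
  archive.file(file.sourcePath, { name: file.destinationPath });
}

archive.finalize();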

@SMoraisAnsys

Bumping as I'm facing the same issue on a repo :)

SMoraisAnsys added a commit to ansys/pyaedt that referenced this issue Jan 17, 2024
Problem: it seems that using the new version (v4) of upload-artifact
adds some upper bound on the number of files to be uploaded.

Solution: reverting our actions to use v3 and making changes compatible
with other ansys actions, i.e. changing some ansys/action/... to v4

Associated issue : actions/upload-artifact#485
@sungaila

sungaila commented Jan 25, 2024

Same issue here when I tried to upload 9,834 files for a single artifact. The compression-level made no difference.

Had to downgrade to upload-artifact@v3 and download-artifact@v3.

SMoraisAnsys added a commit to ansys/pyaedt that referenced this issue Jan 30, 2024
Note: some changes are conflicting with the current full
documentation upload process (cf
actions/upload-artifact#485)

Extra: removing set-output as it's deprecated by GitHub
@qwerttvv

qwerttvv commented Feb 2, 2024

actions/upload-artifact@main
  with:
    name: MSYS
    path: C:\MSYS
    if-no-files-found: warn
    compression-level: 6
    overwrite: false
  env:
    MSYSTEM: MINGW64
With the provided path, there will be 51574 files uploaded
Artifact name is valid!
Root directory input is valid!
Beginning upload of artifact content to blob storage
node:events:492
      throw er; // Unhandled 'error' event
      ^

Error: EMFILE: too many open files, open 'C:\MSYS\msys64\mingw64\lib\python3.11\distutils\tests\__pycache__\test_cygwinccompiler.cpython-311.pyc'
Emitted 'error' event on ReadStream instance at:
    at emitErrorNT (node:internal/streams/destroy:151:8)
    at emitErrorCloseNT (node:internal/streams/destroy:116:3)
    at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
  errno: -4066,
  code: 'EMFILE',
  syscall: 'open',
  path: 'C:\\MSYS\\msys64\\mingw64\\lib\\python3.11\\distutils\\tests\\__pycache__\\test_cygwinccompiler.cpython-311.pyc'
}

Node.js v20.8.1

Same bug at the latest commit: https://github.com/actions/upload-artifact/tree/3a8048248f2f288c271830f8ecf2a1c5d8eb0e9a

actions/upload-artifact@v3 works, but with a warning:

Warning: There are over 10,000 files in this artifact, consider creating an archive before upload to improve the upload performance.

rjanvier added a commit to 3DFin/3DFin that referenced this issue Feb 2, 2024
mnixry added a commit to mnixry/binutils-wasm that referenced this issue Feb 4, 2024
@zaikunzhang

Is this fixed yet?

@ehuelsmann

rjanvier added a commit to 3DFin/3DFin that referenced this issue Mar 15, 2024
rjanvier added a commit to 3DFin/3DFin that referenced this issue Mar 16, 2024
* Add a CD system for 3DFin CC Plugin

* Improve build

- fix tests
- improve build times by removing CCViewer
- add upload artifact

* Improve ci build

- Use current 3DFin
- remove duplicated install

* downgrade to upload-artifact v3

- v4 is bugged actions/upload-artifact#485

* Update other build actions

* lower the compression requirement to speed up build

* Use pip and install-qt-action vs. conda

Use venv in CI

Revert the use of venv

python config

...

* Revert to upload-artifact v3

zip the archive

* laszip building
@pcfreak30

Ditto, just ran into the same problem :/

@rmunn

rmunn commented Apr 29, 2024

I've switched actions/toolkit#1723 (proposed fix for retaining file permissions in uploaded .zip files) to use zip.file(file.sourcePath) instead of zip.append(createReadStream(file.sourcePath)), so that PR can also fix this bug at the same time. It needs a review from the repo maintainers before it can go anywhere, though.
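For illustration, a minimal sketch of the lazystream pattern that archiver's file() relies on, which is why the change above avoids EMFILE; the file names below are placeholders:

// lazy_sketch.js — illustrative only; the listed paths are placeholders.
const fs = require('node:fs');
const lazystream = require('lazystream');

function lazyReadStream(path) {
  // No descriptor is opened here; fs.createReadStream only runs when the
  // wrapped stream is first read, e.g. when archiver writes this entry.
  return new lazystream.Readable((options) => fs.createReadStream(path, options));
}

// Many of these can exist at once without exhausting descriptors, because
// only the entries currently being consumed hold an open fd.
const entries = ['file-0001.txt', 'file-0002.txt' /* , ... */].map(lazyReadStream);

// A descriptor for entries[0] would only be opened at this point:
// entries[0].pipe(process.stdout);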

@marcodali

Is there a command we can run in the Docker container that would let us increase the number of open file descriptors to, say, 10k?

Wouldn't that fix this issue?

CoderDen732 added a commit to CoderDen732/zed that referenced this issue May 20, 2024