
fix(featureDev): Prevent crash on large repos or unsupported encoding during file collection #4888

Merged
merged 1 commit into from
Apr 30, 2024

Conversation

jsalles
Contributor

@jsalles jsalles commented Apr 30, 2024

Problem

There are currently two problems in featureDev when collecting repository files:

  • trying to decode binary files can crash the extension host, and
  • very large repos can make the extension run out of available memory.

Solution

This CR fixes both problems by:

  • Adding a maximum allowed repo size of 200 MB for collection
  • Changing the behavior of the TextDecoder to replace unsupported characters instead of throwing a TypeError

Users who try to use featureDev on a codebase larger than 200 MB will see a message asking them to select a different workspace.

License

By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.

@jpinkney-aws
Contributor

Just one more thing, are you able to add a changelog? I'm sure there's some users that would like to know this is fixed 😄

@jsalles jsalles requested a review from a team as a code owner April 30, 2024 13:39
@jsalles
Contributor Author

jsalles commented Apr 30, 2024

> Just one more thing, are you able to add a changelog? I'm sure there's some users that would like to know this is fixed 😄

Yes, definitely! Just added a changelog with the bugfix :)

@@ -137,9 +153,9 @@ export async function prepareRepoData(
span: Metric<AmazonqCreateUpload>
) {
try {
const files = await collectFiles(repoRootPaths, workspaceFolders, true)
const zip = new AdmZip()
Contributor
The current approach used to zip files does not "stream" the zip creation, so it will also run into memory constraints.

Please take a look at #4769. Can you rewrite this feature on top of that (after it lands)?

Contributor

@grimalk grimalk Apr 30, 2024

What's the ETA of this being ready? I presume we want both of these in the same wave of the toolkit?

Contributor Author

Yeah, I'm happy to change our implementation to use the streaming approach after it's merged.

Contributor

> What's the ETA of this being ready?

The zip streaming won't be ready today, but hopefully next week. @ctlai95

@justinmk3 justinmk3 merged commit fa680b5 into aws:master Apr 30, 2024
14 checks passed