Multipart upload doesn't throw or reject when max upload parts exceeded #5964

Open
3 tasks done
piurafunk opened this issue Apr 4, 2024 · 1 comment
Open
3 tasks done
Labels: bug · p2 · queued

Comments

@piurafunk

Describe the bug

Exceeding Upload.MAX_PARTS (10,000 parts) does not cause an error to be thrown or the done() promise to be rejected; the upload completes silently with truncated data.

SDK version number

@aws-sdk/client-s3@3.549.0

Which JavaScript Runtime is this issue in?

Node.js

Details of the browser/Node.js/ReactNative version

v20.12.0

Reproduction Steps

import { S3 } from '@aws-sdk/client-s3'
import { Upload } from '@aws-sdk/lib-storage'

const s3 = new S3()
const bucket = 'bucket' // Update this to your bucket name

// The upload must be more than 52428800000 bytes (5 MiB x 10000 parts)
// To pass this via stdin, you can use the following command:
// $ dd if=/dev/urandom bs=5242880 count=11000 | node repro.mjs

// If you have pipeviewer installed, you can use the following command to see the progress:
// $ dd if=/dev/urandom bs=5242880 count=11000 | pv -s $((5242880 * 11000)) | node repro.mjs
const upload = new Upload({
    params: {
        Bucket: bucket,
        Key: 'bigfile.txt',
        Body: process.stdin,
    },
    client: s3,
    queueSize: 50,
})

await upload.done()
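
Equivalently, wrapping the final call in an explicit try/catch (an illustrative variant of the last line above, not part of the original script) makes the symptom obvious: the catch block is never entered.

// Same repro with an explicit try/catch: the catch block is never reached,
// even though the piped input exceeds the 10,000-part limit.
try {
    await upload.done()
    console.log('done() resolved')         // this is what actually happens
} catch (err) {
    console.error('done() rejected:', err) // expected, but never observed
}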

Observed Behavior

aws --profile sso-staging s3api get-object-attributes --bucket <bucket> --key bigfile.txt --object-attributes ObjectSize
{
    "LastModified": "2024-04-04T20:46:13+00:00",
    "VersionId": "lAYUcGXinSavYWPxPLsbfTFRWoD_UT5s",
    "ObjectSize": 52428800000
}

The object size is exactly 5 MiB x 10,000 parts: 52428800000 bytes, even though roughly 11,000 parts' worth of data was piped in. The multipart upload is also completed successfully, so from the caller's perspective everything appears to have worked; the data beyond part 10,000 is silently dropped.

I also tested with await upload.done().catch(err => { throw err }) in case the promise was being rejected rather than an error being thrown; the behavior is the same.

Expected Behavior

The code at https://github.com/aws/aws-sdk-js-v3/blob/v3.549.0/lib/lib-storage/src/Upload.ts#L195 should throw an exception (or reject the done() promise) when the maximum number of parts is exceeded.
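
For reference, the kind of guard one would expect to surface to the caller might look like the sketch below (illustrative only, not the SDK's actual internals; MAX_PARTS and partNumber are assumed names).

// Illustrative sketch, not the SDK's actual code: whatever internal check
// detects the limit, the resulting error should reject upload.done() rather
// than letting the upload complete with truncated data.
if (partNumber > MAX_PARTS) {
    throw new Error(`Exceeded ${MAX_PARTS} parts for this multipart upload; aborting.`)
}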

Possible Solution

I'm not a Node.js expert, but I suspect it has something to do with the error being thrown inside an async function whose promise is never awaited, so the rejection never surfaces to the caller. See the sketch below.
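
For illustration, the following standalone Node.js example (not SDK code) shows how such an error can get lost: a rejection from an async function whose promise is never awaited does not propagate to the caller.

// Standalone demonstration, not SDK code: a rejection from a fire-and-forget
// async call never reaches the caller; it only shows up as an unhandledRejection.
process.on('unhandledRejection', (err) => console.error('lost error:', err))

async function uploadPart() {
    throw new Error('exceeded 10000 parts')
}

async function done() {
    uploadPart() // promise is never awaited, so its rejection is lost here
    return 'ok'
}

console.log(await done()) // prints "ok"; a try/catch around done() would not fire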

Additional Information/Context

No response

@piurafunk added the bug and needs-triage labels on Apr 4, 2024
@piurafunk changed the title from "TITLE FOR BUG REPORT" to "Multipart upload doesn't throw or reject when max upload parts exceeded" on Apr 4, 2024
@RanVaknin self-assigned this on Apr 9, 2024
@RanVaknin (Contributor) commented:

Hi @piurafunk ,

Thanks for your patience. I'm able to reproduce the reported behavior. It's not clear to me at this point why an error is not thrown by the SDK. I will review this and let you know.

Thanks,
Ran~

@RanVaknin added the needs-review, p2, and investigating labels and removed the needs-triage and needs-review labels on Apr 22, 2024
@kuhe added the queued label and removed the investigating label on May 7, 2024
@kuhe self-assigned this on May 7, 2024