
Cannot pipe output to S3 with @aws-sdk/lib-storage. #3313

Closed
3 tasks done
BlakeB415 opened this issue Jul 30, 2022 · 15 comments
Comments

BlakeB415 commented Jul 30, 2022

Possible bug

Error: Premature close
    at new NodeError (node:internal/errors:372:5)
    at Sharp.onclose (node:internal/streams/end-of-stream:136:30)
    at Sharp.emit (node:events:527:28)
    at Object.<anonymous> (/usr/src/app/node_modules/sharp/lib/output.js:1144:16) {
  code: 'ERR_STREAM_PREMATURE_CLOSE'
}

Is this a possible bug in a feature of sharp, unrelated to installation?

  • Running npm install sharp completes without error.
  • Running node -e "require('sharp')" completes without error.

Are you using the latest version of sharp?

  • I am using the latest version of sharp as reported by npm view sharp dist-tags.latest.

If you are using another package which depends on a version of sharp that is not the latest, please open an issue against that package instead.

What is the output of running npx envinfo --binaries --system --npmPackages=sharp --npmGlobalPackages=sharp?

  System:
    OS: Windows 10 10.0.22000
    CPU: (12) x64 AMD Ryzen 5 2600 Six-Core Processor
    Memory: 1.19 GB / 15.95 GB
  Binaries:
    Node: 17.9.0 - C:\ProgramData\chocolatey\bin\node.EXE
    Yarn: 1.22.19 - C:\ProgramData\chocolatey\bin\yarn.CMD
    npm: 8.5.5 - C:\ProgramData\chocolatey\bin\npm.CMD
  npmPackages:
    sharp: ^0.30.7 => 0.30.7 

What are the steps to reproduce?

Use the AWS SDK for JavaScript v3 to stream the image output to a bucket: pipe sharp's output into the Upload instance as the Body.

What is the expected behaviour?

It successfully uploads to S3 without error.

Please provide a minimal, standalone code sample, without other dependencies, that demonstrates this problem

This only happens with the AWS SDK v3 + sharp, so it is hard to replicate without them.

import sharp from 'sharp';
import fs from 'fs';
import AWS from '@aws-sdk/client-s3';
import { Upload } from '@aws-sdk/lib-storage';

const resizer = sharp().resize({ fit: sharp.fit.cover, height: 800, width: 800 }).avif();

const upload = new Upload({
  client: new AWS.S3({
    region: 'us-west-000',
    endpoint: 'https://s3.us-west-000.backblazeb2.com',
    credentials: {
      accessKeyId: '[REDACTED]',
      secretAccessKey: '[REDACTED]',
    }
  }),
  params: { Bucket: 'bucket-name', Key: 'test.avif', Body: fs.createReadStream('./input.png').pipe(resizer) }
});

await upload.done();

Please provide sample image(s) that help explain this problem

N/A


BlakeB415 commented Jul 30, 2022

Removing this.emit('close'); from the finish event handler on the erroring line in sharp's output.js fixes it.

@lovell lovell added question and removed triage labels Jul 30, 2022

lovell commented Jul 30, 2022

Please can you add some error handling here:

fs.createReadStream('./input.png').pipe(resizer)

My best guess would be that something is going wrong on the input side.
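A minimal sketch of the kind of error handling being asked for, using PassThrough streams as stand-ins for the file stream and the sharp resizer (the wiring, not the image processing, is the point here). Note that pipe() does not forward errors downstream, which is why each stream in the chain needs its own handler:

```javascript
import { PassThrough } from 'node:stream';

// Stand-ins for fs.createReadStream('./input.png') and the sharp resizer.
const input = new PassThrough();
const resizer = new PassThrough();

let inputError = null;
let resizerError = null;
input.on('error', (err) => { inputError = err; });
resizer.on('error', (err) => { resizerError = err; });

input.pipe(resizer);

// Simulate an input-side failure. pipe() unpipes on error but does NOT
// re-emit the error on the destination stream.
input.emit('error', new Error('simulated read failure'));
```

With handlers attached this way, an input-side failure surfaces on the input stream's handler only; if nothing shows up there in the real code, the input side is probably healthy.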


BlakeB415 commented Jul 30, 2022

> Please can you add some error handling here:
>
> fs.createReadStream('./input.png').pipe(resizer)
>
> My best guess would be that something is going wrong on the input side.

There doesn't seem to be any error on the input stream. I've also tried input streams from both the fs module and the fastify-multipart library; the issue is the same.


lovell commented Jul 31, 2022

> Node: 17.9.0 - C:\ProgramData\chocolatey\bin\node.EXE

This version of Node.js is EOL, please can you upgrade to the latest version 18 release and see if you have the same problem.

@BlakeB415

> Node: 17.9.0 - C:\ProgramData\chocolatey\bin\node.EXE
>
> This version of Node.js is EOL, please can you upgrade to the latest version 18 release and see if you have the same problem.

This still occurs with v18.7.0.


lovell commented Aug 1, 2022

Thanks for confirming, I asked as there have been a number of stream-related changes to Node.js throughout versions 17 and 18.

Do you only see this problem on Windows? Are you able to test on e.g. Linux?

Looking at the source of the @aws-sdk/lib-storage package, it appears to use async generators, which is a newer approach to reading from a Stream and one I suspect is less well tested for edge cases.

https://github.com/aws/aws-sdk-js-v3/blob/main/lib/lib-storage/src/chunks/getChunkStream.ts

The next step is to try to replicate the same problem without @aws-sdk/lib-storage to help determine if this is a bug in sharp, the @aws-sdk/lib-storage package or Node.js itself. I had a quick try but couldn't replicate it locally.
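For anyone attempting that replication, the async-generator read path can be sketched without the SDK at all. This is an assumed simplification of what getChunkStream does, not the SDK's actual code (the chunkStream name and the chunk size are mine): a Readable is an async iterable, so it can be consumed with for await and re-sliced into fixed-size parts, the way a multipart upload needs.

```javascript
import { Readable } from 'node:stream';

// Re-chunk an arbitrary Readable into fixed-size Buffers.
// Hypothetical stand-in for lib-storage's getChunkStream.
async function* chunkStream(stream, chunkSize) {
  let buffered = Buffer.alloc(0);
  for await (const piece of stream) {
    buffered = Buffer.concat([buffered, piece]);
    while (buffered.length >= chunkSize) {
      yield buffered.subarray(0, chunkSize);
      buffered = buffered.subarray(chunkSize);
    }
  }
  if (buffered.length > 0) yield buffered; // final short part
}

// Drive it with an in-memory source instead of sharp's output.
const source = Readable.from([Buffer.from('abcde'), Buffer.from('fgh')]);
const parts = [];
for await (const part of chunkStream(source, 4)) {
  parts.push(part.toString());
}
// parts → ['abcd', 'efgh']
```

Swapping the in-memory source for a sharp pipeline here would exercise the same for-await path the SDK uses, without any AWS dependency.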


danalloway commented Aug 3, 2022

@BlakeB415 we have a similar flow in our app, and we used a PassThrough stream to work around this same issue

import sharp from "sharp";
import { PassThrough } from "stream";
import { S3Client } from "@aws-sdk/client-s3";
import { Upload } from "@aws-sdk/lib-storage";

async function doUpload(data, fileName) {
    const uploadStream = new PassThrough(); // <-------- HERE

    const imageStream = sharp()
        .resize({
          width: 1400,
          height: 1400,
          fit: "inside",
          withoutEnlargement: true,
        })
        .jpeg({
          force: true,
          mozjpeg: true,
          optimizeCoding: true,
          progressive: true,
          quality: 80,
        });

    const s3Client = new S3Client({
        apiVersion: "v4",
        credentials: {
            accessKeyId: "[REDACTED]",
            secretAccessKey: "[REDACTED]",
        },
        endpoint: "[REDACTED]",
        region: "auto",
    });

    const upload = new Upload({
        client: s3Client,
        queueSize: 1,
        params: {
            Bucket: "[REDACTED]",
            ContentType: "image/jpeg",
            Key: "[REDACTED]",
            Body: uploadStream, // <-------- HERE
        },
    });

    data.pipe(imageStream).pipe(uploadStream); // <-------- HERE

    const results = await upload.done();

    return results;
}

We're using a newer version of the S3 client; in the function above, data is the file stream in question.

Also, anecdotally, the S3 client behaves strangely with streams in general. We've found our two options are to (1) pass it the file stream directly, and nothing else, or (2) give it a PassThrough and use that PassThrough as the end of another pipe (as shown above).

hope this helps


lovell commented Sep 5, 2022

@BlakeB415 Were you able to make any progress with this?

@BlakeB415

> @BlakeB415 Were you able to make any progress with this?

Removing this line worked for me. https://github.com/lovell/sharp/blob/main/lib/output.js#L1208


lovell commented Sep 16, 2022

@BlakeB415 Thanks for the update, please can you try the following change to see if that fixes it also:

--- a/lib/constructor.js
+++ b/lib/constructor.js
@@ -168,7 +168,7 @@ const Sharp = function (input, options) {
   if (!(this instanceof Sharp)) {
     return new Sharp(input, options);
   }
-  stream.Duplex.call(this);
+  stream.Duplex.call(this, { emitClose: false });
   this.options = {
     // resize options
     topOffsetPre: -1,

I think this might relate to nodejs/node#32965 - it looks like we need to notify Node.js that sharp will emit the close event itself.

@BlakeB415

> @BlakeB415 Thanks for the update, please can you try the following change to see if that fixes it also:
>
> --- a/lib/constructor.js
> +++ b/lib/constructor.js
> @@ -168,7 +168,7 @@ const Sharp = function (input, options) {
>    if (!(this instanceof Sharp)) {
>      return new Sharp(input, options);
>    }
> -  stream.Duplex.call(this);
> +  stream.Duplex.call(this, { emitClose: false });
>    this.options = {
>      // resize options
>      topOffsetPre: -1,
>
> I think this might relate to nodejs/node#32965 - it looks like we need to notify Node.js that sharp will emit the close event itself.

I still get the error with this patch.


lovell commented Sep 18, 2022

@BlakeB415 Thanks for checking. Here's an alternative approach, which ensures the close event waits for the end event. If you're able to test it with your S3 code that would be great.

-          this.emit('close');
+          this.on('end', () => this.emit('close'));
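For context, here is a sketch of why the ordering matters, using a plain PassThrough as a stand-in for sharp's Duplex (this is my assumption of the mechanics, not sharp's actual code): emitting 'close' before 'end' looks like an abort to anything watching the stream with stream.finished (as the AWS SDK does), while deferring 'close' until after 'end' restores the order Node.js expects.

```javascript
import { PassThrough } from 'node:stream';
import { finished } from 'node:stream/promises';
import { once } from 'node:events';

// 1) 'close' before 'end' is treated as a premature close by watchers.
const broken = new PassThrough({ emitClose: false });
const watcher = finished(broken);
broken.emit('close'); // close fires while the stream has neither ended nor finished
let code = null;
try { await watcher; } catch (err) { code = err.code; }

// 2) Deferring 'close' until after 'end' (the change proposed above)
//    keeps the expected event order.
const fixed = new PassThrough({ emitClose: false });
const order = [];
fixed.on('end', () => { order.push('end'); fixed.emit('close'); });
fixed.on('close', () => order.push('close'));
fixed.resume();       // consume the readable side so 'end' can fire
fixed.end('chunk');
await once(fixed, 'close');
```

In case 1 the watcher rejects with ERR_STREAM_PREMATURE_CLOSE, the same code reported in this issue; in case 2 the events arrive as 'end' then 'close'.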

@BlakeB415

> @BlakeB415 Thanks for checking. Here's an alternative approach, which ensures the close event waits for the end event. If you're able to test it with your S3 code that would be great.
>
> -          this.emit('close');
> +          this.on('end', () => this.emit('close'));

This worked flawlessly!


lovell commented Sep 20, 2022

Thanks for confirming, updated via commit 70e6bb0 and this will be in v0.31.1.

@lovell lovell added this to the v0.31.1 milestone Sep 20, 2022

lovell commented Sep 29, 2022

v0.31.1 now available

@lovell lovell closed this as completed Sep 29, 2022