Bump squoosh to the latest version #29506
Conversation
```js
) => {
  const simplePng = pngEncDec.encode(
    new Uint8Array(buffer),
    width,
    height
  )
  const imageData = oxipng.optimise(simplePng, opts.level)
  oxipng.cleanup()
```
This is generally OK I think (point 3 in #23565). Removing it will not leak more memory, but it won't clean up the allocated memory for the finished task. When a new task arrives, the memory will still get released and new memory is allocated for the new task. So the behavior is basically:
- Current canary: new task (100MB↑), GC (100MB↓), new task (100MB↑), GC (100MB↓), ...
- With this PR: new task (100MB↑), new task (unchanged), new task (unchanged), ...
For long-running servers, it's definitely ideal to keep this logic though (or fix it in the upstream lib).
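To make the trade-off concrete, here's a rough sketch assuming the `optimise`/`cleanup` glue shown above; the function name, the `releaseAfterTask` flag, and the 100MB figure are illustrative, not code from this PR:

```js
// Sketch only: mirrors the oxipng glue above.
function handlePngTask(png, level, releaseAfterTask) {
  const out = oxipng.optimise(png, level) // allocates wasm memory for this task
  if (releaseAfterTask) {
    // current canary: drop the instance so GC can reclaim the ~100MB now
    oxipng.cleanup()
  }
  // with this PR (no cleanup), the finished task's memory lingers until the
  // next task re-initializes the module, so usage plateaus at roughly one
  // task's worth instead of sawtoothing up and down
  return out
}
```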
When I applied the patch that set `wasm = null`, it caused null reference errors during `next build`:

```
$ yarn next build examples/image-component
info - Checking validity of types
info - Creating an optimized production build
Failed to compile.

./public/vercel.png
~/next.js/packages/next/server/lib/squoosh/resize/squoosh_resize.js:91
wasm.__wbindgen_add_to_stack_pointer(16)
^
TypeError: Cannot read property '__wbindgen_add_to_stack_pointer' of null
```

https://github.com/vercel/next.js/runs/3750453542#step:7:1154

Oddly enough, the `next dev` tests passed 🤔

Maybe I can try the lazy loading suggestion in the other comment to see if that works 🕵️
It seems I needed both the `cleanup()` calls as well as the lazy `init()` calls (c3bfac4).

Now it's working 👍
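For reference, a hypothetical sketch of how the two pieces pair up; names, the no-import instantiation, and the exported `resize` are assumptions (the actual change is in c3bfac4):

```js
const fs = require('fs/promises')

let wasm = null

// Lazily (re)instantiate the module; called at the top of every export.
async function init(wasmPath) {
  if (wasm === null) {
    const bytes = await fs.readFile(wasmPath)
    // assumes a module with no imports, unlike the real wasm-bindgen glue
    const { instance } = await WebAssembly.instantiate(bytes, {})
    wasm = instance.exports
  }
}

// Drop the instance so its memory can be reclaimed between tasks.
function cleanup() {
  wasm = null
}

async function resize(wasmPath, ptr, len) {
  await init(wasmPath) // without this guard, wasm is null here -> the TypeError above
  return wasm.resize(ptr, len) // assumes the module exports `resize`
}
```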
Also, why did permissions on the wasm files change? Any specific reason?
```js
if (process['argv'].length > 1) {
  thisProgram = process['argv'][1].replace(/\\/g, '/')
}
arguments_ = process['argv'].slice(2)
quit_ = function (status) {
  process['exit'](status)
}
```
Glad that they've removed the `process.on('uncaughtException')` event handlers. 👍
```diff
-const pngEncDecInit = () =>
-  pngEncDec.default(fsp.readFile(pathify(pngEncDecWasm)))
+const pngEncDecPromise = pngEncDec.default(fsp.readFile(pathify(pngEncDecWasm)))

 // OxiPNG
 // @ts-ignore
 import * as oxipng from './png/squoosh_oxipng.js'
 const oxipngWasm = path.resolve(__dirname, './png/squoosh_oxipng_bg.wasm')
-const oxipngInit = () => oxipng.default(fsp.readFile(pathify(oxipngWasm)))
+const oxipngPromise = oxipng.default(fsp.readFile(pathify(oxipngWasm)))

 // Resize
 // @ts-ignore
 import * as resize from './resize/squoosh_resize.js'
 const resizeWasm = path.resolve(__dirname, './resize/squoosh_resize_bg.wasm')
-const resizeInit = () => resize.default(fsp.readFile(pathify(resizeWasm)))
+const resizePromise = resize.default(fsp.readFile(pathify(resizeWasm)))
```
My concern here is that all those WASM modules will always be initialized immediately. Previously, with `() => resize.default(fsp.readFile(pathify(resizeWasm)))`, they were only initialized lazily.

For example, if the resize processor isn't used, `squoosh_resize_bg.wasm` will still be loaded in this worker. Since the workers are keyed by the task they handle:

next.js/packages/next/server/lib/squoosh/main.ts, lines 16 to 25 in 92c2dd4:
```ts
const getWorker = execOnce(
  () =>
    new Worker(path.resolve(__dirname, 'impl'), {
      enableWorkerThreads: true,
      // There will be at most 6 workers needed since each worker will take
      // at least 1 operation type.
      numWorkers: Math.max(1, Math.min(cpus().length - 1, 6)),
      computeWorkerKey: (method) => method,
    })
)
```
Previously, this ensured that one worker only loads and handles one type of task. For example:

```ts
return Buffer.from(await worker.encodeJpeg(imageData, { quality }))
```

This worker has only loaded the `encodeJpeg`-related WASM module, and `encodeJpeg` tasks will always be assigned to this same worker. That ensures we don't allocate the memory for every WASM module inside every worker.
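A small sketch of keeping the init lazy while still reading and compiling each wasm file at most once; it reuses the names from the diff above, with `resizeReady` as an illustrative memoization variable:

```js
// Sketch only: memoize the lazy initializer so squoosh_resize_bg.wasm is
// loaded at most once, and only in workers that actually receive resize tasks.
let resizeReady

function resizeInit() {
  if (!resizeReady) {
    resizeReady = resize.default(fsp.readFile(pathify(resizeWasm)))
  }
  return resizeReady
}

// in the resize task handler:
//   await resizeInit()
//   const output = resize.resize(...)
```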
We support Node 12+, which already has the global `TextDecoder`.
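As a minimal illustration (the memory-view line shows how wasm-bindgen glue typically reads strings, not code from this PR):

```js
// On Node 12+, TextDecoder is a global, so no require('util') is needed.
const decoder = new TextDecoder('utf-8')
console.log(decoder.decode(new Uint8Array([104, 105]))) // "hi"
// wasm-bindgen glue decodes strings from wasm memory the same way, e.g.
//   decoder.decode(new Uint8Array(wasm.memory.buffer, ptr, len))
```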
@Timer This PR just copies the wasm files as is; they're executable in the upstream repo (for example, see mozjpeg_node_dec.wasm). That being said, I don't think wasm files need to be executable since they don't run standalone.
Stats from current PR

Default Build (Increase detected ⚠️)

General

| | vercel/next.js canary | vercel/next.js bump-squoosh-2021-09 | Change |
| --- | --- | --- | --- |
| buildDuration | 14.7s | 14.8s | |
| buildDurationCached | 3.8s | 3.6s | -243ms |
| nodeModulesSize | 193 MB | 197 MB | |
Page Load Tests (Overall decrease ⚠️)

| | vercel/next.js canary | vercel/next.js bump-squoosh-2021-09 | Change |
| --- | --- | --- | --- |
| / failed reqs | 0 | 0 | ✓ |
| / total time (seconds) | 3.255 | 3.176 | -0.08 |
| / avg req/sec | 767.94 | 787.15 | +19.21 |
| /error-in-render failed reqs | 0 | 0 | ✓ |
| /error-in-render total time (seconds) | 1.472 | 1.496 | |
| /error-in-render avg req/sec | 1698.29 | 1671.06 | |
Client Bundles (main, webpack, commons)

| | vercel/next.js canary | vercel/next.js bump-squoosh-2021-09 | Change |
| --- | --- | --- | --- |
| 779.HASH.js gzip | 179 B | 179 B | ✓ |
| framework-HASH.js gzip | 42.2 kB | 42.2 kB | ✓ |
| main-HASH.js gzip | 26.9 kB | 26.9 kB | ✓ |
| webpack-HASH.js gzip | 1.45 kB | 1.45 kB | ✓ |
| Overall change | 70.8 kB | 70.8 kB | ✓ |
Legacy Client Bundles (polyfills)

| | vercel/next.js canary | vercel/next.js bump-squoosh-2021-09 | Change |
| --- | --- | --- | --- |
| polyfills-a4..dd70.js gzip | 31 kB | 31 kB | ✓ |
| Overall change | 31 kB | 31 kB | ✓ |
Client Pages (Overall decrease ✓)

| | vercel/next.js canary | vercel/next.js bump-squoosh-2021-09 | Change |
| --- | --- | --- | --- |
| _app-HASH.js gzip | 977 B | 977 B | ✓ |
| _error-HASH.js gzip | 194 B | 194 B | ✓ |
| amp-HASH.js gzip | 311 B | 311 B | ✓ |
| css-HASH.js gzip | 328 B | 328 B | ✓ |
| dynamic-HASH.js gzip | 2.67 kB | 2.67 kB | ✓ |
| head-HASH.js gzip | 351 B | 351 B | ✓ |
| hooks-HASH.js gzip | 918 B | 918 B | ✓ |
| image-HASH.js gzip | 4.15 kB | 4.12 kB | -27 B |
| index-HASH.js gzip | 260 B | 260 B | ✓ |
| link-HASH.js gzip | 1.66 kB | 1.66 kB | ✓ |
| routerDirect..HASH.js gzip | 320 B | 320 B | ✓ |
| script-HASH.js gzip | 386 B | 386 B | ✓ |
| withRouter-HASH.js gzip | 319 B | 319 B | ✓ |
| bb14e60e810b..30f.css gzip | 125 B | 125 B | ✓ |
| Overall change | 13 kB | 12.9 kB | -27 B |
Client Build Manifests

| | vercel/next.js canary | vercel/next.js bump-squoosh-2021-09 | Change |
| --- | --- | --- | --- |
| _buildManifest.js gzip | 494 B | 494 B | ✓ |
| Overall change | 494 B | 494 B | ✓ |
Rendered Page Sizes

| | vercel/next.js canary | vercel/next.js bump-squoosh-2021-09 | Change |
| --- | --- | --- | --- |
| index.html gzip | 539 B | 539 B | ✓ |
| link.html gzip | 552 B | 552 B | ✓ |
| withRouter.html gzip | 534 B | 534 B | ✓ |
| Overall change | 1.63 kB | 1.63 kB | ✓ |
Diffs

Diff for _buildManifest.js

```diff
@@ -17,7 +17,7 @@ self.__BUILD_MANIFEST = {
     "static\u002Fchunks\u002Fpages\u002Fhooks-b247d7d93b59ee105030.js"
   ],
   "/image": [
-    "static\u002Fchunks\u002Fpages\u002Fimage-ac34fa70dbc1572d05a7.js"
+    "static\u002Fchunks\u002Fpages\u002Fimage-5e12b7aa334834dd607c.js"
   ],
   "/link": ["static\u002Fchunks\u002Fpages\u002Flink-6c203ee999e47493e19d.js"],
   "/routerDirect": [
```

Diff for image-HASH.js

```diff
@@ -970,7 +970,7 @@
   height: 1347,
   width: 1626,
   blurDataURL:
-    "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAgAAAAHCAQAAACfUMTVAAAAP0lEQVR42jXIQQqDUBTF0Lf/rdZZB9W2/0REDNwbyATxaJSuCZlafn3tDmHq49Vm87ZkpKPd8n9CF+7XFOImJ8SebtyxyeM6AAAAAElFTkSuQmCC"
+    "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAgAAAAHCAAAAAAQMlOCAAAAMElEQVR42hXJOQ4AIBDDwPz/q9BRcG6MMpUlCweWCSPqnrWxmK33Ucjeq16CyHKYD8vhNzo5Br4tAAAAAElFTkSuQmCC"
   };
 // EXTERNAL MODULE: ./node_modules/react/jsx-runtime.js
 var jsx_runtime = __webpack_require__(5893); // CONCATENATED MODULE: ./pages/image.js
```
Default Build with SWC (Increase detected ⚠️)

General (Overall increase ⚠️)

| | vercel/next.js canary | vercel/next.js bump-squoosh-2021-09 | Change |
| --- | --- | --- | --- |
| buildDuration | 7.6s | 7.7s | |
| buildDurationCached | 3.7s | 3.8s | |
| nodeModulesSize | 193 MB | 197 MB | |
Page Load Tests (Overall increase ✓)

| | vercel/next.js canary | vercel/next.js bump-squoosh-2021-09 | Change |
| --- | --- | --- | --- |
| / failed reqs | 0 | 0 | ✓ |
| / total time (seconds) | 3.262 | 3.152 | -0.11 |
| / avg req/sec | 766.36 | 793.25 | +26.89 |
| /error-in-render failed reqs | 0 | 0 | ✓ |
| /error-in-render total time (seconds) | 1.613 | 1.519 | -0.09 |
| /error-in-render avg req/sec | 1549.76 | 1645.34 | +95.58 |
Client Bundles (main, webpack, commons)

| | vercel/next.js canary | vercel/next.js bump-squoosh-2021-09 | Change |
| --- | --- | --- | --- |
| 675-HASH.js gzip | 13.8 kB | 13.8 kB | ✓ |
| 770.HASH.js gzip | 178 B | 178 B | ✓ |
| framework-HASH.js gzip | 50.7 kB | 50.7 kB | ✓ |
| main-HASH.js gzip | 34.9 kB | 34.9 kB | ✓ |
| webpack-HASH.js gzip | 1.64 kB | 1.64 kB | ✓ |
| Overall change | 101 kB | 101 kB | ✓ |
Legacy Client Bundles (polyfills)

| | vercel/next.js canary | vercel/next.js bump-squoosh-2021-09 | Change |
| --- | --- | --- | --- |
| polyfills-a4..dd70.js gzip | 31 kB | 31 kB | ✓ |
| Overall change | 31 kB | 31 kB | ✓ |
Client Pages (Overall decrease ✓)

| | vercel/next.js canary | vercel/next.js bump-squoosh-2021-09 | Change |
| --- | --- | --- | --- |
| _app-HASH.js gzip | 1.33 kB | 1.33 kB | ✓ |
| _error-HASH.js gzip | 180 B | 180 B | ✓ |
| amp-HASH.js gzip | 315 B | 315 B | ✓ |
| css-HASH.js gzip | 331 B | 331 B | ✓ |
| dynamic-HASH.js gzip | 2.79 kB | 2.79 kB | ✓ |
| head-HASH.js gzip | 356 B | 356 B | ✓ |
| hooks-HASH.js gzip | 637 B | 637 B | ✓ |
| image-HASH.js gzip | 573 B | 557 B | -16 B |
| index-HASH.js gzip | 262 B | 262 B | ✓ |
| link-HASH.js gzip | 2.2 kB | 2.2 kB | ✓ |
| routerDirect..HASH.js gzip | 326 B | 326 B | ✓ |
| script-HASH.js gzip | 393 B | 393 B | ✓ |
| withRouter-HASH.js gzip | 322 B | 322 B | ✓ |
| bb14e60e810b..30f.css gzip | 125 B | 125 B | ✓ |
| Overall change | 10.1 kB | 10.1 kB | -16 B |
Client Build Manifests

| | vercel/next.js canary | vercel/next.js bump-squoosh-2021-09 | Change |
| --- | --- | --- | --- |
| _buildManifest.js gzip | 511 B | 511 B | ✓ |
| Overall change | 511 B | 511 B | ✓ |
Rendered Page Sizes

| | vercel/next.js canary | vercel/next.js bump-squoosh-2021-09 | Change |
| --- | --- | --- | --- |
| index.html gzip | 538 B | 538 B | ✓ |
| link.html gzip | 550 B | 550 B | ✓ |
| withRouter.html gzip | 532 B | 532 B | ✓ |
| Overall change | 1.62 kB | 1.62 kB | ✓ |
Diffs

Diff for _buildManifest.js

```diff
@@ -18,7 +18,7 @@ self.__BUILD_MANIFEST = {
   ],
   "/image": [
     "static\u002Fchunks\u002F675-e36478c6435f19ec761d.js",
-    "static\u002Fchunks\u002Fpages\u002Fimage-ffd5f0bef0d7723148e9.js"
+    "static\u002Fchunks\u002Fpages\u002Fimage-54733aeb9002fa108f3f.js"
   ],
   "/link": ["static\u002Fchunks\u002Fpages\u002Flink-7f8f722ace4fedf0a7aa.js"],
   "/routerDirect": [
```

Diff for image-HASH.js

```diff
@@ -45,7 +45,7 @@
   height: 1347,
   width: 1626,
   blurDataURL:
-    "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAgAAAAHCAQAAACfUMTVAAAAP0lEQVR42jXIQQqDUBTF0Lf/rdZZB9W2/0REDNwbyATxaJSuCZlafn3tDmHq49Vm87ZkpKPd8n9CF+7XFOImJ8SebtyxyeM6AAAAAElFTkSuQmCC"
+    "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAgAAAAHCAAAAAAQMlOCAAAAMElEQVR42hXJOQ4AIBDDwPz/q9BRcG6MMpUlCweWCSPqnrWxmK33Ucjeq16CyHKYD8vhNzo5Br4tAAAAAElFTkSuQmCC"
   }; // CONCATENATED MODULE: ./pages/image.js
 function ImagePage(props) {
   return /*#__PURE__*/ (0, jsx_runtime.jsxs)(jsx_runtime.Fragment, {
```
Thanks!
Bump `squoosh` to the latest version, currently commit cad09160. Ideally, we would use the version published to npm, but it hasn't been published in two months and we have a patch (#23565) that isn't available upstream.

This is also a precursor to adding support for AVIF.