breaking: Remove dependency on streams polyfill (#149)
* Remove dependency on streams polyfill

* require node v16.7

* change test version

* only test ubuntu

* remove changelog
jimmywarting committed May 18, 2023
1 parent 57e4dae commit a1a182e
Showing 12 changed files with 65 additions and 231 deletions.
3 changes: 1 addition & 2 deletions .github/PULL_REQUEST_TEMPLATE.md
@@ -12,9 +12,8 @@

-------------------------------------------------------------------------------------------------

<!-- Mark what you have done, Remove unnecessary ones. Add new tasks that may fit (like TODO's) -->
<!-- Mark what you have done with [x], Remove unnecessary ones. Add new tasks that may fit (like TODO's) -->
- [ ] I prefixed the PR-title with `docs: `, `fix(area): `, `feat(area): ` or `breaking(area): `
- [ ] I updated ./CHANGELOG.md with a link to this PR or Issue
- [ ] I updated the README.md
- [ ] I added unit test(s)

6 changes: 3 additions & 3 deletions .github/workflows/ci.yml
@@ -13,16 +13,16 @@ jobs:
test:
strategy:
matrix:
os: [ubuntu-latest, windows-latest, macOS-latest]
node: ["17.3"]
os: [ubuntu-latest]
node: ["16", "18", "20"]

runs-on: ${{ matrix.os }}

steps:
- uses: actions/checkout@v2
- uses: actions/setup-node@v2
with:
node-version: '17.3'
node-version: ${{ matrix.node }}
- run: npm install
- run: npm test
- run: npm run report -- --colors
98 changes: 0 additions & 98 deletions CHANGELOG.md

This file was deleted.

15 changes: 5 additions & 10 deletions README.md
@@ -26,7 +26,6 @@ npm install fetch-blob
- CommonJS was replaced with ESM
- The node stream returned by calling `blob.stream()` was replaced with whatwg streams
- (Read "Differences from other blobs" for more info.)

</details>
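Since `blob.stream()` now returns a whatwg `ReadableStream` rather than a node stream, it is consumed with a reader loop instead of `'data'` events. A minimal sketch, using Node's built-in `buffer.Blob` (an assumption here, purely so the example is self-contained - fetch-blob's stream behaves the same way):

```javascript
import { Blob } from 'node:buffer'

// blob.stream() returns a whatwg ReadableStream of Uint8Array chunks
const blob = new Blob(['hello ', 'world'])
const reader = blob.stream().getReader()

let text = ''
const decoder = new TextDecoder()
for (let r = await reader.read(); !r.done; r = await reader.read()) {
  // stream: true handles multi-byte characters split across chunks
  text += decoder.decode(r.value, { stream: true })
}
text += decoder.decode() // flush any buffered bytes
```

Requires Node >= 16.7, where `buffer.Blob` and web streams are available without a polyfill.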

<details>
@@ -48,14 +47,10 @@ npm install fetch-blob

```js
// Ways to import
// (PS it's dependency free ESM package so regular http-import from CDN works too)
import Blob from 'fetch-blob'
import File from 'fetch-blob/file.js'

import {Blob} from 'fetch-blob'
import {File} from 'fetch-blob/file.js'
import { Blob } from 'fetch-blob'
import { File } from 'fetch-blob/file.js'

const {Blob} = await import('fetch-blob')
const { Blob } = await import('fetch-blob')


// Ways to read the blob:
@@ -75,7 +70,6 @@ It will not read the content into memory. It will only stat the file for last mo

```js
// The default export is sync and use fs.stat to retrieve size & last modified as a blob
import blobFromSync from 'fetch-blob/from.js'
import {File, Blob, blobFrom, blobFromSync, fileFrom, fileFromSync} from 'fetch-blob/from.js'

const fsFile = fileFromSync('./2-GiB-file.bin', 'application/octet-stream')
@@ -119,7 +113,8 @@ blob = undefined // losing references will delete the file from disk

### Creating Blobs backed up by other async sources
Our Blob & File classes are more generic than other polyfills in that they can accept any blob look-a-like item.
An example of this is that our blob implementation can be constructed with parts coming from [BlobDataItem](https://github.com/node-fetch/fetch-blob/blob/8ef89adad40d255a3bbd55cf38b88597c1cd5480/from.js#L32) (aka a filepath) or from [buffer.Blob](https://nodejs.org/api/buffer.html#buffer_new_buffer_blob_sources_options), It dose not have to implement all the methods - just enough that it can be read/understood by our Blob implementation. The minium requirements is that it has `Symbol.toStringTag`, `size`, `slice()` and either a `stream()` or a `arrayBuffer()` method. If you then wrap it in our Blob or File `new Blob([blobDataItem])` then you get all of the other methods that should be implemented in a blob or file
An example of this is that our blob implementation can be constructed with parts coming from [BlobDataItem](https://github.com/node-fetch/fetch-blob/blob/8ef89adad40d255a3bbd55cf38b88597c1cd5480/from.js#L32) (aka a filepath) or from [buffer.Blob](https://nodejs.org/api/buffer.html#buffer_new_buffer_blob_sources_options). It does not have to implement all the methods - just enough that it can be read/understood by our Blob implementation. The minimum requirement is that it has `Symbol.toStringTag`, `size`, `slice()` and `stream()` methods (the stream method
can be as simple as a sync or async iterator that yields Uint8Arrays). If you then wrap it in our Blob or File with `new Blob([blobDataItem])`, you get all of the other methods that should be implemented in a blob or file (aka: `text()`, `arrayBuffer()`, `type` and a ReadableStream)

An example of this could be to create a file or blob look-a-like item backed by a remote HTTP request, or by a database.
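The minimal interface described above can be sketched as follows. The names (`makePart`, `blobLikePart`) are illustrative, not part of fetch-blob's API:

```javascript
// A minimal blob look-a-like part: Symbol.toStringTag, size, slice()
// and a stream() that yields Uint8Arrays is all that is required.
const bytes = new TextEncoder().encode('hello from a remote source')

function makePart (data) {
  return {
    get [Symbol.toStringTag] () { return 'Blob' },
    size: data.byteLength,
    slice (start = 0, end = data.byteLength) {
      return makePart(data.subarray(start, end))
    },
    async * stream () {
      // a real implementation could yield chunks fetched over HTTP
      // or from a database cursor here
      yield data
    }
  }
}

const blobLikePart = makePart(bytes)

// Wrapping it (assuming fetch-blob is installed) yields the full
// surface: text(), arrayBuffer(), type and a real ReadableStream:
// import { Blob } from 'fetch-blob'
// const blob = new Blob([blobLikePart])
// console.log(await blob.text())
```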

3 changes: 1 addition & 2 deletions file.js
@@ -1,4 +1,4 @@
import Blob from './index.js'
import { Blob } from './index.js'

const _File = class File extends Blob {
#lastModified = 0
@@ -46,4 +46,3 @@ const _File = class File extends Blob {

/** @type {typeof globalThis.File} */// @ts-ignore
export const File = _File
export default File
4 changes: 2 additions & 2 deletions from.js
@@ -10,8 +10,8 @@ import { tmpdir } from 'node:os'
import process from 'node:process'
import DOMException from 'node-domexception'

import File from './file.js'
import Blob from './index.js'
import { File } from './file.js'
import { Blob } from './index.js'

const { stat, mkdtemp } = fs
let i = 0, tempDir, registry
50 changes: 25 additions & 25 deletions index.js
@@ -1,19 +1,32 @@
/*! fetch-blob. MIT License. Jimmy Wärting <https://jimmy.warting.se/opensource> */

// TODO (jimmywarting): in the feature use conditional loading with top level await (requires 14.x)
// Node has recently added whatwg stream into core

import './streams.cjs'
if (!globalThis.ReadableStream) {
try {
const process = await import('node:process').then(m => m.default)
const { emitWarning } = process
try {
process.emitWarning = () => {}
const streams = await import('node:stream/web').then(m => m.default)
Object.assign(globalThis, streams)
process.emitWarning = emitWarning
} catch (error) {
process.emitWarning = emitWarning
throw error
}
} catch (error) {}
}

// 64 KiB (same size chrome slice theirs blob into Uint8array's)
const POOL_SIZE = 65536

/** @param {(Blob | Uint8Array)[]} parts */
async function * toIterator (parts, clone = true) {
/**
* @param {(Blob | Uint8Array)[]} parts
* @param {boolean} clone
* @returns {AsyncIterableIterator<Uint8Array>}
*/
async function * toIterator (parts, clone) {
for (const part of parts) {
if ('stream' in part) {
yield * (/** @type {AsyncIterableIterator<Uint8Array>} */ (part.stream()))
} else if (ArrayBuffer.isView(part)) {
if (ArrayBuffer.isView(part)) {
if (clone) {
let position = part.byteOffset
const end = part.byteOffset + part.byteLength
@@ -26,16 +39,9 @@ async function * toIterator (parts, clone = true) {
} else {
yield part
}
/* c8 ignore next 10 */
} else {
// For blobs that have arrayBuffer but no stream method (nodes buffer.Blob)
let position = 0, b = (/** @type {Blob} */ (part))
while (position !== b.size) {
const chunk = b.slice(position, Math.min(b.size, position + POOL_SIZE))
const buffer = await chunk.arrayBuffer()
position += buffer.byteLength
yield new Uint8Array(buffer)
}
// @ts-ignore TS Think blob.stream() returns a node:stream
yield * part.stream()
}
}
}
@@ -139,11 +145,6 @@ const _Blob = class Blob {
* @return {Promise<ArrayBuffer>}
*/
async arrayBuffer () {
// Easier way... Just a unnecessary overhead
// const view = new Uint8Array(this.size);
// await this.stream().getReader({mode: 'byob'}).read(view);
// return view.buffer;

const data = new Uint8Array(this.size)
let offset = 0
for await (const chunk of toIterator(this.#parts, false)) {
@@ -218,7 +219,7 @@
}
}

const blob = new Blob([], { type: String(type).toLowerCase() })
const blob = new Blob([], { type: `${type}` })
blob.#size = span
blob.#parts = blobParts

@@ -251,4 +252,3 @@ Object.defineProperties(_Blob.prototype, {

/** @type {typeof globalThis.Blob} */
export const Blob = _Blob
export default Blob
16 changes: 7 additions & 9 deletions package.json
@@ -1,6 +1,6 @@
{
"name": "fetch-blob",
"version": "3.1.5",
"version": "4.0.0",
"description": "Blob & File implementation in Node.js, originally from node-fetch.",
"main": "index.js",
"type": "module",
@@ -10,8 +10,7 @@
"file.d.ts",
"index.js",
"index.d.ts",
"from.d.ts",
"streams.cjs"
"from.d.ts"
],
"scripts": {
"test": "node --experimental-loader ./test/http-loader.js ./test/test-wpt-in-node.js",
@@ -26,7 +25,7 @@
"node-fetch"
],
"engines": {
"node": "^12.20 || >= 14.13"
"node": ">=16.7"
},
"author": "Jimmy Wärting <jimmy@warting.se> (https://jimmy.warting.se)",
"license": "MIT",
@@ -35,9 +34,9 @@
},
"homepage": "https://github.com/node-fetch/fetch-blob#readme",
"devDependencies": {
"@types/node": "^18.0.2",
"c8": "^7.11.0",
"typescript": "^4.5.4"
"@types/node": "^16.5.0",
"c8": "^7.13.0",
"typescript": "^5.0.4"
},
"funding": [
{
@@ -50,7 +49,6 @@
}
],
"dependencies": {
"node-domexception": "^1.0.0",
"web-streams-polyfill": "^3.0.3"
"node-domexception": "^1.0.0"
}
}
51 changes: 0 additions & 51 deletions streams.cjs

This file was deleted.
