Implementing a mutable append only blob type? #99

Open
jimmywarting opened this issue Jun 16, 2021 · 3 comments

Comments

@jimmywarting
Contributor

In this video https://youtu.be/6EDaayYnw6M?t=1202 he talks about returning a blob from the fetch API.
In theory you could return a blob early if you know the content-length, i.e. the size of the blob; the content does not have to be available immediately.
You could, for example, make a request for a 4 GB file and have the blob returned right after you get the HTTP response, without having all the data at hand. That is to say: the response has a content-length and isn't compressed.
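As a rough illustration (this is not an existing API, just a sketch; ResponseBlob and the URL are made up), a Blob-like wrapper could be handed out as soon as the response headers arrive:

// Hypothetical sketch: a Blob-like object that is usable before the body has downloaded.
// The body is only consumed on demand via stream()/arrayBuffer()/text().
class ResponseBlob {
  #res
  constructor (res) {
    this.#res = res
    this.size = Number(res.headers.get('content-length'))
    this.type = res.headers.get('content-type') || ''
  }
  stream () { return this.#res.body }
  arrayBuffer () { return this.#res.arrayBuffer() }
  text () { return this.#res.text() }
}

const res = await fetch('https://example.com/4gb-file') // made-up URL
const blob = new ResponseBlob(res) // blob.size is already known here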

This idea was brought up in Node.js long before this, by jasnell, about 4 years ago:

For Blob in general, it is really nothing more than a persistent allocated chunk of memory. It would be possible to create a Blob from one or more TypedArray objects. I'm sketching out additional APIs for the http and http2 modules that would allow a response to draw data from a Blob rather than through the Streams API. There is already something analogous in the http2 implementation in the form of the respondWithFile() and respondWithFD() APIs in the http2 side. Basically, the idea would be to prepare chunks of allocated memory at the native layer, with data that never passes into the JS layer (unless absolutely necessary to do so), then use those to source the data for responses. In early benchmarking this yields a massive boost in throughput without the usual backpressure control issues.

I'm still interested in this idea too, but I have no idea/clue how to sketch this up or how best to implement it.

I mean, I built this HTTP File-like class that operates on byte-range partial requests and has a known content-length.
The goal of it all was to take a zip from a remote source and pass it to a zip parser that could slice and read the central directory, so it could retrieve a list of all the files that were included and jump/seek within the blob for the stuff you needed, without having to download the whole zip file. This meant that it would make multiple partial HTTP requests, one for each file, later on.
It's a pretty cool optimization concept.
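Roughly, the idea looks like this (a simplified sketch, not the actual implementation; the class name and URL are made up, and it assumes the server supports Range requests):

// File-like class where slice() is lazy and arrayBuffer() issues a Range request.
class RemoteFile {
  constructor (url, start, end) {
    this.url = url
    this.start = start
    this.end = end
  }
  static async open (url) {
    const res = await fetch(url, { method: 'HEAD' })
    return new RemoteFile(url, 0, Number(res.headers.get('content-length')))
  }
  get size () { return this.end - this.start }
  slice (start = 0, end = this.size) {
    // no bytes are fetched here, just a narrower view of the same resource
    return new RemoteFile(this.url, this.start + start, this.start + end)
  }
  async arrayBuffer () {
    const res = await fetch(this.url, {
      headers: { range: `bytes=${this.start}-${this.end - 1}` }
    })
    return res.arrayBuffer()
  }
}

// e.g. grab only the tail of a remote zip to locate the central directory
const file = await RemoteFile.open('https://example.com/archive.zip') // made-up URL
const tail = await file.slice(Math.max(0, file.size - 65536)).arrayBuffer()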

@jimmywarting
Contributor Author

#137 brought a partial solution to this problem.
It made it easy to turn a stream into a blob/file without copying everything into memory while still keeping the blob immutable: the data is appended to a file instead, and once that is done a blob is created from it.
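Something along those lines, assuming fetch-blob's from.js helpers (the URL and temp path are just placeholders):

// Stream the body to a file, then wrap the file as a blob.
// The payload is never buffered in JS memory as a whole.
import { createWriteStream } from 'node:fs'
import { Readable } from 'node:stream'
import { pipeline } from 'node:stream/promises'
import { blobFrom } from 'fetch-blob/from.js'

const res = await fetch('https://example.com/big-file') // placeholder URL
await pipeline(Readable.fromWeb(res.body), createWriteStream('/tmp/big-file'))
const blob = await blobFrom('/tmp/big-file', res.headers.get('content-type'))
// blob is backed by the file on disk, not by an in-memory copy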

@jimmywarting
Contributor Author

Just thought of a way to make the data transferable without having to copy it in order to create a blob.

const uint8 = new Uint8Array([97])
const { byteOffset, byteLength } = uint8 // read these before the buffer gets detached
const clone = new Uint8Array(uint8.buffer.transfer(), byteOffset, byteLength)
// uint8 is now detached and no longer usable.
// Olé! We have a uint8 array whose data the original owner can no longer touch,
// so we can safely make a blob out of it.

It's a way safer option than letting people mutate the data, which they are likely not going to do anyway.
I would like to have a transfer option, much like postMessage does:

new Blob([ uint8 ], { ...opts }, [ uint8.buffer ])
new Blob([ uint8 ], { 
  ...opts,
  transfer: [ uint8.buffer ]
})
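For comparison, a transfer list already works like this with structuredClone/postMessage today; the Blob constructor option above is only a proposal:

const uint8 = new Uint8Array([97])
const moved = structuredClone(uint8, { transfer: [uint8.buffer] })
console.log(uint8.byteLength) // 0, the original view is detached and can't mutate the data anymore
console.log(moved.byteLength) // 1, ownership moved instead of copying the bytes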

@52HzNoBrain

fetch-blob: version 3.2.0
Q: Element implicitly has an 'any' type because type 'typeof globalThis' has no index signature.
