
Support stream response in browser #479

Closed · olalonde opened this issue Oct 12, 2016 · 49 comments

@olalonde
It would be useful if responseType could be set to stream in the browser. Right now, it only works in Node.

@nickuraltsev
Member

Are you suggesting we add support for this? It's another reason to create a fetch adapter (#484), as it's not possible to implement this using XHR.

@olalonde
Author

olalonde commented Oct 14, 2016

I was more hoping for a Node.js readable stream, but I guess a WHATWG stream would be better than nothing :) I ended up using https://github.com/kumavis/xhr-stream but yeah, it's a hacky implementation.

@nickuraltsev
Member

XMLHttpRequest does not support response body streaming, so I don't really understand the point of wrapping an XMLHttpRequest response into a stream. (The fetch standard does support streaming, so adding support for fetch response streams makes sense to me.)

Could you please elaborate on the problem you are solving using xhr-stream?

@olalonde
Author

olalonde commented Oct 17, 2016

I have a long-lived HTTP response that emits newline-delimited JSON objects, and I want to start displaying the results in the browser without waiting for the response to finish. xhr-stream seems to work, although I'm not sure how. I am piping the xhr stream to https://github.com/jahewson/node-byline, which is why a Node.js-compatible readable stream would have been helpful. I'm sure there are WHATWG-stream-to-Node.js-stream wrappers somewhere, though.
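For reference, the newline-delimited JSON use case described above can now be handled with the fetch API and WHATWG streams. A minimal sketch; readNdjson and the endpoint name are illustrative, not part of axios or any library:

```javascript
// Reads an NDJSON response body chunk by chunk, buffering partial lines
// across chunk boundaries and invoking onObject for each complete line.
async function readNdjson(stream, onObject) {
  const decoder = new TextDecoder();
  const reader = stream.getReader();
  let buffer = "";
  for (;;) {
    const { done, value } = await reader.read();
    // stream: true keeps multi-byte characters split across chunks intact
    buffer += decoder.decode(value ?? new Uint8Array(), { stream: !done });
    const lines = buffer.split("\n");
    buffer = lines.pop(); // keep the trailing partial line for the next chunk
    for (const line of lines) {
      if (line.trim()) onObject(JSON.parse(line));
    }
    if (done) break;
  }
  if (buffer.trim()) onObject(JSON.parse(buffer)); // flush the final line
}

// Browser usage (endpoint name is hypothetical):
// const res = await fetch("/events");
// await readNdjson(res.body, (obj) => render(obj));
```

Each object is handed to the callback as soon as its line arrives, without waiting for the response to finish.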

@mividtim

Why was this closed? Actually, XHR does support chunked reading, through readyState 3 rather than 4.

@mividtim

You closed #505 as a duplicate of this issue, and then closed this issue. Neither has been resolved; this issue should be re-opened.

@rizrmd

rizrmd commented Jul 12, 2018

Any news about this?

@chriscoderdr

What about React Native?

@affanshahid

Any news on this?

@vedadeepta

Any progress on this?

@gwh-cpnet

gwh-cpnet commented Apr 25, 2019

Would anybody provide a workaround for it?

I think this issue is still open because no native stream is supported in the browser. If we do not want to implement socket.io, maybe SSE is the only alternative left.
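For reference, the SSE alternative mentioned above is handled natively in browsers by the EventSource API (const es = new EventSource('/events'); es.onmessage = (e) => console.log(e.data);). To illustrate the wire format it consumes, here is a minimal parser for the data field of SSE frames; parseSseData is a hypothetical name, not a library function:

```javascript
// Each SSE event is a block of "field: value" lines terminated by a blank
// line. This extracts the data payloads, joining multi-line data fields.
function parseSseData(text) {
  return text
    .split("\n\n") // events are separated by blank lines
    .map((block) =>
      block
        .split("\n")
        .filter((line) => line.startsWith("data:"))
        .map((line) => line.slice(5).trimStart())
        .join("\n")
    )
    .filter((data) => data.length > 0);
}
```

Note that SSE is one-directional (server to client) and text-based, which is exactly the shape of the long-lived responses discussed in this thread.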

@jcjolley

jcjolley commented May 30, 2019

@gwh-cpnet
I'm stuck on IE11, so I'm unable to use the fetch API. My workaround is below. The promise is only there so I can know when it finishes; the callback does the actual work of handling the data from the stream.

export const ie11XHRStreamHandler = async (url: string, callback: (chunk: string) => void) => {
  const xhr = new XMLHttpRequest();
  xhr.open('GET', url, true);
  xhr.responseType = 'stream' as any; // IE11-specific; prevents msCaching
  let seenChars = 0;
  const p = new Promise<void>((resolve, reject) => {
    xhr.onreadystatechange = () => {
      if (xhr.readyState === xhr.LOADING) {
        callback(xhr.responseText.slice(seenChars)); // hand over only the new data
        seenChars = xhr.responseText.length;
      } else if (xhr.readyState === xhr.DONE) {
        if (xhr.status === 200) {
          resolve();
        } else {
          reject(new Error(`XHR failed with status ${xhr.status}: ${xhr.statusText}`));
        }
      }
    };
  });
  xhr.send();
  return p;
}

@vizcay

vizcay commented Sep 23, 2019

I ended up using https://github.com/eBay/jsonpipe; maybe someone will find this useful.

@xintaoLi

xintaoLi commented Oct 26, 2020

Try this:

1. Set responseType: 'blob'
2. Read the Blob as text via response.data.text() (or via response.data.stream() / response.data.arrayBuffer())
3. Parse the text to JSON, first quoting long numeric values with a regular expression

axios.post('url', params, {
    cancelToken: this.source.token,
    responseType: 'blob',
  })
  .then((response) => {
    response.data.text().then((text) => {
      const reg = /:\s*(\d{15,25})(\s*\}|,?)/
      const textdata = text.replace(reg, ':"$1"$2') // quote 15-25 digit numbers to avoid precision loss
      const res = JSON.parse(textdata)
    })
  })

@leohxj

leohxj commented Nov 27, 2020

So what is the correct type of the response data when responseType is set to 'stream'?

@shresthapradip

Any update on this?

@jasonsaayman added this to To do in v1.0.0 via automation May 14, 2021
@1isten

1isten commented May 18, 2021

Using blob as a workaround is meaningless, as the entire response is already loaded into memory by then.

The advantage of a stream is that we can load the response chunk by chunk (hence reducing memory usage) and start processing it as soon as possible.

@blackbing

I can't believe axios doesn't support fetch stream in browser.

@blackbing

@fengerzh The fetch function can achieve this. Here is an example:

https://gist.github.com/blackbing/22ed6db703727fb050a071ce2911684f#file-fetchstream-js

@mividtim

mividtim commented May 4, 2023

There is an issue with fetch, however... It doesn't support streaming uploads.

@jetjokers24895

I got the same issue when using Axios with Electron, and found a solution: with Axios 1.4.0, I configure the adapter:

axios({
  url: 'abc.com',
  // ...
  adapter: 'http'
})

@3rd

3rd commented Jun 11, 2023

I got the same issue when using Axios with Electron. I found a solution to this. With Axios 1.4.0. I config adapter

axios({ URL: 'abc.com', ... adapter: 'http' })

The http adapter is not available in browsers.

NickHeiner added a commit to fixie-ai/ai-jsx that referenced this issue Jun 18, 2023
[Axios doesn't appear to support streaming in the browser](axios/axios#479), and since we want AI.JSX to be able to run in browser, that's a no-go for us.

[Loom showing streaming in the browser](https://www.loom.com/share/5bf174e2217b433fbf7901ece310774f)

We still don't stream the UI demos (e.g. recipe builder) because filling in UI pieces bit-by-bit could be worse than buffering.
@lukelu520

This issue should be reopened, since axios still doesn't support streaming responses. Correct me if I am wrong; I hope to hear your expertise.

@emargin

emargin commented Sep 4, 2023

BUMP .-.

@AkdM

AkdM commented Oct 4, 2023

I couldn't make the responseType stream option work either, but I made a workaround with the onDownloadProgress option, like so:

axios({
  // ...options,
  onDownloadProgress: (evt) => {
    // Parse response from evt.event.target.responseText || evt.event.target.response
    // The target holds the accumulator + the current response, so basically everything from the beginning on each response
    // Note that it's evt.target instead of evt.event.target for older axios versions
  }
})

@psankar

psankar commented Oct 29, 2023

// Parse response from evt.event.target.responseText || evt.event.target.response
// The target holds the accumulator + the current response, so basically everything from the beginning on each response

Thanks for this @AkdM. But do you know of a way to get only the new responses instead of the accumulated response since the beginning?

@jonah-butler

I couldn't make the responseType stream option work too, but I made a workaround with the onDownloadProgress option […]

This is the same approach I took for collecting chunked responses from a stream. However, the evt param in the onDownloadProgress callback was creating some issues for me in TS: its type is declared as ProgressEvent, so I had to cast evt.target (or evt.event.target) to an XMLHttpRequest before I could access the .response property; otherwise I get a "Property 'response' does not exist on type 'EventTarget'." error.

onProgress(e: ProgressEvent): void {
  const req = e.target as XMLHttpRequest;
  this.textChunk = req.response;
}

async function streamedResponseHandler(
  cb: (e: ProgressEvent) => void
): Promise<void> {
  await axios.get("/endpoint", {
    onDownloadProgress: cb,
  });
}

This callback approach worked well for my use case in Vue because it retains the reactivity of this.textChunk, a data property: as this.textChunk is reassigned from req.response, all DOM updates just happen.

Then just calling it:

streamedResponseHandler(onProgress);

@AkdM

AkdM commented Nov 14, 2023

Interesting, thanks @jonah-butler. I am going to make a migration to TS very soon so that will help me.

@jonah-butler

Thanks for your example too @AkdM. That was very helpful for me.

@ZulluBalti

// Parse response from evt.event.target.responseText || evt.event.target.response
// The target holds the accumulator + the current response, so basically everything from the beginning on each response

Thanks for this @AkdM . But do you know of a way to get only the new responses instead of the accumulated response since the beginning ?

Hi, did you find any built-in, elegant solution?

@jonah-butler

@ZulluBalti I know this isn't built-in, but in my example based on @AkdM's, slicing the response at the previous length of this.textChunk is a straightforward way of capturing only the new data without modifying the response text.

onProgress(e: ProgressEvent): void {
  const req = e.target as XMLHttpRequest;
  console.log(req.response.slice(this.textChunk.length)); // <---- only new data from each new response
  this.textChunk = req.response;
}

@ZulluBalti

ZulluBalti commented Dec 20, 2023

I couldn't make the responseType stream option work too, but I made a workaround with the onDownloadProgress option […]

Hi, I'm facing one problem with this approach: axios buffers the chunks and returns multiple chunks at once. Is there any way to run this function every time the server writes to the stream?

For example, in the backend I have this code:

const chunks = ["Chunk 1 ", "Chunk 2 ", "Chunk 3"];
for (const chunk of chunks) {
  res.write(chunk);
  await sleep(1000); // custom function to wait for 1 second
}
res.end("END")

I was expecting to get the chunks like this:

"Chunk 1 " when onDownloadProgress runs for the first time
"Chunk 1 Chunk 2 " when it runs for the 2nd time
"Chunk 1 Chunk 2 Chunk 3" when it runs for the 3rd time

But I'm getting them like this:

"Chunk 1 Chunk 2 " (multiple chunks at a time)
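For reference: XHR fires progress events at the browser's discretion, so several server writes are often coalesced into a single event, and there is no way to force one callback per res.write(). If chunk boundaries matter, delimit them on the server (e.g. with newlines) and split on the client. A hypothetical helper, with all names illustrative:

```javascript
// Tracks the cumulative response text across progress events and returns
// only the complete new lines, holding back any trailing partial line.
function makeLineExtractor() {
  let seen = 0;     // characters already consumed from the cumulative text
  let partial = ""; // trailing fragment still waiting for its delimiter
  return function extract(cumulativeText) {
    partial += cumulativeText.slice(seen);
    seen = cumulativeText.length;
    const lines = partial.split("\n");
    partial = lines.pop(); // incomplete tail, kept for the next event
    return lines;
  };
}

// Usage inside axios' onDownloadProgress (evt shape as discussed above):
// const extract = makeLineExtractor();
// onDownloadProgress: (evt) => {
//   for (const line of extract(evt.event.target.responseText)) handle(line);
// }
```

This makes the client independent of how many server writes happen to arrive per event.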

@brunolm

brunolm commented Dec 20, 2023

as it's not possible to implement this using XHR.

I guess this reply is over 7 years old. ^

XHR + Streaming text:

var xhr = new XMLHttpRequest();
var prevLen = 0; // length of the response text already handled
xhr.open('GET', '/endpoint', true);
xhr.timeout = 10000;
xhr.ontimeout = function() {
  console.error('Request timed out!');
};
xhr.onprogress = function() {
  var responseText = xhr.responseText;
  var chunk = responseText.slice(prevLen);
  prevLen = responseText.length;
  console.log(chunk);
};
xhr.onload = function() {
  console.log('Done!');
};
xhr.send();

Bonus: Using fetch

const response = await fetch("/endpoint", {
  signal: AbortSignal.timeout(10000),
});

const reader = response.body.getReader();

while (true) {
  const { done, value } = await reader.read();

  if (done) {
    console.log("Stream complete");
    break;
  }

  console.log(new TextDecoder().decode(value));
}

Axios:

Not supported. Said to be impossible back in 2016, closed that same year, and there's still no support for it today.

Next.js 14 endpoint:

import { NextApiResponse } from 'next'

// https://developer.mozilla.org/docs/Web/API/ReadableStream#convert_async_iterator_to_stream
function iteratorToStream(iterator: any) {
  return new ReadableStream({
    async pull(controller) {
      const { value, done } = await iterator.next()

      if (done) {
        controller.close()
      } else {
        controller.enqueue(value)
      }
    },
  })
}

function sleep(time: number) {
  return new Promise((resolve) => {
    setTimeout(resolve, time)
  })
}

const encoder = new TextEncoder()

async function* makeIterator() {
  yield encoder.encode('<p>One</p>')
  await sleep(700)
  yield encoder.encode('<p>Two</p>')
  await sleep(700)
  yield encoder.encode('<p>Three</p>')
}

export default async function handler(req, res: NextApiResponse<any>) {
  const iterator = makeIterator()
  const stream = iteratorToStream(iterator)

  return new Response(stream)
}

export const runtime = 'edge'

@leookun

leookun commented Feb 27, 2024

Why was this closed?

@Cheaterman

Because the author has no intention to implement this feature on current Axios. Meanwhile #4209 is closed and #4477 is locked - and both are stale, last activity in 2022. If anyone still cares about this, I recommend you use fetch instead.

@mividtim

Would the maintainer kindly hand maintenance over to someone willing to actually maintain the package? Fetch is garbage. The API is inferior, and it doesn't support streaming uploads at all.

@leookun

leookun commented Feb 28, 2024

Because the author has no intention to implement this feature on current Axios. Meanwhile #4209 is closed and #4477 is locked - and both are stale, last activity in 2022. If anyone still cares about this, I recommend you use fetch instead.

fetch solved my problem. Because it needed to be compatible with axios interceptors, I had to implement an axios-like function with fetch; maybe I should have used fetch from the beginning.
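For reference, the kind of axios-like wrapper over fetch described above can be sketched as below. createClient and its interceptor API are illustrative only, modeled loosely on axios' interceptor pattern, not a real library:

```javascript
// A fetch client with request/response interceptor chains. The fetch
// implementation is injectable so the wrapper can be tested without a network.
function createClient(fetchImpl = fetch) {
  const requestInterceptors = [];
  const responseInterceptors = [];
  return {
    interceptors: {
      request: { use: (fn) => requestInterceptors.push(fn) },
      response: { use: (fn) => responseInterceptors.push(fn) },
    },
    async request(url, options = {}) {
      let config = { url, ...options };
      for (const fn of requestInterceptors) config = await fn(config); // transform the outgoing config
      let response = await fetchImpl(config.url, config);
      for (const fn of responseInterceptors) response = await fn(response); // transform the response
      return response;
    },
  };
}

// Usage (names hypothetical):
// const client = createClient();
// client.interceptors.request.use((cfg) => ({ ...cfg, headers: { Authorization: "Bearer <token>" } }));
// const res = await client.request("/api/data");
```

Because the response object comes back untouched by default, response.body streaming still works through this wrapper.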

@Thorvarium

Thorvarium commented Mar 27, 2024

This should be reopened LOL. Shocking.

@Thorvarium

@brunolm I don't think that the fetch version works. It is waiting for the entire response before starting to print.

@brunolm

brunolm commented Apr 2, 2024

@brunolm I don't think that the fetch version works. It is waiting for the entire response before starting to print

Did you set this?

export const runtime = 'edge'

Maybe it's your Next.js version, or whatever backend you're using needs some config.

@gofish543

This should be reopened LOL. Shocking.

Very shocking.

@oneyoung19

oneyoung19 commented May 10, 2024

(Quoting @brunolm's XHR and fetch streaming examples above in full.)

ChatGPT may be using XHR + streaming.

