
"Continue" response can interleave with response data #176

Open
Diggsey opened this issue Jan 24, 2021 · 7 comments

@Diggsey
Contributor

Diggsey commented Jan 24, 2021

Example:

  • Client sends "Expect" header
  • Server begins writing response
  • Server reads from request body for the first time
  • Background task sends "100 Continue" response in the middle of a half-written response

There are a few possible solutions:

  1. Make it an error to read from the request body after starting to write a response. This is not great because e.g. an echo server might want to stream data back without buffering it in memory.

  2. Send "100 Continue" before either a read from the request body, or when starting to write the response. Only omit it if the server explicitly drops the request body before beginning to send a response.

  3. Some other way of having the server make an explicit choice of whether the client should continue sending the body.

@jbr
Member

jbr commented Jan 24, 2021

Could you include an example of code that does this? I'm curious how you're writing a response before reading from the body.

@Diggsey
Contributor Author

Diggsey commented Jan 25, 2021

I don't have an example to hand, but I can produce one if that would help. I believe it should be fairly easy to reproduce:

  • In the request handler, take the Body from the request and put it on the response.
  • That's it.

The server code will begin by encoding the response headers into the stream, and will then begin reading from the body. This will in turn trigger a read from the request body (since they are one and the same) which will result in the 100 Continue message being written into the middle of the response.
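
For concreteness, a minimal sketch of such an echo handler, assuming the http-types Request::take_body / Response::set_body API (the exact endpoint signature depends on how the server is wired up):

```rust
use http_types::{Request, Response, StatusCode};

// Hypothetical echo endpoint: it moves the request body onto the response
// without reading it here, so the first read of the request body only
// happens once the server is already streaming the response body, i.e.
// after the response headers have been written to the stream.
async fn echo(mut req: Request) -> http_types::Result<Response> {
    let mut res = Response::new(StatusCode::Ok);
    res.set_body(req.take_body());
    Ok(res)
}
```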

@yoshuawuyts
Member

Send "100 Continue" before either a read from the request body, or when starting to write the response. Only omit it if the server explicitly drops the request body before beginning to send a response.

It seems the ordering might have been messed up here a bit. The design was intended so that the moment we start to read the request body, we write out 100: Continue to the response.

But you're right to point out that if we start writing to the response stream and then read the body, 100: Continue will be sent midway through writing the response. The fix, then, seems to be to also write 100: Continue the moment the response starts writing, so that either reading from the request or writing to the response triggers the 100: Continue.

@Diggsey
Contributor Author

Diggsey commented Jan 25, 2021

The fix seems to indeed also write 100: Continue the moment the response starts writing, so either reading from the request or writing to the response triggers 100: Continue.

Won't this result in the 100: Continue always being sent?

FWIW, I have an experimental branch which solves several of these related issues (and fixes pipelining), I'll open a draft PR with that soon (there's one test failing right now).

@yoshuawuyts
Member

Won't this result in the 100: Continue always being sent?

Oh, I was thinking this would be done through two atomic bools (or equivalent).

  1. Detect whether a 100: Continue needs to be sent while parsing the request headers and set a bool. This prevents it from being sent when it is not needed.
  2. If a continue needs to be sent, then on either the first read from the request or the first write to the response, flip the other bool and send the 100: Continue. This prevents it from being sent twice.
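
A rough sketch of that two-flag idea (field and method names are illustrative, not the actual async-h1 internals):

```rust
use std::sync::atomic::{AtomicBool, Ordering};

// Shared between the request-body reader and the response encoder.
struct ExpectState {
    // Set while parsing the request headers if `Expect: 100-continue` was present.
    continue_needed: AtomicBool,
    // Flipped the first time the interim response actually goes out.
    continue_sent: AtomicBool,
}

impl ExpectState {
    /// Called on the first read from the request body *and* on the first
    /// write to the response; returns true at most once, and only when the
    /// client asked to continue.
    fn should_send_continue(&self) -> bool {
        self.continue_needed.load(Ordering::SeqCst)
            && !self.continue_sent.swap(true, Ordering::SeqCst)
    }
}
```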

FWIW, I have an experimental branch which solves several of these related issues (and fixes pipelining), I'll open a draft PR with that soon (there's one test failing right now).

Ohhh, neat!

@Diggsey
Contributor Author

Diggsey commented Jan 25, 2021

Oh, I was thinking this would be done through two atomic bools (or equivalent).

Sorry, I didn't mean always sent. Rather, even if the "Expect" header is sent, there is an optimization: if the body is never read, we don't send the continue message at all. If we send the continue message on the first write, then that optimization won't ever kick in.

In my branch, I prevent this by having further writes to the stream block until a decision has been made about whether to send the "100 Continue" or not (if you drop the Body without reading it, it will not be sent; if you start reading from it, it will be sent). However, this is only an improvement rather than a complete fix: misuse (attempting to write before doing either of those things) will result in a deadlock rather than a corrupted response.

One possible solution other than those listed above would be to transparently buffer the writes in memory until the request body is either read or dropped. I'm leaning towards this one right now.
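
A minimal sketch of that buffering option, with illustrative names and a plain Vec<u8> sink standing in for the real async writer:

```rust
// Response bytes are held back until the server knows whether a
// `100 Continue` has to precede them on the wire.
enum ContinueDecision {
    Undecided,
    Decided, // body was read (continue sent) or dropped unread (continue skipped)
}

struct BufferedResponseWriter {
    decision: ContinueDecision,
    pending: Vec<u8>, // response bytes written before the decision was made
}

impl BufferedResponseWriter {
    fn write(&mut self, wire: &mut Vec<u8>, bytes: &[u8]) {
        match self.decision {
            ContinueDecision::Undecided => self.pending.extend_from_slice(bytes),
            ContinueDecision::Decided => wire.extend_from_slice(bytes),
        }
    }

    /// Called once the request body is first read (send_continue = true)
    /// or dropped unread (send_continue = false).
    fn decide(&mut self, wire: &mut Vec<u8>, send_continue: bool) {
        if send_continue {
            wire.extend_from_slice(b"HTTP/1.1 100 Continue\r\n\r\n");
        }
        // Flush whatever response data was written before the decision.
        wire.extend_from_slice(&self.pending);
        self.pending.clear();
        self.decision = ContinueDecision::Decided;
    }
}
```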

@Diggsey
Contributor Author

Diggsey commented Jan 26, 2021

there's one test failing right now

This seems to be caused by the ChunkedDecoder sometimes reading past the end of the request. It should probably have a BufRead bound instead of Read so that it can avoid this.
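
A small sketch of why a BufRead bound helps, using the synchronous std::io traits for brevity (async-h1 itself would use the async equivalents): fill_buf lets the decoder inspect buffered bytes and consume only the ones that belong to the current chunk, so it never pulls bytes belonging to the next request off the stream.

```rust
use std::io::{self, BufRead};

// Copy `remaining` bytes of chunk data out of `reader` without ever
// consuming bytes that belong to whatever follows the chunked body.
fn copy_chunk_data<R: BufRead>(
    reader: &mut R,
    mut remaining: usize,
    out: &mut Vec<u8>,
) -> io::Result<()> {
    while remaining > 0 {
        let buf = reader.fill_buf()?;
        if buf.is_empty() {
            return Err(io::ErrorKind::UnexpectedEof.into());
        }
        let take = remaining.min(buf.len());
        out.extend_from_slice(&buf[..take]);
        reader.consume(take); // anything past the chunk stays in the buffer
        remaining -= take;
    }
    Ok(())
}
```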
