feat(WIP): support stream trailer #4373
Conversation
The change that makes trailer support promises and callbacks is very good. However, I don't think the sendStreamTrailer change is correct, due to the memory spike.
A memory-friendly solution is to use https://github.com/mcollina/cloneable-readable to create a clone of the original stream, piping one copy as the response and the other to the trailer-generation logic. This would also imply that we must call the trailer function immediately, not at the end of the response. Unfortunately, that would be a breaking change.
The good news is that once reply.trailer can be async, stream support can be implemented in userland on top of it: create a clone of the stream in the main route handler and use it to compute the trailer value. This could be a good fit for a plugin:
```js
const newStream = reply.wrapStreamWithTrailers(stream, {
  Etag: function (reply, payload, cb) {
    // fill in
    setImmediate(cb, null, '42')
  }
})
return newStream
```
lib/reply.js
Outdated
```js
// the trailer attach should fire immediately,
// since Cloneable will block sending unless
// all the cloned copies are properly attached
sendTrailer(payload, res, reply)
```
Unfortunately, this is the breaking change I mentioned.
test/reply-trailers.test.js
Outdated
```js
fastify.get('/', function (request, reply) {
  reply.trailer('ETag', function (reply, payload, done) {
    t.same(typeof payload.pipe === 'function', true)
    payload.on('end', () => {
```
A new issue has come up: I have no way to know when payload has ended when I use payload.pipe(res, { end: false }) in sendStream.
```js
const _trailers = Object.entries(reply[kReplyTrailers]).filter(([k, v]) => typeof v === 'function')
```
Is there a good alternative that avoids using .filter? The previous implementation, which used a single variable, would count incorrectly and had a race condition.
cc @Uzlopak, maybe you have some ideas?
You can implement the same with a for loop
I am closing this PR since I don't have more time to spend on it.
I can give a hand if anyone would like to pick it up. Just ping me.
We have quite a few months before we can v5. Maybe you can revisit this in a few months?
Yes. One solution is a
This pull request has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
Trial PR to support stream trailers.
I open this PR as a WIP to collect feedback. For example, is there a better solution?
Checklist
- run npm run test and npm run benchmark
- ... and the Code of conduct