
feat: emit diagnostics_channel events upon routing request #5252

Open · wants to merge 1 commit into base: main from tlhunter/diagnostics-channel
Conversation

@tlhunter (Contributor) commented Jan 3, 2024

This adds support for emitting diagnostics channel events for a request. It's an extension of #5105.

The dc-polyfill package provides a polyfill of the diagnostics_channel module for older versions of Node.js, which is why the existing diagnostics channel call has been removed from a conditional in fastify.js. Note that dc-polyfill falls back to the built-in diagnostics channel when it is present; that's why the tests require diagnostics channel directly instead of the polyfill, to show that it works as expected.

Here are some things I'm wondering about:

  • extracting payload from reply.send()
  • rebased on main
  • tests
  • docs

Benchmark results

main

┌─────────┬──────┬──────┬───────┬───────┬────────┬─────────┬────────┐
│ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%   │ Avg    │ Stdev   │ Max    │
├─────────┼──────┼──────┼───────┼───────┼────────┼─────────┼────────┤
│ Latency │ 4 ms │ 9 ms │ 13 ms │ 13 ms │ 8.3 ms │ 4.19 ms │ 365 ms │
└─────────┴──────┴──────┴───────┴───────┴────────┴─────────┴────────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬────────────┬──────────┬─────────┐
│ Stat      │ 1%      │ 2.5%    │ 50%     │ 97.5%   │ Avg        │ Stdev    │ Min     │
├───────────┼─────────┼─────────┼─────────┼─────────┼────────────┼──────────┼─────────┤
│ Req/Sec   │ 104,639 │ 104,639 │ 114,495 │ 116,735 │ 113,796.27 │ 2,451.61 │ 104,582 │
├───────────┼─────────┼─────────┼─────────┼─────────┼────────────┼──────────┼─────────┤
│ Bytes/Sec │ 19.7 MB │ 19.7 MB │ 21.5 MB │ 22 MB   │ 21.4 MB    │ 461 kB   │ 19.7 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴────────────┴──────────┴─────────┘

this change

┌─────────┬──────┬──────┬───────┬───────┬─────────┬─────────┬────────┐
│ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%   │ Avg     │ Stdev   │ Max    │
├─────────┼──────┼──────┼───────┼───────┼─────────┼─────────┼────────┤
│ Latency │ 4 ms │ 9 ms │ 13 ms │ 13 ms │ 8.21 ms │ 4.23 ms │ 364 ms │
└─────────┴──────┴──────┴───────┴───────┴─────────┴─────────┴────────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬───────────┬──────────┬─────────┐
│ Stat      │ 1%      │ 2.5%    │ 50%     │ 97.5%   │ Avg       │ Stdev    │ Min     │
├───────────┼─────────┼─────────┼─────────┼─────────┼───────────┼──────────┼─────────┤
│ Req/Sec   │ 107,455 │ 107,455 │ 115,455 │ 117,503 │ 114,822.4 │ 2,261.46 │ 107,414 │
├───────────┼─────────┼─────────┼─────────┼─────────┼───────────┼──────────┼─────────┤
│ Bytes/Sec │ 20.2 MB │ 20.2 MB │ 21.7 MB │ 22.1 MB │ 21.6 MB   │ 426 kB   │ 20.2 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴───────────┴──────────┴─────────┘


@tlhunter (author) commented Jan 3, 2024

/cc @Qard

@Qard left a comment

Left a few suggestions. Basically, use try/finally to more reliably ensure the end events happen. It should probably also include a catch so that any unexpected errors get reported as well.

lib/handleRequest.js (outdated; resolved)
reply[kReplyIsError] = true
reply.send(err)
channels.end.publish(store)

Rather than putting the end in all the branches, you could use try/finally. That also ensures the event still fires if something throws.

@@ -28,6 +33,8 @@ function wrapThenable (thenable, reply) {
reply.send(err)
}
}

if (channels) channels.asyncEnd.publish(store)

Should do try/finally here too.

@mcollina (Member) left a comment

Given dc-polyfill, I think we can land this on main if it does not introduce overhead when disabled.

fastify.js (outdated)
} catch (e) {
// This only happens if `diagnostics_channel` isn't available, i.e. earlier
// versions of Node.js. In that event, we don't care, so ignore the error.
const dc = require('dc-polyfill')
Member:

move this to the top of the file

@jsumners (Member) commented Jan 4, 2024

Also, I'd rather it be named at least diagnostics instead of dc, in order to avoid having to look up what "dc" means.

lib/wrapThenable.js (resolved)
@mcollina (Member) commented Jan 4, 2024

Is there a way to get the payload sent via reply.send()?

inside reply.send() or when executing the onSend hook.

@Uzlopak (Contributor) commented Jan 4, 2024

What about the diagnostics_channel plugin?

docs/Reference/Hooks.md (outdated; 3 resolved review threads)
@Uzlopak (Contributor) commented Jan 4, 2024

Clarifying my last comment. How does this PR relate to

https://github.com/fastify/fastify-diagnostics-channel

?

Seems like this PR makes the plugin partially redundant?

@Qard commented Jan 4, 2024

Definitely overlaps with the plugin, though this is more limited to just producing a trace around a request whereas the plugin gives a bit more control. This might benefit from a few more events to cover the other use cases provided by the plugin.

The main benefit I see of this living directly in the library, rather than only in the plugin, is that it enables automatic-instrumentation tracers to listen without producing side effects, whereas adding the plugin could have user-visible behaviour. There are also likely performance advantages to going direct rather than through the plugin.

@jsumners (Member) commented Jan 4, 2024

I think instrumentation hooks such as these should be in core. They provide a way for other instrumentations to do things via a specific plugin or wrapping package. If we were trying to add such hooks via some third party library, like dtrace, then my opinion would be different. But the API used in this PR is meant to be a core API.

@tlhunter (author) commented Jan 4, 2024

> What about the diagnostics_channel plugin?

@Uzlopak I think you're asking about the removal of the diagnostics_channel npm package?

If so, the dc-polyfill package makes the diagnostics_channel npm package obsolete. The way diagnostics_channel worked is that a snapshot of the internal diagnostics_channel module was published as an npm package of the same name. On ancient Node.js versions, where the internal diagnostics_channel module was missing, a call to require('diagnostics_channel') would pick up the npm package, while the same call on newer versions of Node.js simply retrieves the internal module. However, the diagnostics_channel package had shortcomings: if multiple versions of it existed in a node_modules/ dependency tree, they wouldn't cross-communicate. The package also hasn't been maintained and is missing newer features that this PR depends on. dc-polyfill alleviates those issues and many more.

If anyone is curious, please check out the dc-polyfill docs for more info. Ultimately, dc-polyfill is a drop-in replacement for both the diagnostics_channel npm package and the internal module.

@Qard commented Jan 4, 2024

> I think instrumentation hooks such as these should be in core. They provide a way for other instrumentations to do things via a specific plugin or wrapping package. If we were trying to add such hooks via some third party library, like dtrace, then my opinion would be different. But the API used in this PR is meant to be a core API.

It is a core API. The dc-polyfill API just enables supporting the newer features beyond the limited backporting window of Node.js core. With dc-polyfill we had it tested and working all the way back to Node.js 12. It uses the core API where available though, so it should always be a drop-in replacement for using the core API as if using the current version of Node.js.

As for the plugin versus embedding in the library, I much favour embedding in the library, as the diagnostics_channel API (and dc-polyfill on top of it) is aimed at very high performance and at being as close to zero-cost as possible when nothing is listening. Abstracting that behind a plugin adds a bunch of unnecessary overhead, as Fastify then loses the context of what is ignorable; the plugin barrier isn't designed to forward that information.

I created diagnostics_channel, and it is quite specifically my intent that it be used directly in libraries in this way, as that gets us a feed of diagnostics data as close to the source as possible, eliminating a ton of instrumentation cost and making it much, much easier to gather instrumentation insights. The existing model of monkey-patching everything is highly unreliable and very unapproachable for the average developer as a way to gain insight into their applications. I intended for diagnostics_channel to encourage library developers to start thinking about exposing diagnostics data directly, and about the performance profile of their code.

@tlhunter (author) commented Jan 5, 2024

> inside reply.send() or when executing the onSend hook.

@mcollina I'm now grabbing the payload within Reply instances and setting it on itself using a new symbol: 92f77e4

That said, I'm not sure if there are expectations on how long Reply instances stick around, or if there may be memory concerns. Let me know if this approach isn't ideal and I'll try something else.

@Uzlopak (Contributor) commented Jan 5, 2024

I wonder if we should then integrate the whole diagnostics-channel stuff from the plugin into core.

lib/reply.js (outdated)
@@ -66,6 +67,7 @@ function Reply (res, request, log) {
this[kReplyTrailers] = null
this[kReplyHasStatusCode] = false
this[kReplyStartTime] = undefined
this[kReplyPayload] = undefined
Member:

Only concern is when the payload is a stream. How can the subscriber safely consume the stream for logging? The same problem existed when I implemented the trailer feature, and it is not resolved to this day. Reference: #4373

Member:

I can see so many ways this can end badly long term. Can you avoid storing the payload inside reply?

@tlhunter (author) commented Jan 5, 2024

The latest commit no longer attaches the payload value to the Reply instance: 325ba54

@mcollina (Member) commented Jan 5, 2024

I think you can target main for this.

@mcollina (Member) commented Jan 5, 2024

> I wonder if we should then integrate the whole diagnostics-channel stuff from the plugin into core.

Agreed

@tlhunter tlhunter force-pushed the tlhunter/diagnostics-channel branch from 4b3a0cb to 1b00702 Compare January 5, 2024 19:48
@tlhunter tlhunter changed the base branch from next to main January 5, 2024 19:49
@@ -158,7 +163,6 @@ function preHandlerCallback (err, request, reply) {
if (result !== null && typeof result.then === 'function') {
wrapThenable(result, reply, channels, store)
} else {
store.result = result
@tlhunter (author):

This line becomes redundant as the kReplyOnSend callback sets the value.

@tlhunter (author) commented Jan 5, 2024

> I think you can target main for this.

This PR is now based on main. Ended up squashing due to conflicts.

@mcollina (Member) left a comment

Good work! I think a few edge cases are missing:

  1. reply.callNotFound() // this will execute the not found handler
  2. Fastify error handlers (nested)
  3. routes with setImmediate(() => reply.send('hello world'))
  4. async routes with setImmediate(() => reply.send('hello world')); return reply

The reason I'm asking is that it will definitely pass through reply.send() multiple times, both synchronously and asynchronously.

@@ -5,28 +5,37 @@ const {
kReplyHijacked
} = require('./symbols')

function wrapThenable (thenable, reply) {
function wrapThenable (thenable, reply, channels, store) {
Member:

I don't think this function should be modified; I would recommend keeping the change between preHandlerCallback and reply.send(). Specifically, I would emit asyncStart and asyncEnd within reply.send(), as there are multiple ways to call that function (synchronously and asynchronously).

@tlhunter (author):
@mcollina So far it seems that when a sync request handler returns a simple value, that value is used as the response; when a request handler returns a promise, the resolved value is used as the response; and whenever reply.send() is called, that value is used as the response. Is it safe to assume those are the canonical ways Fastify determines that a response is finished?

@mcollina (Member):

Yes. The error flow resets all this for the error handler.

@Uzlopak (Contributor) commented Jan 11, 2024

@mcollina

Should this PR also integrate the plugin's behaviour? Or should this be a follow-up?

@mcollina (Member):
> @mcollina
>
> Should this PR also integrate the plugin's behaviour? Or should this be a follow-up?

Can you clarify?

@tlhunter (author):

@mcollina I think @Uzlopak is asking if I should copy all of the code from the existing fastify-diagnostics-channel package and incorporate it into my PR. I'm of the opinion that it shouldn't be included in this PR. While that package is semi-related in scope, it uses a different pattern, and this PR is just attempting to implement the TracingChannel "spec". Plus I'm going to write some blog posts that link to this PR to share with the community and I'd like to keep the PR as succinct and isolated as possible to avoid confusing the audience.

@mcollina (Member):

A follow-up would be good; this will already be massive after the fixes I mentioned.

@tlhunter (author):

@mcollina Just so I know if I'm on the right track, do you think I need to refactor a bunch of Fastify to achieve those changes?

@mcollina (Member):

No, but possibly change the approach you have taken with this PR.

@tlhunter tlhunter marked this pull request as ready for review February 26, 2024 23:31
}

let result
if (context[kFourOhFourContext] === null) {
@tlhunter (author):

@mcollina I was able to fix the issue with 404 routes executing twice with this check. But I'm wondering if the solution is now too specific to handling 404 errors, and whether other situations can trigger the same problem?

Member:

Possibly the error path, but I think you handled that already.

@mcollina (Member) left a comment

Good job, just one nit and this should be good to go


route: {
url: context.config.url,
method: context.config.method
}
Member:

Can we avoid allocating those two objects if there are no diagnostics channels?

@tlhunter (author):

The creation of the store object is now wrapped in a check to see if there are any listeners on any of the channels.

@tlhunter tlhunter force-pushed the tlhunter/diagnostics-channel branch 7 times, most recently from a181582 to 8e1b54b Compare March 4, 2024 17:47
@tlhunter (author) commented Mar 4, 2024

@mcollina I modified the code to check for the presence of listeners on any of the DC channels. If there aren't any listeners, the code skips the store object creation paths and also skips the runStores() function call, which adds a closure. Here's an updated benchmark run:

main

┌─────────┬──────┬──────┬───────┬───────┬─────────┬─────────┬───────┐
│ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%   │ Avg     │ Stdev   │ Max   │
├─────────┼──────┼──────┼───────┼───────┼─────────┼─────────┼───────┤
│ Latency │ 4 ms │ 9 ms │ 13 ms │ 13 ms │ 8.08 ms │ 2.84 ms │ 62 ms │
└─────────┴──────┴──────┴───────┴───────┴─────────┴─────────┴───────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬────────────┬──────────┬─────────┐
│ Stat      │ 1%      │ 2.5%    │ 50%     │ 97.5%   │ Avg        │ Stdev    │ Min     │
├───────────┼─────────┼─────────┼─────────┼─────────┼────────────┼──────────┼─────────┤
│ Req/Sec   │ 105,535 │ 105,535 │ 117,695 │ 120,639 │ 116,494.94 │ 3,278.49 │ 105,482 │
├───────────┼─────────┼─────────┼─────────┼─────────┼────────────┼──────────┼─────────┤
│ Bytes/Sec │ 19.8 MB │ 19.8 MB │ 22.1 MB │ 22.7 MB │ 21.9 MB    │ 619 kB   │ 19.8 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴────────────┴──────────┴─────────┘

tlhunter/diagnostics-channel

┌─────────┬──────┬──────┬───────┬───────┬─────────┬─────────┬───────┐
│ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%   │ Avg     │ Stdev   │ Max   │
├─────────┼──────┼──────┼───────┼───────┼─────────┼─────────┼───────┤
│ Latency │ 4 ms │ 9 ms │ 13 ms │ 13 ms │ 8.01 ms │ 2.78 ms │ 56 ms │
└─────────┴──────┴──────┴───────┴───────┴─────────┴─────────┴───────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬────────────┬──────────┬─────────┐
│ Stat      │ 1%      │ 2.5%    │ 50%     │ 97.5%   │ Avg        │ Stdev    │ Min     │
├───────────┼─────────┼─────────┼─────────┼─────────┼────────────┼──────────┼─────────┤
│ Req/Sec   │ 103,295 │ 103,295 │ 118,847 │ 121,023 │ 117,486.94 │ 3,489.57 │ 103,245 │
├───────────┼─────────┼─────────┼─────────┼─────────┼────────────┼──────────┼─────────┤
│ Bytes/Sec │ 19.4 MB │ 19.4 MB │ 22.3 MB │ 22.8 MB │ 22.1 MB    │ 656 kB   │ 19.4 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴────────────┴──────────┴─────────┘

@Qard left a comment

LGTM, just some minor nits, but feel free to ignore them. 😄


if (result !== undefined) {
if (result !== null && typeof result.then === 'function') {
if (store) store.async = true

Might as well put this inside wrapThenable if you're passing in store already anyway. 🤔

Comment on lines 59 to 64
// try-catch allow to re-throw error in error handler for async handler
try {
reply.send(err)
} catch (err) {
reply.send(err)
}

Could probably just raise this try/catch into a merge with the outer try/finally.

@mcollina (Member) left a comment

lgtm

@mcollina (Member) commented Mar 5, 2024

Can you run a final bench?

@tlhunter (author) commented Mar 5, 2024

@mcollina I ran it a couple times and the results are always about the same:

$ clear ; gc tlhunter/diagnostics-channel && npm run benchmark && gc main && npm run benchmark
Switched to branch 'tlhunter/diagnostics-channel'
Your branch is up to date with 'tlhunter/tlhunter/diagnostics-channel'.

> fastify@4.26.2 benchmark
> concurrently -k -s first "node ./examples/benchmark/simple.js" "autocannon -c 100 -d 30 -p 10 localhost:3000/"

[1] Running 30s test @ http://localhost:3000/
[1] 100 connections with 10 pipelining factor
[1]
[1]
[1] ┌─────────┬──────┬──────┬───────┬───────┬─────────┬─────────┬───────┐
[1] │ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%   │ Avg     │ Stdev   │ Max   │
[1] ├─────────┼──────┼──────┼───────┼───────┼─────────┼─────────┼───────┤
[1] │ Latency │ 4 ms │ 9 ms │ 12 ms │ 13 ms │ 7.77 ms │ 2.73 ms │ 55 ms │
[1] └─────────┴──────┴──────┴───────┴───────┴─────────┴─────────┴───────┘
[1] ┌───────────┬─────────┬─────────┬─────────┬─────────┬────────────┬──────────┬─────────┐
[1] │ Stat      │ 1%      │ 2.5%    │ 50%     │ 97.5%   │ Avg        │ Stdev    │ Min     │
[1] ├───────────┼─────────┼─────────┼─────────┼─────────┼────────────┼──────────┼─────────┤
[1] │ Req/Sec   │ 104,511 │ 104,511 │ 122,175 │ 124,735 │ 120,834.14 │ 4,364.21 │ 104,463 │
[1] ├───────────┼─────────┼─────────┼─────────┼─────────┼────────────┼──────────┼─────────┤
[1] │ Bytes/Sec │ 19.6 MB │ 19.6 MB │ 23 MB   │ 23.5 MB │ 22.7 MB    │ 822 kB   │ 19.6 MB │
[1] └───────────┴─────────┴─────────┴─────────┴─────────┴────────────┴──────────┴─────────┘
[1]
[1] Req/Bytes counts sampled once per second.
[1] # of samples: 30
[1]
[1] 3626k requests in 30.02s, 681 MB read
[1] autocannon -c 100 -d 30 -p 10 localhost:3000/ exited with code 0
--> Sending SIGTERM to other processes..
[0] node ./examples/benchmark/simple.js exited with code SIGTERM
Switched to branch 'main'
Your branch is up to date with 'origin/main'.

> fastify@4.26.2 benchmark
> concurrently -k -s first "node ./examples/benchmark/simple.js" "autocannon -c 100 -d 30 -p 10 localhost:3000/"

[1] Running 30s test @ http://localhost:3000/
[1] 100 connections with 10 pipelining factor
[1]
[1]
[1] ┌─────────┬──────┬──────┬───────┬───────┬─────────┬─────────┬───────┐
[1] │ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%   │ Avg     │ Stdev   │ Max   │
[1] ├─────────┼──────┼──────┼───────┼───────┼─────────┼─────────┼───────┤
[1] │ Latency │ 4 ms │ 9 ms │ 12 ms │ 13 ms │ 7.95 ms │ 2.77 ms │ 61 ms │
[1] └─────────┴──────┴──────┴───────┴───────┴─────────┴─────────┴───────┘
[1] ┌───────────┬─────────┬─────────┬─────────┬─────────┬───────────┬──────────┬─────────┐
[1] │ Stat      │ 1%      │ 2.5%    │ 50%     │ 97.5%   │ Avg       │ Stdev    │ Min     │
[1] ├───────────┼─────────┼─────────┼─────────┼─────────┼───────────┼──────────┼─────────┤
[1] │ Req/Sec   │ 106,495 │ 106,495 │ 118,591 │ 121,727 │ 118,182.4 │ 3,054.91 │ 106,448 │
[1] ├───────────┼─────────┼─────────┼─────────┼─────────┼───────────┼──────────┼─────────┤
[1] │ Bytes/Sec │ 20 MB   │ 20 MB   │ 22.3 MB │ 22.9 MB │ 22.2 MB   │ 574 kB   │ 20 MB   │
[1] └───────────┴─────────┴─────────┴─────────┴─────────┴───────────┴──────────┴─────────┘
[1]
[1] Req/Bytes counts sampled once per second.
[1] # of samples: 30
[1]
[1] 3546k requests in 30.02s, 667 MB read
[1] autocannon -c 100 -d 30 -p 10 localhost:3000/ exited with code 0
--> Sending SIGTERM to other processes..
[0] node ./examples/benchmark/simple.js exited with code SIGTERM

@mcollina (Member) commented Mar 7, 2024

@jsumners @Eomm ptal

@jsumners (Member) left a comment

Looks good to me. Just a note for upstream API design.

lib/handleRequest.js (resolved)
@Uzlopak (Contributor) commented Mar 9, 2024

What I really don't understand is that, according to @Qard, the publish function is a no-op when no subscribers exist. So why do we even have to check hasSubscribers?

@Qard commented Mar 9, 2024

The hasSubscribers helper lets the publisher skip message construction when it would otherwise create a new object for the message. If it's just passing an existing object, the check isn't necessary, but here we are creating a new object.

@tlhunter (author):

> What I really don't understand is that, according to @Qard, the publish function is a no-op when no subscribers exist. So why do we even have to check hasSubscribers?

It was to address Matteo's object creation concern: #5252 (comment)

@tlhunter (author):

FYI, I'll probably have @Qard address any follow-up concerns with this PR.

@jsumners (Member):

Can we get the helper function added to dc-polyfill and then utilized in this PR before merging?

@Qard commented Mar 19, 2024

The PR for it is here: DataDog/dc-polyfill#10. It still needs some work, as tests were failing on some Node.js versions; I haven't identified why yet, as I've been busy with other things for a bit. 😅

@mcollina (Member) commented May 7, 2024

Any updates on this? It would be good to land it for v5.

@mcollina (Member) commented May 7, 2024

In v5, our minimum supported version will be latest v18.

@tlhunter (author) commented May 7, 2024

I'm back from parental leave and will be working on landing this PR again. It looks like @Qard was working on landing a change in dc-polyfill but hit a roadblock. I'll look at his PR and see if I can help out there.

Other than that it looks like things are good to go.

I did just do a rebase on main, and it looks like there are some failures about the package version mismatching Fastify's version; I'm not sure whether that's related to this change, though.

@tlhunter tlhunter force-pushed the tlhunter/diagnostics-channel branch from 8d204d6 to 69a421d Compare May 7, 2024 19:15
8 participants