Same level multistream logging is not working with "dedupe" #1167
The current implementation works as expected. If you would like to add a new feature, I'll be happy to review a PR.
Thanks for the response! How would you imagine this feature being implemented? A) Changing the 'if' condition and removing the
If you like the first one, and it passes all the tests as before, I would be glad to send a PR; but if you would like to see new options, new tests, and additions to the docs, I'm afraid I have to hand this task over to someone with more experience.
Well, dedupe works as expected, so it's a new option. No worries, I'll close this then.
All right, I understand if you don't want to change that; in that case, maybe I will try to create a PR for this some day.
ah, I might have missed the problem then. a PR with A) will be good.
Sounds great, then here is a PR for that:
This issue has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
If I set `dedupe` to true, the option works as stated in the docs, but it also breaks the possibility of having multiple streams for the same level: only the "first" stream of the given level is used, not all the streams matching that level. I'm not sure whether this is the intended behaviour or something you might want to change. I understand that the current behaviour is exactly what "deduping" means, but in my (and, I think, most) use cases we would expect dedupe to send logs only to the streams with the highest matching level, but to all of them. I think changing the current multistream write function to the following would do the trick:
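The snippet itself is missing from the thread, but the idea can be illustrated with a simplified, standalone sketch (not pino's actual source; the `multistream` factory below and its stream-descriptor shape are assumptions for illustration). With streams sorted by level in descending order, the write loop keeps going while the next stream shares the same matching level, so every stream registered at the highest matching level receives the line:

```javascript
// Simplified sketch of a deduping multistream: a log line is written to
// ALL streams registered at the highest level <= the line's level,
// instead of stopping after the first matching stream.
function multistream(streamsArray) {
  // Sort descending by level so the highest matching level is seen first.
  const streams = [...streamsArray].sort((a, b) => b.level - a.level);

  return {
    write(line, level) {
      let matched = null;
      for (const dest of streams) {
        // Skip streams whose threshold is above this line's level.
        if (dest.level > level) continue;
        // Once we drop below the first matched level, stop (dedupe).
        if (matched !== null && dest.level !== matched) break;
        matched = dest.level;
        // Unlike the current behaviour, do NOT stop after one write:
        // every stream at this same level gets the line.
        dest.stream.write(line);
      }
    },
  };
}

// Hypothetical usage: two level-30 streams and one level-50 stream.
const lines = [];
const ms = multistream([
  { level: 30, stream: { write: (l) => lines.push('infoA:' + l) } },
  { level: 30, stream: { write: (l) => lines.push('infoB:' + l) } },
  { level: 50, stream: { write: (l) => lines.push('error:' + l) } },
]);
ms.write('hello', 30); // reaches both level-30 streams, not the level-50 one
```

After the call above, `lines` contains `['infoA:hello', 'infoB:hello']`: deduping still keeps the line away from lower-level streams, but no longer away from a sibling stream at the same level.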