
Hosted Suppression File #4723

Closed
jeremylong opened this issue Aug 2, 2022 · 12 comments
@jeremylong
Owner

Due to situations like #4670, #4671, #4677, and #4690, ODC needs to be able to respond faster and provide an updated suppression file when such false positives occur.

  1. Implement a hosted suppression file - ODC will download the file to its local cache daily (a minimal sketch of this behavior follows this list). This will be controllable by a switch so those working in an offline mode can disable the feature.
  2. Update the issue-ops for false positives so that if/when a core maintainer (@aikebah, @nhumblot, or myself) gives the GitHub Action bot's comment a thumbs up on the generated suppression rule (e.g., the comment on #4722), the generated suppression rule is copied into the hosted suppression file, the updated suppression file is tested (to ensure it does not break ODC by being unparsable for some reason), and the updated file is then automatically published.
    • Note, this will allow the team at release time to copy the suppressions from the hosted file into the release.
    • A schedule for migrating the suppressions from the hosted file into the base suppression rules will need to be determined. We don't want to force someone to upgrade, but we also don't want to maintain the rules in two places.
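
A minimal sketch of the caching behavior described in item 1, assuming a one-day refresh window; the class, cache path, and parameter names here are illustrative and not ODC's actual API:

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.URI;
import java.nio.file.*;
import java.time.Duration;
import java.time.Instant;

public class HostedSuppressionCache {
    private static final URI HOSTED_URL = URI.create(
            "https://jeremylong.github.io/DependencyCheck/suppressions/publishedSuppressions.xml");
    private static final Duration MAX_AGE = Duration.ofDays(1);

    /** Returns the locally cached file, refreshing it if missing or older than a day. */
    public static Path getSuppressionFile(Path cacheDir, boolean hostedSuppressionsEnabled)
            throws IOException {
        if (!hostedSuppressionsEnabled) {
            return null; // offline mode: caller falls back to the base suppression rules
        }
        Path cached = cacheDir.resolve("publishedSuppressions.xml");
        if (Files.exists(cached)) {
            Instant lastRefresh = Files.getLastModifiedTime(cached).toInstant();
            if (Duration.between(lastRefresh, Instant.now()).compareTo(MAX_AGE) < 0) {
                return cached; // cache is still fresh; no network access needed
            }
        }
        Files.createDirectories(cacheDir);
        try (InputStream in = HOSTED_URL.toURL().openStream()) {
            Files.copy(in, cached, StandardCopyOption.REPLACE_EXISTING);
        }
        return cached;
    }
}
```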
@albuch
Contributor

albuch commented Aug 2, 2022

Please make this resilient against network failures and unavailability of the resource (fall back to the bundled suppression file instead).
We already have a long enough history of data source availability impacting DependencyCheck.
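
One way to honor this request, sketched under the assumption that the engine resolves the suppression source up front (all names are illustrative): any failure to fetch the hosted file silently degrades to the copy bundled with the release rather than failing the scan.

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.URI;
import java.nio.file.*;

public class ResilientSuppressionSource {
    private static final URI HOSTED_URL = URI.create(
            "https://jeremylong.github.io/DependencyCheck/suppressions/publishedSuppressions.xml");

    /** Prefer the hosted file, but never fail the scan when the network does. */
    public static Path resolve(Path cachedHostedFile, Path bundledFile) {
        try (InputStream in = HOSTED_URL.toURL().openStream()) {
            Files.copy(in, cachedHostedFile, StandardCopyOption.REPLACE_EXISTING);
            return cachedHostedFile;
        } catch (IOException unreachableOrCorrupt) {
            // Network failure or unavailable resource: degrade to the bundled copy.
            return bundledFile;
        }
    }
}
```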

@OrangeDog

Just to note that the bot-generated suppression is often not good, as it doesn't understand which aspect is wrong, or which additional versions and/or packages would be affected.

@aikebah
Collaborator

aikebah commented Aug 8, 2022

I would say it's often fine, though sometimes multiple FP reports can be combined into a single rule. For many of the reports I could, after review, take over the generated suppression as-is; some I had to modify, and some were plain wrong.
Having a quick path for the approval and processing of the obviously correct ones eases the process. Those that require special handling would follow a regular code-and-PR process, but the obvious ones would be as plain and simple (or even simpler, I guess) as the various Dependabot PRs that I was able to quickly merge, after changelog review, from a mobile device by pushing a few approve/merge buttons.

@jeremylong
Owner Author

The generated suppression file based on approving FP reports can be found here: https://jeremylong.github.io/DependencyCheck/suppressions/publishedSuppressions.xml

@aikebah
Collaborator

aikebah commented Sep 13, 2022

@jeremylong did you already start efforts on part 2 - building the usage and caching of the hosted suppression file? Or is that an activity still to be started?

@jeremylong
Owner Author

I have not started building that part of the solution yet - it's planned for the 8.0.0 release. If you want to take that on - great. Otherwise I'll get to it as I can ;)

@aikebah
Collaborator

aikebah commented Sep 20, 2022

@jeremylong I'll start working on implementing that part for the 8.0.0 milestone soon... first I'll triage a few of the outstanding FP reports so that our users can already start consuming our hosted suppression file by manually configuring it as an additional suppression file (an example follows below).
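
For users who want to do this manually today, something like the following should work with the Maven plugin, assuming the plugin's suppressionFiles parameter accepts URLs as well as local paths:

```xml
<plugin>
    <groupId>org.owasp</groupId>
    <artifactId>dependency-check-maven</artifactId>
    <configuration>
        <suppressionFiles>
            <!-- project-local rules -->
            <suppressionFile>suppressions.xml</suppressionFile>
            <!-- the hosted file published from approved FP reports -->
            <suppressionFile>https://jeremylong.github.io/DependencyCheck/suppressions/publishedSuppressions.xml</suppressionFile>
        </suppressionFiles>
    </configuration>
</plugin>
```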

@jeremylong
Owner Author

@aikebah thanks!

@aikebah
Collaborator

aikebah commented Sep 24, 2022

First draft implementation is nearing completion; I expect to push the branch as a work in progress today. I still need to see whether I also manage to incorporate setting the configurable parts (download frequency, hosted suppressions URL) into the advanced settings of the integrations; otherwise those will follow at a later time.

@aikebah
Collaborator

aikebah commented Sep 24, 2022

@jeremylong feel free to comment on my initial implementation already. The pending work is adding configurability of the flags added to the Settings to the various integrations, and extending the documentation to take this new 'internet required' functionality into account where applicable. I consider the core-engine integration of the hosted suppressions file, as published on the use-hosted-suppressions branch, complete.

@aikebah added this to the 8.0.0 milestone Sep 25, 2022
@jeremylong
Owner Author

@aikebah quick review - looks good. I've generally added the default values for the settings to the properties files themselves, but having the defaults in the Java code is fine as well (a sketch of both options is below). The only other thing I might question is the use of the cached data source directly in the analyzer - especially considering how this may play out on a large multi-module Maven project set up to run on all modules.
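
For illustration, here are the two places a default can live, using a made-up hosted.suppressions.url key rather than ODC's real setting names:

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public class SettingsDefaults {
    private static final String DEFAULT_HOSTED_URL =
            "https://jeremylong.github.io/DependencyCheck/suppressions/publishedSuppressions.xml";

    public static String hostedSuppressionsUrl(Properties props) {
        // Option 1: ship the key in the bundled properties file, so this fallback is unused.
        // Option 2: keep the default here in the Java code, used whenever the key is absent.
        return props.getProperty("hosted.suppressions.url", DEFAULT_HOSTED_URL);
    }

    public static void main(String[] args) throws IOException {
        Properties props = new Properties();
        try (InputStream in =
                SettingsDefaults.class.getResourceAsStream("/dependencycheck.properties")) {
            if (in != null) {
                props.load(in);
            }
        }
        System.out.println(hostedSuppressionsUrl(props));
    }
}
```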

@aikebah
Collaborator

aikebah commented Oct 1, 2022

The only other thing I might question is the use of the cached data source directly in the analyzer

You'd rather see it operate on a copy of the resource, like the RetireJS and CVE database? That would be fine with me as well, but I thought that with the one-time init in rules loading I could spare the extra I/O and space usage of the copy action. I don't mind adding in that defensive copy just in case (sketched below); I will do so after adding the configuration to the integrations.
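
The defensive copy in question could look roughly like this (illustrative names; the trade-off is exactly the extra I/O and disk usage mentioned above):

```java
import java.io.IOException;
import java.nio.file.*;

public class DefensiveCopy {
    /** Gives the analyzer a private copy so a cache refresh cannot swap the file mid-scan. */
    public static Path copyForAnalyzer(Path cachedHostedFile, Path workDir) throws IOException {
        Files.createDirectories(workDir);
        Path privateCopy = workDir.resolve(cachedHostedFile.getFileName());
        return Files.copy(cachedHostedFile, privateCopy, StandardCopyOption.REPLACE_EXISTING);
    }
}
```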

In any case, I have a separate issue/PR cooking to update the downloader to take a defensive approach to updating the web resources: instead of downloading in place, it downloads to a temp file inside the data directory and replaces the destination only at the end of a successful download. I think that would get rid of most risks of a corrupted web resource (I would expect most corruptions to be an empty file due to a started-but-failed download).
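
A sketch of that download pattern, under the assumption of a single-filesystem data directory (class and method names are illustrative):

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.URI;
import java.nio.file.*;

public class DefensiveDownloader {
    public static void download(URI source, Path destination) throws IOException {
        // Create the temp file next to the destination so the final move never crosses filesystems.
        Path temp = Files.createTempFile(destination.getParent(), "download-", ".tmp");
        try (InputStream in = source.toURL().openStream()) {
            Files.copy(in, temp, StandardCopyOption.REPLACE_EXISTING);
            // Only a fully successful download ever replaces the real resource.
            Files.move(temp, destination, StandardCopyOption.REPLACE_EXISTING);
        } finally {
            // A failed or interrupted download leaves the destination untouched.
            Files.deleteIfExists(temp);
        }
    }
}
```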
