Add basic visual regression testing #158
Conversation
Example failure can be seen at #159:
This particular issue seems to be an overflow problem causing the horizontal width of the page to grow after the USWDS upgrade. The diff is not especially useful here, since the screenshots are so drastically different.
LGTM! Love how we're using the deployed site instead of checking in large image assets to the repo
For "acknowledgement," I bet we could get pretty far by making the visual diff a separate, non-required CircleCI job, and making sure we leave some sort of "I'm acknowledging the diff error" comment when we merge?
Co-Authored-By: Zach Margolis <zbmargolis@gmail.com>
I like the idea, both in how it would show as a separate check entry and in (I assume?) offering a simple way to parallelize these long-running tests independently from the rest. As far as how to actually implement this between GitHub Checks and CircleCI conditions: is your thinking to allow this to show as a failure, but still be mergeable from within GitHub? Or were you thinking of something different?

It does remind me of another alternative I'd had in mind, along the same line of thinking: turning this less into an assertive failure and more into an "FYI, these pages changed, in case you weren't expecting it." I'd even thought of making some sort of standalone GitHub Action which wouldn't integrate as a status check, but would instead simply leave a comment on the pull request with any diffs. That way it's hard to miss, but at the same time isn't treated as a failure per se. That said, it seemed it would involve a fair bit more effort to go down this route.
Correct, I was thinking we'd make it a separate job. Since we can toggle required/optional per job in GitHub, I'm thinking we'd leave this particular job as optional and the rest of CI as required. And yes, it seems like a great candidate for parallelization too.
Required when not using singular "build" job
Necessary for "commands" support See: https://discuss.circleci.com/t/circleci-2-1-config-overview/26057
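As the linked thread explains, reusable `commands` require a `version: 2.1` configuration. A minimal, hypothetical sketch of that shape (the job, command, and script names here are illustrative assumptions, not this repository's actual config):

```yaml
# Hypothetical sketch: `commands` require CircleCI config version 2.1.
version: 2.1

commands:
  install-deps:         # reusable step sequence shared across jobs
    steps:
      - checkout
      - run: npm ci

jobs:
  test:
    docker:
      - image: cimg/node:lts
    steps:
      - install-deps
      - run: npm test
  visual-regression:    # long-running; can be left optional in GitHub
    docker:
      - image: cimg/node:lts-browsers
    steps:
      - install-deps
      - run: npm run test:visual
      - store_artifacts:
          path: screenshots/

workflows:
  build:
    jobs:
      - test
      - visual-regression
```

Splitting the jobs this way is what lets GitHub treat the visual-regression check as optional while keeping the rest of CI required.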
Style guide not (currently) using Yarn
I've updated the CircleCI configuration to break it into a few independent jobs, in general alignment with the IDP (9397ea2, 1274cc2, 49d1a19, 3b90a27). I also added some documentation in 0bbb5a3 describing what I assume to be the expected workflow. I don't have access to the repository settings, so I can't configure this check as optional. @zachmargolis Would you be able to take care of this? It might also be something which can only be done after merge, to allow the new checks to take effect.
The new workflow will be to bypass a failed result of the visual regression status check as implied acknowledgment of the changes.
New checks default to optional, so we're all good here 👍
Why: As a contributor to Login Design System, I want to compare the visual impact of my changes on documented components prior to merge, so that I can have increased confidence that my changes will not introduce unexpected regressions.
Notably, this is being considered to help facilitate an upgrade of USWDS to the latest version, which may have unanticipated cascading effects on Login Design System.
Implementation Notes:
This builds upon existing testing infrastructure, leveraging support for Puppeteer, Jest, and CircleCI artifacts (source, documentation) to generate and store screenshot differences between the local working copy and the live documentation site. The missing piece introduced here is an image comparison library, for which pixelmatch was selected as the best candidate.
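To illustrate what the comparison step involves, here is a minimal, dependency-free sketch of a per-pixel RGBA diff in the spirit of pixelmatch. This is a simplification, not pixelmatch's actual implementation: the real library adds anti-aliasing detection and perceptual (YIQ) color distance, and `diffPixels` is a hypothetical name.

```javascript
// Simplified sketch of a per-pixel diff, in the spirit of pixelmatch.
// img1, img2, and output are flat RGBA buffers (4 bytes per pixel).
// Returns the number of mismatched pixels and paints them red in output.
function diffPixels(img1, img2, output, width, height, threshold = 0.1) {
  if (img1.length !== img2.length) {
    throw new Error('Image buffers must have the same dimensions');
  }
  const maxDelta = 255 * threshold;
  let mismatched = 0;
  for (let i = 0; i < width * height; i++) {
    const pos = i * 4;
    // Largest channel difference across R, G, B (alpha ignored here)
    const delta = Math.max(
      Math.abs(img1[pos] - img2[pos]),
      Math.abs(img1[pos + 1] - img2[pos + 1]),
      Math.abs(img1[pos + 2] - img2[pos + 2])
    );
    if (delta > maxDelta) {
      mismatched++;
      // Mark the differing pixel red in the diff image
      output[pos] = 255;
      output[pos + 1] = 0;
      output[pos + 2] = 0;
      output[pos + 3] = 255;
    } else {
      // Copy the original pixel, dimmed, so the diff keeps visual context
      output[pos] = img1[pos];
      output[pos + 1] = img1[pos + 1];
      output[pos + 2] = img1[pos + 2];
      output[pos + 3] = 64;
    }
  }
  return mismatched;
}
```

In the real setup, the two buffers would come from decoded PNG screenshots (e.g. Puppeteer's `page.screenshot()` of the local build versus the deployed site), and the diff image would be uploaded as a CircleCI artifact.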
Alternatives Considered:
jest-image-snapshot: A good candidate that fits well into the model of snapshot testing. However, snapshots require that the files be stored within the repository, which for binary files like images could inflate the download size of the repository over time. Since CircleCI artifact storage is already configured, it seemed preferable to reuse it for image snapshots. That said, jest-image-snapshot did inspire the use of the same pixelmatch library.

To do:
Workflow documentation, pending feedback. Especially...