
Failing unit test #196

Closed
titanabrian opened this issue Dec 4, 2020 · 3 comments · Fixed by #283

Comments


titanabrian commented Dec 4, 2020

I tried to clone the project and run the tests locally, but several tests are failing.
I'm using macOS with the zsh command line.

[Two screenshots of the failing test output]

speedytwenty (Collaborator) commented:

Spent some time debugging this, and I was able to trace it to a difference in the versions of the "colors" module used by cli-table and cli-table3.

It's a bit tricky to debug for a couple reasons:

  1. The failing tests are tests that run against both cli-table and cli-table3, and they fail with cli-table and NOT with cli-table3.
  2. The failing tests are specifically testing colors, and while jest tries to be helpful by highlighting the failures, it's difficult to actually see what's going on: judging by visual appearance alone, both values look identical.

If you run npm list colors, you will see something like:

├─┬ cli-table@0.3.6
│ └── colors@1.0.3
└── colors@1.4.0

When I run npm i colors@1.0.3 && npm test, the tests that use the colors module to build their expected values pass, while the tests that have hard-coded color codes in their expected values fail.

So colors is likely producing two different outputs depending on the version, but the actual difference is invisible in the rendered output.
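One way to make the difference visible is to print the raw escape sequences instead of letting the terminal render them. A minimal sketch, assuming a local install of colors (the styled string is illustrative):

```js
// inspect-colors.js — print the raw ANSI escape codes produced by the
// installed colors version, so byte-level differences become visible.
const colors = require('colors/safe');

const styled = colors.red('foo');
console.log(JSON.stringify(styled)); // e.g. "\u001b[31mfoo\u001b[39m"
```

Running this against colors@1.0.3 and colors@1.4.0 in turn should show whether the escape sequences actually differ between the two versions.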

This failing test does not indicate a functional bug in cli-table3 (nor in cli-table). I'm not exactly sure of the best route to resolving this and getting the tests passing again.

I tend to avoid modifying tests as much as possible, but it might make sense to mock colors and test that it was called as expected rather than comparing the output. Alternatively, it might make sense to remove the tests against cli-table altogether, since they test external code that cli-table sufficiently covers in its own tests.
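A rough sketch of that mocking approach with Jest (the module path and style names are assumptions for illustration, not the actual cli-table3 test code):

```js
// table-colors.test.js — replace the colors module with deterministic
// markers so expectations no longer depend on the installed colors version.
jest.mock('colors/safe', () => ({
  red: jest.fn((str) => `<red>${str}</red>`),
}));

const colors = require('colors/safe');

test('content is passed through colors.red', () => {
  // Stand-in for the code under test, which would style a table cell:
  const styled = colors.red('hello');

  expect(colors.red).toHaveBeenCalledWith('hello');
  expect(styled).toBe('<red>hello</red>'); // deterministic across versions
});
```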

speedytwenty (Collaborator) commented May 4, 2021

After a closer look, and considering that the CI builds are passing, I discovered that the tests pass when run with the "--runInBand" flag. I think this must result in the later version of colors being used throughout the test run, but I can't tell for sure.

Try npm test -- --runInBand OR yarn test --runInBand
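If that workaround were adopted, the flag could also be baked into the test script; a sketch of the package.json change, assuming the project's test script invokes jest directly:

```json
{
  "scripts": {
    "test": "jest --runInBand"
  }
}
```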

speedytwenty added a commit to speedytwenty/cli-table3 that referenced this issue Mar 24, 2022
speedytwenty added a commit to speedytwenty/cli-table3 that referenced this issue Mar 29, 2022
speedytwenty (Collaborator) commented:

[Screenshot of the failing test output]

The failing tests are caused by differences in the output produced by the (obscure) "colors" module(s). Yet somehow, they didn't fail before on Travis, nor do they now on GitHub Actions. But if you download master before this is resolved and run yarn install && yarn test, the above test will fail.

To be clear, this existed before cli-table3 moved to @colors/colors... and all that fiasco. Any actual difference between the colors output expected by the cli-table3 "legacy" tests and the output produced while testing cli-table will cause the tests to fail.

I attempted a solution for this in #196 and realized it was inadequate: despite preventing the failures, the --runInBand flag may have unobserved or superfluous consequences.

The appropriate solution will be to mock the color module(s) within the implicated tests. But from there, the question becomes: which version(s) of cli-table is cli-table3 attempting to support?
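One way to answer that empirically is to install the candidate colors versions side by side using npm aliases and diff their raw output. A sketch, with hypothetical alias names (npm aliases require npm 6.9+):

```js
// compare-colors.js — first install the two versions under hypothetical
// aliases:
//   npm i -D colors-legacy@npm:colors@1.0.3 colors-modern@npm:colors@1.4.0
const legacy = require('colors-legacy/safe');
const modern = require('colors-modern/safe');

for (const style of ['red', 'bgBlack', 'bold']) {
  const a = JSON.stringify(legacy[style]('x'));
  const b = JSON.stringify(modern[style]('x'));
  console.log(style, a === b ? 'identical' : `differ: ${a} vs ${b}`);
}
```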

speedytwenty added a commit to speedytwenty/cli-table3 that referenced this issue Mar 30, 2022