
Spike: can we improve the failure summary / visibility of the test suite #1154

Open
joe-kimmel-vmw opened this issue Jul 14, 2023 · 1 comment
Labels
help wanted (Need some extra hands to get this done.), status/ready, type/chore

Comments

@joe-kimmel-vmw
Contributor

Description

Lifecycle has so many tests that the slevine/spec test runner output is hard to read.

There's an entire secondary tool dedicated to helping you parse the test output: https://github.com/aemengo/gswt , but it doesn't seem to work out of the box for most people, which raises the bar for casual / "drive-by" contributors. At the same time, the fact that it exists at all is a clear signal that there's room for improvement in the output.

Right now one of the best ways to find errors, e.g. in the GitHub test output or on your own terminal, is to successively search for "Failed: 1", "Failed: 2", "Failed: 3", ... until you find which tests failed.

Proposed solution

This isn't a proposal yet. This issue is a placeholder for the work of testing whether there are any 'quick wins' we can make.

One thought would be to summarize at the bottom, e.g. by capturing and then listing the names (and files?) of the first N tests that failed. Would we include the full output and failure messages of those tests, or just their names? What's a good value of N? Is this even possible?

Another approach might be to suppress passing tests so that the failures are the only thing that prints (via a flag? by default?).

There are probably other ideas and approaches as well - this issue is not intended to prescribe a solution, but rather to suggest that exploration could be helpful.

@dlion
Member

dlion commented Jul 25, 2023

Some food for thought:
I tried this tool: https://github.com/vakenbolt/go-test-report
and it works quite well. Here's an example of running our unit tests in a failure scenario; the HTML output looks like this:
image

And clicking on the red boxes:
image
And clicking into the details of each individual test:
image

The integration is quite simple. I just modified my local Makefile from this:

$(GOTEST) $(GOTESTFLAGS) -v -count=1 $(UNIT_PACKAGES)

to this:

$(GOTEST) $(GOTESTFLAGS) -v -count=1 -json $(UNIT_PACKAGES) | go-test-report

Same thing for the acceptance tests:

$(GOTEST) -v -count=1 -tags=acceptance -json -timeout=$(ACCEPTANCE_TIMEOUT) ./acceptance/... | go-test-report

It generates a test_report.html file.
Since it's a static webpage, we could even serve it through our CI so we always have a nice view of the test status.

@natalieparellano natalieparellano added the help wanted label Jul 2, 2024