
Measure code coverage in CI and use as a merge gate #1049

Open
bleggett opened this issue May 11, 2024 · 7 comments
@bleggett (Contributor) commented May 11, 2024

Code coverage for ztunnel was audited at a point-in-time: #16

but we don't have a good ongoing sense of which chunks have poor unit test coverage, or of where adding more cheap unit tests would bring the most value. In general, we want to encourage more cheap, easy-to-debug/maintain unit tests, and fewer expensive, hard-to-debug/maintain integ/e2e tests.

We should generate code coverage stats (per branch/per func) as part of PR CI, and compare it with the current high-watermark in mainline, and warn/fail the CI check if coverage drops below the current high-watermark.

(usual caveat - code coverage is not by itself proof of good or useful tests, but it is the easiest signal to automate, and hard to casually cheat)
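The gate logic itself could be trivial. A sketch of the comparison step only (function name and values are illustrative, not an existing ztunnel script):

```shell
# Sketch of the proposed merge gate (illustrative names/values):
# compare coverage measured in the PR against the high-watermark
# committed in mainline, and fail if coverage regressed.
check_coverage() {
    measured="$1"   # integer percent measured by the coverage tool in CI
    watermark="$2"  # integer percent stored in mainline (the high-watermark)
    if [ "$measured" -lt "$watermark" ]; then
        echo "FAIL: coverage ${measured}% dropped below high-watermark ${watermark}%"
        return 1
    fi
    echo "OK: coverage ${measured}% meets high-watermark ${watermark}%"
}
```

Storing the watermark in-repo means bumping it up or down requires a visible, reviewed change.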

@daixiang0 (Member)

Like codecov?

@keithmattix (Contributor) commented May 13, 2024

+1 - I would like to see something like this across Istio tbh

@howardjohn (Member)

My 2c:

  • -1 on a CI failure based on code coverage
  • No on codecov.io due to security concerns
  • +1 to making it easy, as a developer, to tell the coverage of the code at a point in time

@bleggett (Contributor, Author) commented May 13, 2024

My 2c:

* -1 on a CI failure based on code coverage

Reasoning? No CI check == license to ignore, most of the time, practically speaking. A hard check would probably be less noisy and far more practically useful than the current "go test nag".

Note that I'm not saying that the coverage watermark % couldn't be stored in a makefile and bumped up or down in a PR, but at least that way people have to make a case for it and it has to be visible/approved - we know the coverage has gone up or down, and there's a public record of why.

* No on codecov.io due to security concerns

👍

When developing the inpod CNI internally we used overcover, which is a very basic/simple tool that just parses go test coverage reports. It worked well enough and doesn't require any magic. We can always dump the coverage report as a CI artifact if people want to see it; it doesn't need to be fancy.

overcover --coverprofile cover.out ./... --threshold $(COVERAGE_THRESH_PCT)

tl;dr: I really don't want to use anything that can't work locally AND as a simple makefile target.

* +1 to making it easy, as a developer, to tell the coverage of the code at a point in time

We do need that, but wiring it into CI is usually simpler and more consistent than relying on the honor system in daily practice. If it's something a reviewer might ask a PR author to copy-paste, we should just check it implicitly in CI.

@keithmattix (Contributor)

At one point we had coverage on testgrid; can we just do that again as a first step?

@bleggett (Contributor, Author) commented May 14, 2024

rustc has code coverage support. I haven't used it, but I would be fine with using it to add coverage to test runs and dumping the raw coverage report as a CI artifact for now.
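For reference, rustc's built-in source-based coverage (documented in the rustc book under -C instrument-coverage) can be driven roughly like this. This is a sketch, not a tested CI job; the file names are illustrative and <test-binary> is a placeholder for the compiled test executable:

```shell
# Sketch based on rustc's documented -C instrument-coverage support.
# Build and run tests with coverage instrumentation enabled:
export RUSTFLAGS="-C instrument-coverage"
export LLVM_PROFILE_FILE="ztunnel-%p-%m.profraw"
cargo test

# Merge the raw profiles and print a per-function/per-line report.
llvm-profdata merge -sparse ztunnel-*.profraw -o ztunnel.profdata
llvm-cov report --instr-profile=ztunnel.profdata <test-binary>
```

The resulting report (or the merged .profdata) could simply be uploaded as a CI artifact, per the above.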

@Stevenjin8 Stevenjin8 self-assigned this Aug 23, 2024
@istio-policy-bot istio-policy-bot added the lifecycle/stale Indicates a PR or issue hasn't been manipulated by an Istio team member for a while label Aug 23, 2024
@Stevenjin8 (Contributor)

no stale

@istio-policy-bot istio-policy-bot removed the lifecycle/stale Indicates a PR or issue hasn't been manipulated by an Istio team member for a while label Aug 23, 2024