add code coverage support #52
base: master
Conversation
I must add that currently rules_kotlin itself is not expected to pass code coverage, i.e. "cd rules_kotlin && bazel coverage //:all_tests" does not work. The reasons are:
However, I added some integration tests based on https://github.com/bazelbuild/bazel-integration-testing that check the generated LCOV for a simple case. There are probably many edge cases I have not considered, and things regarding, e.g., baseline coverage, that need to be taken into account, but this is what I have right now.
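For context, LCOV tracefiles are plain text, with per-file records delimited by `SF:` and `end_of_record` markers and per-line hit counts on `DA:` entries. A minimal sketch of parsing such a record (this is an illustration of the format, not the actual test code from this PR; the file name and counts are made up):

```python
def parse_lcov(text):
    """Parse LCOV tracefile text into {source_file: {line_number: hit_count}}."""
    records = {}
    current = None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("SF:"):      # start of a per-file record
            current = line[3:]
            records[current] = {}
        elif line.startswith("DA:"):    # DA:<line>,<hits>[,<checksum>]
            lineno, hits = line[3:].split(",")[:2]
            records[current][int(lineno)] = int(hits)
        elif line == "end_of_record":   # closes the current SF: record
            current = None
    return records

sample = """SF:src/main/kotlin/Foo.kt
DA:1,1
DA:2,0
LH:1
LF:2
end_of_record"""

print(parse_lcov(sample))  # {'src/main/kotlin/Foo.kt': {1: 1, 2: 0}}
```

An integration test along these lines can assert that the expected source files appear and that specific lines were executed, without depending on the exact order of `DA:` entries.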
@hchauvin great to see someone else getting into the codebase. I need help maintaining this repo and designing advanced features. I assume the end goal of this PR is to test the builder layer directly? Currently the builder layer is tested indirectly by passing positive test cases through it and validating the output. A motivating use case for testing the builder directly, from the stuff I have been working on, might be #40. Currently #40 is implemented in a
A shortcoming of the approach I am currently taking is that we can't pass through negative test cases -- but negative cases aren't important enough currently. Can you test what you need to test without using
Hi, thank you for the interest!

So the original purpose was to add code coverage support (via bazel coverage), and I used bazel_integration_testing to check that the LCOV data files are generated as expected. I could not find any open-source example of how third-party rules can plug into the code coverage system, nor any example of how to "meta-test" a Bazel rule, so this PR has been designed, I concede, in a rather experimental way.

bazel_integration_testing has drawbacks, but I am able to run the whole integration test suite in under two minutes. Granted, I achieved that by using some tricks, but nothing out of the ordinary. Unless I use bazel_integration_testing, the only way to find out whether the LCOV files are generated to spec is to invoke bazel from a bash script, outside the usual invocation of tests with 'bazel test //...'. This can only be done if the CI system allows it, and currently, unless I am mistaken, your Buildkite CI system does not. If I can use a bash script instead, I can greatly simplify this step and save bazel_integration_testing for another PR, if that is still something you are willing to consider.

And yes, I haven't looked at the details, but PR #40 can definitely reuse a lot of the logic developed here for its e2e testing. It could be as simple as adding a new method to CoverageTest (and renaming the whole test class to something like EndToEndTest?).
@hsyed I hope it's not too much of a derail, but when you say that bazel-integration-testing is really slow, what exactly do you mean? We're using it in a few places and are constantly looking for feedback to improve it (if it's too much of a derail we can continue this in an issue over in bazel-integration-testing). Thanks!
Hi, a quick update since I got curious and did some profiling.

On an Ubuntu box I got the integration test to pass in 31s, download time included (way faster than on my Mac, but my setup is typical of CI environments, so 31s is the actual testing time to expect in CI). Profiling shows that ~10s are spent downloading com_github_jetbrains_kotlin. That's definitely an area where things can be improved, and there are ways to pay the 10s only once, not every time the test is run (bazelbuild/bazel-integration-testing#63). Then there is some time spent initializing the toolchains: on Ubuntu, for java+cc, that's around 4-5s max; on Mac, around 10-15s, mainly because of the Xcode toolchain. That is not an area where things are easily improved. The remaining ~15s does not seem out of the ordinary to me, considering everything has to be compiled with code coverage before the tests are run.
So I removed bazel-integration-testing and split the commit for clarity.
@hchauvin cool! It's a good idea, really. At the moment it's only 30 seconds because the workspace contains no dependencies. This repo currently has barely any external deps; the Kotlin compiler repo is only ~30 MB. When/if we add tests that include the IntelliJ Bazel plugin repo (400 MB to download the IntelliJ platform) and Android examples, things really begin to add up.
@ittaiz the bazel-in-bazel aspect is extremely slow for a local development workflow: everything needs to be fetched and unpacked, etc.
Can you elaborate what you mean by everything? The Bazel binaries themselves, unpacking of them, dependencies in the scratch workspace? Asking so we can focus effort on shared pain.
One development concerning JaCoCo is worth mentioning here: 0.8.0, released on January 3rd, introduced filtering options. One contribution to follow in particular relates to filtering out methods generated by Kotlin, such as getters/setters: jacoco/jacoco#681. This makes coverage reports for Kotlin even more relevant.
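To illustrate why such filtering matters, here is a toy sketch (the method names and numbers are hypothetical, and this is not how JaCoCo implements its filters): kotlinc generates methods such as getters/setters and componentN for a `data class`, and since tests rarely call them directly, they inflate the denominator of the coverage ratio.

```python
# Toy method-level coverage for a hypothetical Kotlin data class.
methods = {
    "doWork": True,       # exercised by a test
    "getName": False,     # generated getter, never called directly
    "component1": False,  # generated by `data class`
}

# Crude name-based heuristic for this illustration only; JaCoCo's
# real filters work on bytecode, not on name prefixes.
GENERATED_PREFIXES = ("get", "set", "component")

def coverage_ratio(methods, filter_generated=False):
    kept = {name: covered for name, covered in methods.items()
            if not (filter_generated and name.startswith(GENERATED_PREFIXES))}
    return sum(kept.values()) / len(kept)

print(coverage_ratio(methods))                         # ~0.33 unfiltered
print(coverage_ratio(methods, filter_generated=True))  # 1.0 with filtering
```

The same test suite reports very different numbers depending on whether generated members are counted, which is why the filtering work makes Kotlin coverage reports more meaningful.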
@hchauvin master just had a huge refactor. Most of it is directly relevant to the work you have done. You can now test the builder directly -- see
@hchauvin Also the deps are now managed via bazel-deps.
Thanks for the heads-up, your changes are very cool! I'm rewriting my patch.
@hsyed So I completely reworked the PR and got rid of the bash integration test by getting a bit into the logic of the JacocoCoverageRunner.
@hchauvin sorry for the delay. I've rebased the changes in the branch. When I was merging the code I saw instrumentation matching .kt files in Skylark; it should also match Java files, I think, because mixed-mode compilation is possible? I think we need to pick up these deps from upstream, ideally. Have a look at this. I think instrumentation will end up being another action that is runnable after preprocessing is done.
Any news on that topic? It looks like most of the work is done? |
Builds on previous work (bazelbuild#52) to add coverage support to Kotlin targets in Bazel. So far, it's working (in basic form), but seems to omit major swaths of our code when I test it on a live codebase. I'm not sure why that happens yet.
Add an offline instrumentation step with Jacoco to the Kotlin builder,
complete with some changes related to hermeticity (allowing end-to-end
testing). Where code coverage is concerned, the behavior of the native
Java rules is copied as closely as possible.