Update Test Plan in KEP template #3279

Conversation
keps/NNNN-kep-template/README.md (outdated)

@@ -452,6 +452,36 @@ You can take a look at one potential example of such test in:
https://github.com/kubernetes/kubernetes/pull/97058/files#diff-7826f7adbc1996a05ab52e3f5f02429e94b68ce6bce0dc534d1be636154fded3R246-R282
-->

### Testing Quality
I'm actually not 100% convinced this should be in PRR as opposed to another section,
but I wanted to make a starting proposal to kick off the discussion.
yeah, seems like it belongs in ### Test Plan. Can we ask more pointed questions there?
What do you think about questions below?
As an Enhancements Owner, I agree that this belongs in the Test Plan section, with the goal of asking more clarifying questions as opposed to the existing light guidance that is there now. For instance, in that section most KEPs just reply with something along the lines of:
- add unit and e2e tests
And that's it. So we're mostly starting from ground zero on setting expectations for what tests should include and the scope of changes.
So that won't merge too fast :)
(force-pushed from f230107 to 76307b7)
Some initial thoughts, mostly agreeing that this goes into the Test Plan (I was going to propose changes there, so you beat me to it!).
As such, let's try to be more specific in our expectations for what test coverage and passage rates should look like when moving from alpha->beta and beta->GA.
Let's also try to get clarity on the shape of the testing needed. Does this require a new package to adequately test it? Is this just added to an existing suite?
Re: existing tests, we need to balance between raising awareness of the current state of a SIG's tests vs. putting the onus on a specific author to fix it. Asking about the current rate raises awareness (and I agree with having that memorialized in the KEP), but we will need a bigger policy as to whether that should block individual KEPs from merging. This almost seems like a pre-release audit of testing rates and a discussion with the SIGs/leads as to whether or not they have plans to remediate (and what that timeline is).
keps/NNNN-kep-template/README.md (outdated)

We need to ensure that all existing areas of the code that will be touched by
this enhancement are adequately tested to mitigate the risk of regressions.
-->
I would add a question re: e2e specifically asking if an existing test package already exists where tests would be added vs a new test package needs to be created. Along with clarifying whether the sig has adequate support/capacity to create and maintain these new tests.
Will add.
keps/NNNN-kep-template/README.md (outdated)

So even if those hit Alpha in such state, it won't be possible to target Beta
graduation (and thus by-default enablement) until the testing is sufficient.
-->
Additionally, as one moves from Beta to GA what are our test expectations? Above you mention 80% for unit tests, what do we expect to see for e2e? In an ideal world, what would e2e on a feature look like to allow it to be targeted to GA? Let's be explicit here.
I'd also like to see (as the KEP moves through stages) passage rates for the specific implemented tests (as opposed to the general passage rates) so that we can identify flaky tests.
In an ideal world, what would e2e on a feature look like to allow it to be targeted to GA? Let's be explicit here.
I don't think it's possible to describe it generically here.
Or are you asking just for adding a question about this? If so, that makes sense.
I'd also like to see (as the KEP moves through stages) passage rates for the specific implemented tests (as opposed to the general passage rates) so that we can identify flaky tests.
Good point - let me add that.
We definitely want the people who are proposing changes to be aware of the health of the area they are proposing changing. I'm on the fence about how much detail belongs in their KEP, since it sort of denormalizes component health info into active proposals, and is likely to go stale fast. "Is area X healthy enough to accept changes?" seems like info that belongs more at the component/subproject/sig level, and is the responsibility of the maintainers of that area to surface, and is a prereq for KEPs in that area.
"the relevant area is not healthy enough to accept the changes proposed in this KEP" is a completely legitimate reason to not accept otherwise acceptable changes. I expect approvers to be making those judgement calls already, and communicating clearly when that is an issue impacting an otherwise acceptable KEP.
keps/NNNN-kep-template/README.md (outdated)

This section must be completed when targeting alpha to a release.
-->

###### What is the current (unit) test coverage of packages that will be touched when implementing this KEP?
this seems hard to answer ahead of implementation, and likely to become stale quickly
Instead of these few questions, how about:
[ ] I understand the owners of the involved components may require updating existing tests to make this code solid enough to build on top of (everyone has to check yes)
and
[ ] statement from existing owners regarding pre-existing component test health (to be supplied by KEP reviewers)
this seems hard to answer ahead of implementation, and likely to become stale quickly
Thinking about it, I will tweak it to the following:
- this part will be required only for Beta/GA and will have to report the health of the packages that were touched - this means that you won't need to predict anything (but rather report), and it will serve as a double check that both authors and reviewers did a reasonable job.
Instead of these few questions, how about:
I'm not against adding those, but I don't believe they are enough. We need some mechanism to track this too.
The only thing I feel strongly about is to give the area owners the chance / responsibility to state the testing deficiencies of the existing code. I think having some human judgment in there is going to be better than picking a coverage target, for example. We can suggest a coverage target if the owners don't have a better idea about the deficiencies.
Which I mentioned in the template explicitly too.
As an approver you will be able to say that 20% coverage is fine for you if you want.
But the first thing to achieve is to improve visibility and awareness - many people don't realize how bad our coverage is in some areas.
Agree. Perhaps we can surface this as a pre-release testing survey (that I just made up) during the period between the end of one release and the beginning of another? The idea would be to tease out whether things are going well, or whether there are isolated portions of tests causing issues (some KEPs may go in that don't affect those) or an overall shaky test suite (perhaps resulting in a decision to not merge lower-priority KEPs / minimize features going in / being promoted / etc.).
keps/NNNN-kep-template/README.md (outdated)

<!--
This question should be filled when targeting Beta release.
The goal is to ensure that we don't accept enhancements with inadequate testing.
So even if those hit Alpha in such state, it won't be possible to target Beta
IMO everything should always be tested. alpha/beta/GA doesn't matter for this requirement.
(As a note, all KEPs are required to have a test plan regardless of stage, so +1 on lavalamps point above.)
IMO everything should always be tested. alpha/beta/GA doesn't matter for this requirement.
I agree with the principle. But the principle was always there, and it didn't work very well in practice.
So I would like to have a way to ensure that we won't proceed to Beta if shortcuts were made in Alpha.
If the principle is clearly there already, and it's not being followed, then there's likely no modification to this template that will fix that, since the problem is enforcement and not the instructions.
If the principle is clearly there already, and it's not being followed, then there's likely no modification to this template that will fix that, since the problem is enforcement and not the instructions.
I don't fully agree, because I can also imagine this being something not fully conscious.
If you're forced to fill in (or review) the template here for Beta graduation, it will force you to think about this.
(force-pushed from 53b44a1 to 0e31c99)
@liggitt @lavalamp @kikisdeliveryservice - thanks for the feedback - PTAL

- <package>: <current test coverage>

The data can be easily read from:
https://testgrid.k8s.io/sig-testing-canaries#ci-kubernetes-coverage-unit
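Besides reading testgrid, the per-package numbers can also be produced locally. As a rough sketch (the package names and percentages below are made-up placeholders, and it assumes the usual `go test -cover` output format), one could convert coverage output into the list form the template asks for:

```shell
# Sketch: turning `go test -cover` style output into "- <package>: <coverage>"
# lines. The sample below is hard-coded illustrative data, not real measurements;
# in a real checkout you would pipe `go test -cover ./pkg/...` instead.
sample_output='ok   k8s.io/kubernetes/pkg/scheduler   4.21s   coverage: 78.1% of statements
ok   k8s.io/kubernetes/pkg/kubelet   9.87s   coverage: 64.5% of statements'

printf '%s\n' "$sample_output" | awk '/coverage:/ { print "- " $2 ": " $5 }'
# Prints:
# - k8s.io/kubernetes/pkg/scheduler: 78.1%
# - k8s.io/kubernetes/pkg/kubelet: 64.5%
```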
note: we should work on automating this, I think you've already opened an issue about it
Yes - I already opened an issue for that. But at least for some time, we won't enforce specific targets, because our coverage is still way too low in multiple places.
+1
I left some suggestions for clarity but think this provides nice clarification and further guidance for the KEP Test Plan section.
keps/NNNN-kep-template/README.md (outdated)

@@ -270,6 +266,55 @@ when drafting this test plan.
[testing-guidelines]: https://git.k8s.io/community/contributors/devel/sig-testing/testing.md
-->

[ ] I/we understand the owners of the involved components may require updating
Suggested change:
[ ] I/we understand the owners of the involved components may require updating
[ ] I/we understand, as owners, that the involved components may require updates to
I didn't fully apply that change because I think it wasn't reflecting what I wanted.
I wanted to say that "component OWNERS" (say, approvers) have a right to request additional tests before proceeding. Does that make sense?
keps/NNNN-kep-template/README.md (outdated)

@@ -270,6 +266,55 @@ when drafting this test plan.
[testing-guidelines]: https://git.k8s.io/community/contributors/devel/sig-testing/testing.md
-->

[ ] I/we understand the owners of the involved components may require updating
existing tests to make this code solid enough prior committing changes necessary
Suggested change:
existing tests to make this code solid enough prior committing changes necessary
existing tests to make this code solid enough prior to committing the changes necessary
done
keps/NNNN-kep-template/README.md (outdated)

<!--
In principle every added code should be unit tested. However, the exact tests are hard
to answer ahead of implementation. As a result, this section should only be filled when
targeting Beta to ensure that something wasn't missed during Alpha implementation.
Do we want any updates to this for GA or just beta?
It should be a no-op for GA, but it doesn't hurt to add it - added.
@mrbobbytables @jeremyrickard PTAL. I believe this provides some clarifications as to what we expect to see in the Test Plan (which often has thin details). The changes focus on stability, shouldn't be burdensome, and will provide the author and SIG with better insight into the stability of the features and components.
I think this might be a tad heavy-handed, but we should try it and see if it works. /lgtm
keps/NNNN-kep-template/README.md (outdated)

Talking about individual tests would be an overkill, so we just require to list the
packages that were touched during the implementation together with their current test
coverage in the form:
- <package>: <current test coverage>
in the KEP + beta stage still seems like the wrong time/place for this information to me; it denormalizes information that will be instantly stale into KEPs after changes were already accepted into the tree
the "I acknowledge improvements to existing code may be required" bit above seems like a good mental prompt to me
for unit testing, I would reframe this as "all new code is expected to have complete unit test coverage. if that is not possible, explain why here and explain why that is acceptable"
I'm definitely happy to add what you suggested above.
But I don't really agree that this is the wrong time/place for the coverage information.
If you're worried about staleness - let me change that to:
<package>: <date> - <coverage>
This will by definition not go stale. And having the coverage information here will expose it to people just reading KEPs, who aren't watching individual PRs to see how it goes.
My main reason for ensuring it's put here is that KEP approvers are in many cases not approvers for the code itself (or at least not all of the code). So putting it here verbatim will put it clearly in front of their eyes.
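The date-stamped form proposed above could even be produced mechanically. A minimal sketch (the helper name, package path, and coverage figure are all hypothetical, not from the PR):

```shell
# Hypothetical helper emitting a "<package>: <date> - <coverage>" line in the
# date-stamped form discussed above. The package path and coverage value are
# illustrative placeholders, not real measurements.
format_coverage_line() {
  pkg="$1"
  cov="$2"
  printf '%s: %s - %s\n' "$pkg" "$(date +%Y-%m-%d)" "$cov"
}

format_coverage_line "k8s.io/kubernetes/pkg/scheduler" "78.1%"
# e.g. k8s.io/kubernetes/pkg/scheduler: 2022-05-13 - 78.1%
```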
+1 here. I think the stats here are going to be stale too quickly and there isn't going to be a lot of value here for alpha/beta KEP states.
to be clear about the timing aspect... I think pre-alpha/alpha is exactly the right time to be asking this question, since that's when code starts merging and maybe breaking existing undertested stuff. I just think asking the KEP author to collate it into the KEP isn't a great mechanism.
to be clear about the timing aspect... I think pre-alpha/alpha is exactly the right time to be asking this question, since that's when code starts merging and maybe breaking existing undertested stuff
OK - so I'm all for asking that for Alpha (if you remember, that was actually my initial proposal).
But your counterargument is that the exact set of packages will sometimes be hard to predict.
So before Beta is the compromise because:
(a) a significant part of the implementation is already done at this point
(b) it still isn't enabled by default, which means that the enhancement owners still have a big motivation to move it to Beta
I discussed this with Jordan offline and updated to some in-between point.
I think the updated unit test bullet is a reasonable level of info to include
aside from the request for package unit test coverage stats, these additions lgtm
Thanks for all the thoughtful comments and reviews on this, everyone! This generally looks good to me, although I agree with @liggitt's point above regarding package-level unit test coverage stats. I think we should amend the content there, but maybe it's sufficient to just say the stats should be filled out for promotion to stable?
(force-pushed from edc5d69 to a113d0c, then to a86942e)
<!--
Based on reviewers feedback describe what additional tests need to be added prior
implementing this enhancement to ensure the enhancements have also solid foundations.
nit: prior to implementing
/lgtm
The updated wording seems good, and the discussion above seems good to me. /approve
Agree with the updates since the last time I reviewed. After this merges, I'll send out an email to k-dev/sig leads to note the Test Plan clarifications/changes. We can also let the new RT (once it forms) know about the change and ask them to keep an eye on the enhancements as they review them. I think we'll definitely get more robust Test Plans thanks to this PR :) /approve
[APPROVALNOTIFIER] This PR is APPROVED This pull-request has been approved by: jeremyrickard, kikisdeliveryservice, wojtek-t The full list of commands accepted by this bot can be found here. The pull request process is described here
Thank you all for all the comments - I really hope this will help with our testing quality. I'm going to cancel the hold to let it merge, to ensure it's in before the 1.25 release starts. /hold cancel
Some feedback from filling in these sections for #3310 (comment)
Ref #3138
There is a lot of context for this change across the whole #3139
For PRR
/assign @johnbelamaric @deads2k @ehashman
People involved in the discussion
/assign @liggitt @lavalamp @aojea @jberkus
@kubernetes/enhancements