
Update Test Plan in KEP template #3279

Merged
merged 1 commit into kubernetes:master on Apr 27, 2022

Conversation

wojtek-t
Member

Ref #3138

There is a lot of context for this change across the whole #3139

For PRR
/assign @johnbelamaric @deads2k @ehashman

People involved in the discussion
/assign @liggitt @lavalamp @aojea @jberkus

@kubernetes/enhancements

@k8s-ci-robot k8s-ci-robot added cncf-cla: yes Indicates the PR's author has signed the CNCF CLA. size/M Denotes a PR that changes 30-99 lines, ignoring generated files. kind/kep Categorizes KEP tracking issues and PRs modifying the KEP directory labels Apr 14, 2022
@@ -452,6 +452,36 @@ You can take a look at one potential example of such test in:
https://github.com/kubernetes/kubernetes/pull/97058/files#diff-7826f7adbc1996a05ab52e3f5f02429e94b68ce6bce0dc534d1be636154fded3R246-R282
-->

### Testing Quality
Member Author

I'm actually not 100% convinced this should be in PRR as opposed to another section.
But I wanted to make a starting proposal to kick off the discussion.

Member

yeah, seems like it belongs in ### Test Plan. Can we ask more pointed questions there?

Member Author

What do you think about questions below?

Member

@kikisdeliveryservice Apr 14, 2022

As an Enhancements Owner, I agree that this belongs in the Test Plan section with the goal of asking more clarifying questions as opposed to the existing light guidance that is there now. For instance, most people seem to just reply (in that section in KEPs) something along the lines of:

  • add unit and e2e tests

And that's it. So we're mostly starting from ground zero on setting expectations on what tests should include and scope of changes.

@wojtek-t
Member Author

So that won't merge too fast :)
/hold

@k8s-ci-robot k8s-ci-robot added the do-not-merge/hold Indicates that a PR should not merge because someone has issued a /hold command. label Apr 14, 2022
@wojtek-t wojtek-t force-pushed the better_testing_in_kep branch from f230107 to 76307b7 Compare April 14, 2022 13:54
Member

@kikisdeliveryservice left a comment

Some initial thoughts mostly agreeing that this goes into the Test Plan (I was going to propose changes there so you beat me to it!).

As such, let's try to be more specific in our expectations for what test coverage and passage rates should look like when moving from alpha -> beta and beta -> GA.

Let's also try to get clarity on the shape of the testing needed. Does this require a new package to adequately test it? Is this just added to an existing suite?

Re: existing tests, we need to balance between raising awareness of the current state of a SIG's tests vs. putting the onus on a specific author to fix it. Asking about the current rate raises awareness (and I agree with having that memorialized in the KEP), but we will need a bigger policy as to whether that should block individual KEPs from merging. This almost seems like a pre-release audit of testing rates and a discussion with the SIGs/leads as to whether or not they have plans to remediate (and what that timeline is).

keps/NNNN-kep-template/README.md (outdated review threads, resolved)
We need to ensure that all existing areas of the code that will be touched by
this enhancement are adequately tested to mitigate the risk of regressions.
-->


I would add a question re: e2e specifically asking if an existing test package already exists where tests would be added vs a new test package needs to be created. Along with clarifying whether the sig has adequate support/capacity to create and maintain these new tests.

Member Author

Will add.

So even if those hit Alpha in such state, it won't be possible to target Beta
graduation (and thus by-default enablement) until the testing is sufficient.
-->


Additionally, as one moves from Beta to GA what are our test expectations? Above you mention 80% for unit tests, what do we expect to see for e2e? In an ideal world, what would e2e on a feature look like to allow it to be targeted to GA? Let's be explicit here.


I'd also like to see (as the KEP moves through stages) passage rates for the specific implemented tests (as opposed to the general passage rates) so that we can identify flaky tests.

Member Author

In an ideal world, what would e2e on a feature look like to allow it to be targeted to GA? Let's be explicit here.

I don't think it's possible to describe it generically here.
Or are you asking just for adding a question about this? If so that makes sense.

I'd also like to see (as the KEP moves through stages) passage rates for the specific implemented tests (as opposed to the general passage rates) so that we can identify flaky tests.

Good point - let me add that.

@kikisdeliveryservice kikisdeliveryservice self-assigned this Apr 14, 2022
@liggitt
Member

liggitt commented Apr 14, 2022

Re: existing tests, we need to balance between raising awareness of the current state of a SIG's tests vs. putting the onus on a specific author to fix it. Asking about the current rate raises awareness (and I agree with having that memorialized in the KEP)

We definitely want the people who are proposing changes to be aware of the health of the area they are proposing changing. I'm on the fence about how much detail belongs in their KEP, since it sort of denormalizes component health info into active proposals, and is likely to go stale fast.

"Is area X healthy enough to accept changes?" seems like info that belongs more at the component/subproject/sig level, and is the responsibility of the maintainers of that area to surface, and is a prereq for KEPs in that area.

we will need to have a bigger policy as to whether that should block individual KEPs from merging.

"the relevant area is not healthy enough to accept the changes proposed in this KEP" is a completely legitimate reason to not accept otherwise acceptable changes. I expect approvers to be making those judgement calls already, and communicating clearly when that is an issue impacting an otherwise acceptable KEP.

This section must be completed when targeting alpha to a release.
-->

###### What is the current (unit) test coverage of packages that will be touched when implementing this KEP?
Member

this seems hard to answer ahead of implementation, and likely to become stale quickly

Member

Instead of these few questions, how about:

[ ] I understand the owners of the involved components may require updating existing tests to make this code solid enough to build on top of (everyone has to check yes)

and

[ ] statement from existing owners regarding pre-existing component test health (to be supplied by KEP reviewers)

Member Author

this seems hard to answer ahead of implementation, and likely to become stale quickly

Thinking about it, I will tweak to the following:

  • this part will be required only for Beta/GA and will have to report the health of packages that were touched - this means that you won't need to predict anything (but rather report), and it will double-check that both authors and reviewers did a reasonable job.

Instead of these few questions, how about:

I'm not against adding those, but I don't believe they're enough. We need some mechanism to track this too.

Member

The only thing I feel strongly about is to give the area owners the chance / responsibility to state the testing deficiencies of the existing code. I think having some human judgment in there is going to be better than picking a coverage target, for example. We can suggest a coverage target if the owners don't have a better idea about the deficiencies.

Member Author

Which I mentioned in the template explicitly too.

As an approver you will be able to say that 20% coverage is fine for you if you want.

But the first thing to achieve is to improve visibility and awareness - many people don't realize how bad our coverage is in some areas.

@kikisdeliveryservice
Member

kikisdeliveryservice commented Apr 14, 2022

"Is area X healthy enough to accept changes?" seems like info that belongs more at the component/subproject/sig level, and is the responsibility of the maintainers of that area to surface, and is a prereq for KEPs in that area.

Agree. Perhaps we can surface this as a pre-release testing survey (that I just made up) during the period between the end of one release and the beginning of another? The idea would be to tease out if things are going well, or if there are isolated portions of tests causing issues (some KEPs may go in not affecting those), or an overall shaky test suite (perhaps resulting in a decision to not merge lower-priority KEPs / minimize features going in / being promoted / etc.).

<!--
This question should be filled when targeting Beta release.
The goal is to ensure that we don't accept enhancements with inadequate testing.
So even if those hit Alpha in such state, it won't be possible to target Beta
Member

IMO everything should always be tested. alpha/beta/GA doesn't matter for this requirement.


(As a note, all KEPs are required to have a test plan regardless of stage, so +1 on lavalamp's point above.)

Member Author

IMO everything should always be tested. alpha/beta/GA doesn't matter for this requirement.

I agree with the principle. But the principle was always there and didn't work very well in practice.
So I would like to have a way to ensure that we won't proceed to Beta if shortcuts were made in Alpha.

Member

If the principle is clearly there already, and it's not being followed, then there's likely no modification to this template that will fix that, since the problem is enforcement and not the instructions.

Member Author

If the principle is clearly there already, and it's not being followed, then there's likely no modification to this template that will fix that, since the problem is enforcement and not the instructions.

I don't fully agree, because I can also imagine that being something not fully conscious.
If you're forced to fill in (or review) the template here for Beta graduation, it will force you to think about this.

@wojtek-t wojtek-t force-pushed the better_testing_in_kep branch 2 times, most recently from 53b44a1 to 0e31c99 Compare April 15, 2022 06:33
@wojtek-t
Member Author

@liggitt @lavalamp @kikisdeliveryservice - thanks for the feedback - PTAL

- <package>: <current test coverage>

The data can be easily read from:
https://testgrid.k8s.io/sig-testing-canaries#ci-kubernetes-coverage-unit
Member

note: we should work on automating this, I think you've already opened an issue about it

Member Author

Yes - I already opened an issue for that. But at least for some time, we won't enforce specific targets, because our coverage is still way too low in multiple places.
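As an aside on the mechanics discussed above (not part of the thread itself): the per-package numbers can also be produced locally with standard Go tooling instead of reading testgrid. A minimal sketch, assuming the usual `go test -cover` output format; the package names below are only examples:

```shell
# Reshape `go test -cover` output into the "- <package>: <coverage>" lines
# the KEP template asks for. A real run would pipe the tool directly, e.g.
#   go test -cover ./pkg/kubelet/... 2>/dev/null | awk '/coverage:/ {print "- " $2 ": " $5}'
# Here we demo the awk reshaping on canned output so the sketch is self-contained.
printf '%s\n' \
  'ok  	k8s.io/kubernetes/pkg/scheduler	1.2s	coverage: 62.3% of statements' \
  'ok  	k8s.io/kubernetes/pkg/kubelet	3.4s	coverage: 55.1% of statements' |
  awk '/coverage:/ {print "- " $2 ": " $5}'
# prints:
# - k8s.io/kubernetes/pkg/scheduler: 62.3%
# - k8s.io/kubernetes/pkg/kubelet: 55.1%
```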

@aojea
Member

aojea commented Apr 18, 2022

+1

Member

@kikisdeliveryservice left a comment

I left some suggestions for clarity but think this provides nice clarification and further guidance for the KEP Test Plan section.

@@ -270,6 +266,55 @@ when drafting this test plan.
[testing-guidelines]: https://git.k8s.io/community/contributors/devel/sig-testing/testing.md
-->

[ ] I/we understand the owners of the involved components may require updating


Suggested change
[ ] I/we understand the owners of the involved components may require updating
[ ] I/we understand, as owners, that the involved components may require updates to

Member Author

I didn't fully change that because I think the change wasn't reflecting what I wanted.

I wanted to say that "component OWNERS" (say approvers) have a right to request additional tests before proceeding. Does that make sense?

@@ -270,6 +266,55 @@ when drafting this test plan.
[testing-guidelines]: https://git.k8s.io/community/contributors/devel/sig-testing/testing.md
-->

[ ] I/we understand the owners of the involved components may require updating
existing tests to make this code solid enough prior committing changes necessary


Suggested change
existing tests to make this code solid enough prior committing changes necessary
existing tests to make this code solid enough prior to committing the changes necessary

Member Author

done

keps/NNNN-kep-template/README.md (resolved review threads)
<!--
In principle every added code should be unit tested. However, the exact tests are hard
to answer ahead of implementation. As a result, this section should only be filled when
targeting Beta to ensure that something wasn't missed during Alpha implementation.


Do we want any updates to this for GA or just beta?

Member Author

it should be no-op for GA, but it doesn't hurt to add it - added.

@kikisdeliveryservice
Member

@mrbobbytables @jeremyrickard PTAL. I believe this provides some clarifications as to what we expect to see in the Test Plan (which many times has thin details). The changes focus on stability, shouldn't be burdensome, and will provide the author and SIG with better insight into the stability of the features and components.

@lavalamp
Member

I think this might be a tad heavy-handed, but we should try it and see if it works.

/lgtm

@k8s-ci-robot k8s-ci-robot added the lgtm "Looks good to me", indicates that a PR is ready to be merged. label Apr 22, 2022
Comment on lines 288 to 291
Talking about individual tests would be overkill, so we just require listing the
packages that were touched during the implementation together with their current test
coverage in the form:
- <package>: <current test coverage>
Member

in the KEP + beta stage still seems like the wrong time/place for this information to me; it denormalizes information that will be instantly stale into KEPs after changes were already accepted into the tree

Member

the "I acknowledge improvements to existing code may be required" bit above seems like a good mental prompt to me

for unit testing, I would reframe this as "all new code is expected to have complete unit test coverage. if that is not possible, explain why here and explain why that is acceptable"

Member Author

@wojtek-t Apr 25, 2022

I'm definitely happy to add what you suggested above.

But I don't really agree that this is the wrong time/place for the coverage information.
If you're worried about staleness - let me change that to:

  • <package>: <date> - <coverage>

This will by definition not go stale. And having the information about the coverage will expose it to people just reading KEPs and not watching individual PRs to see how it goes.

My main reason for ensuring it's put here is that KEP approvers are in many cases not approvers for the code itself (or at least not all the code). So putting it here verbatim will show it very clearly into their eyes.

Contributor

+1 here. I think the stats here are going to be stale too quickly and there isn't going to be a lot of value here for alpha/beta KEP states.

Member

@liggitt Apr 25, 2022

to be clear about the timing aspect... I think pre-alpha/alpha is exactly the right time to be asking this question, since that's when code starts merging and maybe breaking existing undertested stuff. I just think asking the KEP author to collate it into the KEP isn't a great mechanism.

Member Author

to be clear about the timing aspect... I think pre-alpha/alpha is exactly the right time to be asking this question, since that's when code starts merging and maybe breaking existing undertested stuff

OK - so I'm all for asking that for Alpha (if you remember, that was actually my initial proposal).
But your counterargument is that the exact set of packages will sometimes be hard to predict.

So before Beta is the compromise because:
(a) a significant part of the implementation is already done at this point
(b) it still isn't enabled by default, which means that the enhancement owner still has a big motivation to move it to Beta

Member Author

I discussed this with Jordan offline and updated to some in-between point.

Member

I think the updated unit test bullet is a reasonable level of info to include

@liggitt
Member

liggitt commented Apr 25, 2022

aside from the request for package unit test coverage stats, these additions lgtm

@jeremyrickard
Contributor

Thanks for all the thoughtful comments and reviews on this everyone! This generally looks good to me, although I agree with @liggitt's point above regarding package level unit test coverage stats. I think we should amend the content there, but maybe it's sufficient to just say the stats should be filled out for promotion to stable?

@wojtek-t wojtek-t force-pushed the better_testing_in_kep branch from edc5d69 to a113d0c Compare April 26, 2022 15:05
@k8s-ci-robot k8s-ci-robot removed the lgtm "Looks good to me", indicates that a PR is ready to be merged. label Apr 26, 2022
@wojtek-t wojtek-t force-pushed the better_testing_in_kep branch from a113d0c to a86942e Compare April 26, 2022 15:07

<!--
Based on reviewers feedback describe what additional tests need to be added prior
implementing this enhancement to ensure the enhancements have also solid foundations.
Member

nit: prior to implementing

@lavalamp
Member

/lgtm

@k8s-ci-robot k8s-ci-robot added the lgtm "Looks good to me", indicates that a PR is ready to be merged. label Apr 26, 2022
@jeremyrickard
Contributor

The updated wording and the discussion above seem good to me.

/approve

@k8s-ci-robot k8s-ci-robot added the approved Indicates a PR has been approved by an approver from all required OWNERS files. label Apr 26, 2022
@kikisdeliveryservice
Member

Agree with the updates since last time I reviewed. After this merges, I'll send out an email to k-dev/sig leads to note the Test Plan clarifications/changes. We can also let the new RT (once it forms) know about the change and to keep an eye out on the enhancements as they review them.

I think we'll definitely get more robust Test Plans thanks to this PR :)

/approve

@k8s-ci-robot
Contributor

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: jeremyrickard, kikisdeliveryservice, wojtek-t


@wojtek-t
Member Author

Thank you all for all the comments - I really hope it will help with our testing quality.
If we realize in a release or so that something can be done differently/better, we can always change it, but for now I don't see those opportunities.

I'm going to hold cancel to let it merge to ensure it's there before 1.25 release starts.

/hold cancel

@k8s-ci-robot k8s-ci-robot removed the do-not-merge/hold Indicates that a PR should not merge because someone has issued a /hold command. label Apr 27, 2022
@k8s-ci-robot k8s-ci-robot merged commit 278a316 into kubernetes:master Apr 27, 2022
@k8s-ci-robot k8s-ci-robot added this to the v1.25 milestone Apr 27, 2022
@tallclair
Member

tallclair commented May 12, 2022

Some feedback from filling in these sections for #3310 (comment)

  • The prerequisites comment could use a bit more guidance, and maybe an example.
  • Generating the test coverage was painful. The comment says "The data can be easily read from: https://testgrid.k8s.io/sig-testing-canaries#ci-kubernetes-coverage-unit", but that was slow to load, and then crashed my browser (resolved by using the regex filter). Then, coverage cannot easily be copied out due to how the table selection works. If we keep this, I'd like to see a script that can generate the report in the desired format, ex: hack/generate-coverage-report.sh staging/src/k8s.io/pod-security-admission/...
  • "In principle every added code should have complete unit test coverage, so providing
    the exact set of tests will not bring additional value." I don't know what the second half of this sentence means. Regarding complete unit test coverage, I disagree. See 7f5d72d for examples of cases that shouldn't have unit tests.
  • integration tests: took me some time to figure out how to find the PodSecurity integration tests (section would benefit from an example).
  • Integration & e2e sections ask for test coverage - how do you get test coverage from integration and e2e tests?
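The generate-coverage-report.sh helper requested above does not exist at this point; purely as a hypothetical sketch of what such a helper could look like (the script name, package paths, and output format are all assumptions, not project decisions):

```shell
#!/usr/bin/env bash
# Hypothetical hack/generate-coverage-report.sh (the script requested above;
# not a real file in the repo). Reformats `go test -cover` output into the
# "- <package>: <coverage>" lines used in the KEP template.
set -euo pipefail

coverage_report() {
  # Reads `go test -cover` output on stdin; emits one report line per package.
  awk '/coverage:/     {print "- " $2 ": " $5}
       /no test files/ {print "- " $2 ": no test files"}'
}

# A real invocation would be something like:
#   go test -cover "${1:-./...}" 2>/dev/null | coverage_report
# Demo on canned output so the sketch is self-contained:
printf '%s\n' \
  'ok  	k8s.io/pod-security-admission/policy	0.2s	coverage: 91.0% of statements' \
  '?   	k8s.io/pod-security-admission/api	[no test files]' |
  coverage_report
```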
