This repository has been archived by the owner on Jan 18, 2021. It is now read-only.

[Tests-Only] Add tests/acceptance/expected-failures.txt #394

Closed
wants to merge 3 commits

Conversation

phil-davis (Contributor) commented Jul 23, 2020

Description

This is a demonstration of how we can use the EXPECTED_FAILURES_FILE feature of the core acceptance test runner.
That feature is waiting in core PR owncloud/core#37717

The core branch adjust-skip-tags-expected-failures has many of the skipOnOcis tags removed from scenarios that we expect to pass on OCIS one day, but that do not pass yet. (For now I actually changed the tag to wasSkipOnOcis, just so that I can find them easily.) See owncloud/core#37725. As a result, many more API acceptance test scenarios run in the CI here. That increases the elapsed run-time of the tests in CI, but it means that a developer gets to know when a different set of scenarios fails.

```
699 scenarios (499 passed, 200 failed)
5421 steps (4591 passed, 200 failed, 630 skipped)
9m33.41s (22.39Mb)
Checking expected failures
Success - all failures were expected
```

tests/acceptance/expected-failures.txt has a list of the scenarios that are currently expected to fail. The test runner checks the actual failures against the expected failures and reports any differences. In this example, the 200 failing scenarios are exactly the 200 scenarios listed in the EXPECTED_FAILURES_FILE.
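For illustration, entries in such a file typically identify a scenario by its Gherkin feature file and the line number where the scenario starts. The paths and line numbers below are made up, not taken from the actual run:

```txt
# Hypothetical example entries - not the real contents of this PR's file
apiWebdavProperties/getQuota.feature:18
apiComments/createComments.feature:41
```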

If a developer improves the OCIS code so that more core API acceptance tests pass, they just delete those scenarios from tests/acceptance/expected-failures.txt. There is no need to find the scenarios in core and unskip them there.
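The check the runner performs can be sketched roughly as below. This is an illustrative re-implementation in Python, not the actual core test-runner code, and the function names and file format (one "feature:line" entry per line, `#` for comments) are assumptions:

```python
# Illustrative sketch of an expected-failures check.
# NOT the real core test-runner code; names and file format are assumed.

def load_expected_failures(path):
    """Read non-empty, non-comment scenario entries from the file."""
    entries = set()
    with open(path) as f:
        for raw in f:
            line = raw.strip()
            if line and not line.startswith("#"):
                entries.add(line)
    return entries

def check_failures(actual_failures, expected_path):
    """Compare the scenarios that actually failed against the expected list.

    Returns (unexpected_failures, unexpected_passes):
    - unexpected_failures: failed but not listed in the file
    - unexpected_passes: listed in the file but did not fail
    """
    expected = load_expected_failures(expected_path)
    actual = set(actual_failures)
    return actual - expected, expected - actual
```

If either set is non-empty the run would be reported as failed; an unexpected pass is the signal to remove that scenario from expected-failures.txt.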

To demonstrate bugs/different behavior in OCIS, the scenarios that demonstrate the "bad behavior" can be put here in the ocis-reva repo. See the 3rd commit. An extra pipeline step has been added to run these local acceptance test scenarios.
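As an illustration only (the repo's actual CI configuration differs, and the step name, image, and make target below are invented), such an extra pipeline step might look like:

```yaml
# Hypothetical CI step: run the local acceptance tests that pin down
# current OCIS behavior. Names and paths are illustrative.
- name: localApiTests
  image: owncloudci/php:7.4
  commands:
    - make test-acceptance-api-local
```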

When a developer fixes a problem that has a local acceptance test demonstrating the old behavior, they delete that local test scenario and remove the now-passing core scenario from the EXPECTED_FAILURES_FILE, since OCIS then conforms more closely to the core API acceptance tests.

The principle is that the core API acceptance tests are the "defined standard" for the API. They can be improved to provide better coverage, or corrected where they are wrong, but they should not need to change often. As OCIS develops, its deviation from this "defined standard" can be managed in the OCIS repo via the EXPECTED_FAILURES_FILE and supplementary local acceptance test scenarios.

Related Issues

- #282 Split old public API webdav tests from new public webdav tests: we probably will not need to do anything in core; we can just list the known failures in EXPECTED_FAILURES_FILE.
- #329 Find untagged tests that need adjusting for OCIS (and the comment at #329 (comment)): we can handle all the skip-tagging that currently causes the real day-to-day pain by using EXPECTED_FAILURES_FILE plus local acceptance tests in the OCIS repos that demonstrate the different behavior.

@phil-davis force-pushed the expected-failures-demo-20200723 branch 2 times, most recently from fcc41cf to 05c444d on July 23, 2020 13:06
@phil-davis force-pushed the expected-failures-demo-20200723 branch from 05c444d to 8fd6a49 on July 23, 2020 13:20
phil-davis (Contributor, Author) commented Jul 23, 2020

In the 3rd commit, to allow running local API acceptance tests, I have added the needed test infrastructure. Because the test code from core is written in PHP (Behat/Gherkin), this ocis-reva repo gets the needed PHP infrastructure: composer.json, extra entries in .gitignore, extra targets in the Makefile, vendor-bin with the Behat dependencies, and the files in tests/acceptance.

It is all test tooling and does not impact the Go application code. But it does look like it "pollutes" this repo a bit. So someone can decide whether we should put effort into minimizing this, e.g. keep only local feature files here and somehow get the core Behat tooling infrastructure to parse and execute them.

Whatever tooling we end up with, we would do the same in cs3org/reva and owncloud/ocis.

butonic (Member) commented Jul 24, 2020

@phil-davis this looks great! Regarding polluting this repo: there was a discussion of merging ocis-ocs, ocis-webdav, the fileviewer component from phoenix, and the URL signing stuff into an ocis-files extension. Even the MS Graph drive and driveItem resources could move there. It would mean that the webdav and ocs tests also move there. ocis-reva would no longer need those tests, because it is just a more OCIS-specific way of starting reva and setting the defaults... at least that is the direction we have in mind.

I don't know when we will get to that though ...

phil-davis (Contributor, Author) replied:

> I don't know when we will get to that though

It will be "easy enough" to migrate the acceptance test tooling to different repo(s). Once you have an example like this, it should be just copy-paste :)

@phil-davis closed this Jul 29, 2020
@phil-davis deleted the expected-failures-demo-20200723 branch July 29, 2020 10:48