
Integration Tests #2

Open
georgehrke opened this issue Jan 13, 2018 · 2 comments
Labels
enhancement New feature or request

Comments

@georgehrke
Member

I'd like to have integration tests that test this library against the real Nextcloud CalDAV and CardDAV server.

Any suggestions for an integration test framework?

@rullzer
Member

rullzer commented Jan 13, 2018

Summoning @danxuliu for input

@danxuliu
Member

I am afraid I cannot be of much help here, as I have never used an integration test framework in JavaScript. Anyway, here is some information in case you find it useful (although I am sure you already know everything I will write below ;-) ).

The integration test framework used in the server is Behat, which is the PHP implementation of Cucumber. Cucumber is a Behaviour-Driven Development framework with implementations in several languages, including JavaScript, so you may want to use it to keep a consistent style in the test definitions between the server and this library.

However, Cucumber basically provides a translation between the human-readable Gherkin language (Given, When and Then sentences) and the programming language in which the tests really run. As far as I know it does not provide anything for the integration tests themselves, in the sense of starting the server before a test and cleaning it up after the test has run. In my experience that (test isolation) is precisely the most complex part of integration (and acceptance) testing, and I am not aware of any tool or framework that handles it automatically :-(
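The isolation problem described above can be sketched in plain JavaScript; this is a minimal stand-in where an in-memory object plays the role of the server, purely to show the reset-before-each-test pattern (all names here are invented for the example):

```javascript
// Minimal sketch of per-test isolation. A real integration test would
// reset and talk to an actual Nextcloud instance instead of this object.
let serverState;

function resetServer() {
  // Bring the "server" back to a known pristine state before each test.
  serverState = { calendars: ['personal'], contacts: [] };
}

function runTests(tests) {
  const results = [];
  for (const [name, fn] of Object.entries(tests)) {
    resetServer(); // isolation: every test starts from scratch
    try {
      fn();
    } catch (err) {
      results.push(`${name}: ${err.message}`);
    }
  }
  return results; // list of failures; empty means all passed
}

const failures = runTests({
  'creating a calendar is visible': () => {
    serverState.calendars.push('work');
    if (serverState.calendars.length !== 2) throw new Error('expected 2 calendars');
  },
  'a new test does not see leftovers': () => {
    // Passes only because resetServer() undid the previous test's change.
    if (serverState.calendars.length !== 1) throw new Error('state leaked');
  },
});

console.log(failures.length === 0 ? 'all tests passed' : failures.join('\n'));
```

The second test only passes because of the reset between tests; without it, the first test's change would leak through.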

In the integration tests of the server the clean-up is done explicitly; there are several methods annotated with @AfterScenario and @AfterSuite that act as an undo of their associated tests. The acceptance tests use a different approach: after each test is run, the server is stopped and then started again in its initial state.
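The explicit clean-up style can be sketched as follows; this is plain JavaScript rather than Behat/PHP, and the names (createCalendar, afterScenario, the Set standing in for the server) are made up for the illustration:

```javascript
// Sketch of the "explicit undo" clean-up style: remember everything a
// test creates, then delete it afterwards, like an @AfterScenario hook.
const created = [];

function createCalendar(store, name) {
  store.add(name);
  created.push(name); // remember what to undo later
}

function afterScenario(store) {
  // Undo of the associated test: delete everything it created.
  while (created.length > 0) {
    store.delete(created.pop());
  }
}

// Usage, with a Set standing in for the real server:
const store = new Set(['personal']);
createCalendar(store, 'work');
createCalendar(store, 'holidays');
afterScenario(store);
console.log([...store]); // back to the initial state
```

The fragility mentioned in the next paragraph shows up here directly: if a test creates something without going through createCalendar, the undo misses it and later tests inherit the leftover state.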

The advantage of fully resetting the server is that each test is fully isolated. If an integration test fails and leaves the server in an unexpected state not covered by its clean-up method, further tests can fail in cascade; more importantly, explicit clean-up requires more planning and maintenance, since the tests and their clean-up code must be written and then kept in sync. The drawback of resetting the server, however, is that the tests are slower to run and, more importantly, that the approach currently works when using SQLite, but not when the data is stored in a real database.

Initially, the acceptance tests ran in Docker containers, which made it trivial and quick to reset them to a known state. The problem with that approach is that they could not be run in Drone: Drone tasks themselves run in Docker containers, so it would require nested containers, which, due to the current architecture of Drone and Docker, would mean that anyone making a pull request to our repository could get root access to the machine the tests run on without much effort (great, isn't it? :-P ). So now the acceptance tests run against the built-in PHP server using SQLite, and a dirty trick is used to reset them: all the files are added to a local Git repository, which is checked out again to its initial state after each test is run. Unfortunately that trick limits them to the SQLite database :-(

End of the verbose yet barely useful explanation :-P

@georgehrke georgehrke added this to the 0.0.2 milestone Aug 18, 2018
@georgehrke georgehrke self-assigned this Oct 3, 2018
@georgehrke georgehrke added the enhancement New feature or request label Oct 3, 2018
@georgehrke georgehrke removed this from the 0.0.2 milestone Nov 10, 2018
@georgehrke georgehrke modified the milestone: 0.0.3 Jan 31, 2020