
Add a travis file, prepare to run tests on travis-ci.org #886

Closed
wants to merge 11 commits

Conversation

lucc
Collaborator

@lucc lucc commented Jul 17, 2016

From the existence of https://github.com/pazz/alot/tree/0.3.8-unittests I take it that it is also planned to run these tests somewhere. So I suggest using Travis, starting with this travis file. I have already set up Travis for my fork of alot, so you can see the build for this PR here: https://travis-ci.org/lucc/alot/branches

I initially did some more tests with Travis which can be found in my fork (branches travis-orig and travis-multi-os).

@lucc
Collaborator Author

lucc commented Jul 17, 2016

I was hoping to do something more thorough in the last commit, but I could not think of any way to start alot and quit it immediately afterwards (like vim -c quit, for example). Setting initial_command=exit results in a blank screen; only after I press q does alot quit. This is the corresponding log:

DEBUG:manager:/home/luc/.config/alot/themes
INFO:ui:setup gui in 256 colours
DEBUG:ui:fire first command
DEBUG:ui:global command string: "exit"
DEBUG:__init__:mode:global got commandline "exit"
DEBUG:__init__:ARGS: [u'exit']
DEBUG:__init__:cmd parms {}
ERROR:ui:Could not stop reactor: Can't stop reactor that isn't running..
Shutting down anyway..
DEBUG:ui:Got key (['q'], [113])
DEBUG:ui:cmdline: 'exit'
DEBUG:ui:global command string: "exit"
DEBUG:__init__:mode:global got commandline "exit"
DEBUG:__init__:ARGS: [u'exit']
DEBUG:__init__:cmd parms {}
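
For reference, a hypothetical startup smoke test could sidestep the initial_command problem by running alot in a subprocess and only checking that it comes up without crashing. This is just a sketch and not what the PR ends up doing (the commits below use os._exit instead); it assumes Python 3's subprocess timeout, a bare alot invocation on the PATH, and a real terminal (or pty) for urwid to attach to.

import subprocess

def alot_starts_up(grace_period=2):
    # Start alot and give it a moment; if it is still running after the
    # grace period, assume startup succeeded and terminate it.
    proc = subprocess.Popen(['alot'])
    try:
        proc.wait(timeout=grace_period)
    except subprocess.TimeoutExpired:
        proc.terminate()
        proc.wait()
        return True
    # alot exited on its own: only a clean exit counts as success.
    return proc.returncode == 0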

@lucc
Collaborator Author

lucc commented Jul 18, 2016

I found a way to exit alot immediately; I will clean up the git history next.

@pazz
Owner

pazz commented Jul 19, 2016

Cool! I have no idea how to use this yet, but feel free to integrate the unittest branch. I'm not working on this atm.

@lucc
Collaborator Author

lucc commented Jul 19, 2016

In order to use this you have to go to http://travis-ci.org, sign in with your GitHub account, give Travis access to this repository and configure it to run on pull requests. Then you should see the result here next to the merge-conflict indicator.

If you don't say otherwise I will just cherry-pick your unittests.

@pazz
Owner

pazz commented Jul 20, 2016

Go ahead, thanks! I think I spent some time trying to set up a sensible
notmuch test database to deal with malformed mails, but it wasn't
completely finished.

@lucc
Collaborator Author

lucc commented Jul 20, 2016

Do you want me to add more tests to this PR, or are you considering it as is?


@ghost ghost left a comment


Multiple dependencies (e.g. on the filesystem and notmuch) were introduced by this test, which should be avoided. The reason is that if the unit test fails, we don't know whether the bug is in notmuch, the filesystem or in our codebase.

As far as I understand unit testing, the solution is to use a mocking framework. Unfortunately, I don't understand mocking well enough to implement the changes.

shutil.copytree(path.join(HERE,'corpus'), self.TESTCORPUS)

# create a temporary notmuch database
# create notmuch config file

That is a dependency on the filesystem and on notmuch. That means this test is not a unit test but rather a functional test (I think). A unit test should, according to the book The Art of Unit Testing, have no real dependencies on anything. The reason is that this test depends on the correct working of the filesystem and notmuch, so if it fails we don't know whether the bug is really in OUR codebase.
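
To make the mocking suggestion above concrete, here is a minimal sketch of what such an isolated test could look like. The helper count_unread and the query string are hypothetical, not code from alot or this PR; the point is only that the notmuch database object is replaced by a mock, so the test exercises our code alone.

import unittest
from unittest import mock  # the standalone `mock` package on Python 2


def count_unread(db):
    # hypothetical production helper that talks to the notmuch bindings
    return db.create_query('tag:unread').count_messages()


class TestCountUnread(unittest.TestCase):
    def test_counts_unread_messages(self):
        fake_db = mock.Mock()
        fake_db.create_query.return_value.count_messages.return_value = 3
        self.assertEqual(count_unread(fake_db), 3)
        fake_db.create_query.assert_called_once_with('tag:unread')


if __name__ == '__main__':
    unittest.main()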

@pazz
Owner

pazz commented Dec 9, 2016

The thing is that in order to test many features it is necessary to have some dummy data (e.g. email files) to test on.

@lucc
Collaborator Author

lucc commented Dec 9, 2016

I am not sure we can get rid of the filesystem dependency, as even the test files will be files, and so is the alot code base itself.

I am also not sure it is possible to get rid of the dependency on notmuch. If we want to argue that our code works, we have to argue that it uses the notmuch bindings correctly (because that is what happens in production). One could also take this mocking idea ad absurdum: why not mock the other Python dependencies (urwid, urwidtrees, ...) or even Python itself?

Sure, this is very artificial, and I hope you don't take any offense; none was intended. I just have to learn some more about the principles behind unit testing (and have a look at that link).

For now we could also remove the unit-testing commits from this PR and focus on getting the Travis integration to work. It will then hopefully be easier to add linting/testing/whatnot jobs in the future.

@pazz
Owner

pazz commented Dec 9, 2016 via email

@lucc
Collaborator Author

lucc commented Dec 9, 2016

@pazz you are right, I overlooked that quantifycode gives us linting. The only thing I can think of right now is that Travis could run make -C docs html && git diff --exit-code in order to check that the last committer updated and committed the autogenerated documentation.

# gmime is needed to build notmuch from source.
- sudo apt-get -yqq install libgmime-2.6-dev
# Clone the notmuch repository and move into it.
- git clone git://notmuchmail.org/git/notmuch
Contributor


Observation: this is always going to test against notmuch's master branch. We ought to choose explicitly what to test against. One approach: test against both master and the release branch, possibly not blocking on failures from master (since it's a moving target anyway, but at least we hear about breakage before it actually lands).

I ended up doing something like this with my golang binding to notmuch:

https://github.com/zenhack/go.notmuch/blob/master/.travis.yml

Collaborator Author


That's cool. I will copy some of that to one of my travis-related branches and will see when to integrate it (in this PR or later). But I definitely want to use this approach.

    def setUp(self):
        self.corpus = Corpus()

    def test_messagefile(self):
Contributor


What is this test actually trying to verify? It doesn't invoke anything from alot.

@pazz
Owner

pazz commented Dec 10, 2016 via email

@lucc
Collaborator Author

lucc commented Dec 10, 2016

@pazz OK, I tried to do that but it does not work yet. I was hoping that job 74 would pass and job 75 would fail, but both passed.

I have trouble testing all this locally because the makefile in docs/ uses python, but that is python3 on Arch Linux. I will try to fix that in a separate PR that should be merged first. It is here: #912

@lucc
Collaborator Author

lucc commented Dec 10, 2016

If we want to reduce the manual building of deps we can investigate travis-ci/travis-ci#5821

@lucc lucc added this to the release 0.5 milestone Dec 10, 2016
@zenhack
Contributor

zenhack commented Dec 10, 2016

Re: unexpected passes, sphinx-build has a -W flag that treats warnings as errors; we could make use of it.

@lucc
Collaborator Author

lucc commented Dec 11, 2016

@zenhack I have seen that too and was planning to use it, but my comment is more about git diff --exit-code not returning failure even though I assumed there would be changes to commit after the makefile updated something. I hope I fixed that in #914. I will use -W in the travis file afterwards.

@zenhack
Contributor

zenhack commented Dec 11, 2016

Ah, I see -- I misinterpreted what you were referring to.

@lucc
Collaborator Author

lucc commented Dec 11, 2016

I managed to do it on top of #914. See build #89 and build #90.

What I did to simplify and speed things up is this: install the simple dependencies with pip and mock the complicated ones, then delete all generated files and have the makefile regenerate them, and finally use git diff --exit-code to see if the committed versions were up to date.
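
The "mock the complicated ones" part can be done with a few lines of Python before sphinx imports alot (e.g. near the top of the sphinx conf.py). This is only a sketch of the idea; the module names below are assumptions, not the exact list used in this branch.

import sys
from unittest import mock  # the standalone `mock` package on Python 2

# Register dummy modules so `import alot` succeeds during the docs build
# even when the heavyweight dependencies are not installed.
for name in ['notmuch', 'urwid', 'urwidtrees']:
    sys.modules[name] = mock.MagicMock()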

I can put the commits from https://github.com/lucc/alot/tree/travis-check-docs-3 into this PR as soon as #914 is merged.

The question that remains is this: do we only want to check that the generated doc sources are up to date (as in the other branch right now), or do we also want to build the complete documentation? In the latter case I would remove SPHINXBUILD=true from the make call. (Don't be fooled by this argument: it doesn't tell make to build the docs with sphinx, it tells make to use the true command to "build" them, so it doesn't actually build the html docs :)

lucc and others added 11 commits December 11, 2016 21:19
Currently alot is only installed with pip and the docs are built.  This can be
expanded to also run the tests as soon as we have some.

The alot command "exit" seems not to work as an "initial_command" value.  So we
use the forceful `os._exit` to kill the alot process.  This can still show
errors during startup (loading of the alot modules).

This also adds a first nontrivial test that checks
if the notmuch bindings work as expected.
@pazz
Owner

pazz commented Dec 12, 2016

@lucc (you didn't stick around long enough on irc for me to reply):
Generally, I prefer small PRs, so I would like to see one for introducing this travis file, maybe one or two "dummy" tests as individual patches, like the ones you have for testing if the docs build.
Maybe another patch that mentions Travis in the docs, i.e., explains in the development section that we expect PRs to pass all tests, etc.

I think adding a unit test suite should be another series, because what I wrote there was really preliminary and your travis stuff seems stable enough.
Also, conceptually these seem like different things: one contains (mostly) semantic tests around the code, the other is about running these (and others) on some server.

The only grey area is how/where to set up a test corpus. I tried to do something in this testsuite branch, but it's... well, preliminary. But I think this should be done in some pre-hook of the test suite in order to run tests locally as well. If we put all this into the travis file, then I see no easy way to test a PR before actually submitting it and waiting for Travis. A rough sketch of such a pre-hook follows below.
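
As an illustration of that pre-hook idea, a test case could build a throwaway notmuch database in setUp and index it with the notmuch CLI, so the same tests run locally and on Travis. The corpus path and config contents below are assumptions for the sketch, not code from either branch, and the notmuch command-line tool must be installed.

import os
import shutil
import subprocess
import tempfile
import unittest


class CorpusTestCase(unittest.TestCase):
    def setUp(self):
        # copy the checked-in mail corpus into a temporary maildir
        self.tmpdir = tempfile.mkdtemp()
        maildir = os.path.join(self.tmpdir, 'mail')
        shutil.copytree('tests/corpus', maildir)
        # point notmuch at it via a throwaway config file
        self.notmuch_config = os.path.join(self.tmpdir, 'notmuch-config')
        with open(self.notmuch_config, 'w') as f:
            f.write('[database]\npath=%s\n'
                    '[user]\nname=Test User\nprimary_email=test@example.com\n'
                    % maildir)
        env = dict(os.environ, NOTMUCH_CONFIG=self.notmuch_config)
        subprocess.check_call(['notmuch', 'new'], env=env)

    def tearDown(self):
        shutil.rmtree(self.tmpdir)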

@lucc
Collaborator Author

lucc commented Dec 14, 2016

This was superseded/resolved/fixed/done by #916 and #919. So I will close it now.

Three interesting things can be done later:

@lucc lucc closed this Dec 14, 2016
@lucc lucc deleted the travis branch December 15, 2016 11:39