Testing before pushing commits #13011

Closed
SheetJSDev opened this issue Apr 2, 2019 · 4 comments · Fixed by #13018

Comments

@SheetJSDev
Contributor

First off, awesome work!

For those of us eager to contribute, the starting point is to set up the local environment and run the existing test suite. Two commits in the last 24 hours broke the tests in different ways:

69af92d does not pass the prettier check for two reasons: the ordered list format and an errant double space.

fca353b does not pass the eslint check because of some errant apostrophes that should have been HTML-encoded (e.g. as `&apos;`) -- the relevant fix is in #12988
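To make the eslint failure concrete, here is a minimal sketch of the kind of change involved, assuming the rule in play is `react/no-unescaped-entities` (the components below are made up for illustration):

```tsx
// Hypothetical example; the components are invented for illustration.
// With react/no-unescaped-entities enabled, a raw apostrophe inside JSX
// text is reported, and writing it as an HTML entity satisfies the rule.

// Flagged by eslint: literal apostrophe in JSX text
const Before = () => <p>Don't forget to run the linter</p>;

// Passes: the apostrophe is written as an entity
const After = () => <p>Don&apos;t forget to run the linter</p>;
```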

Of course this probably is not representative of the experience as a whole, but maybe there should be some sort of testing before allowing commits to land on master, to ensure that all tests pass.
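For reference, the usual way to enforce this on GitHub is branch protection with required status checks on master. A minimal sketch using the Octokit REST client follows; the check context names, token handling, and settings are assumptions, not Gatsby's actual configuration:

```ts
// Hedged sketch: require named CI checks to pass before commits land on master.
// The check contexts below are placeholders, not Gatsby's real check names.
import { Octokit } from "@octokit/rest";

async function protectMaster() {
  const octokit = new Octokit({ auth: process.env.GITHUB_TOKEN });

  await octokit.rest.repos.updateBranchProtection({
    owner: "gatsbyjs",
    repo: "gatsby",
    branch: "master",
    required_status_checks: {
      strict: true, // branch must be up to date with master before merging
      contexts: ["ci/circleci: lint", "ci/circleci: unit_tests_node10"],
    },
    enforce_admins: true, // also applies to core devs merging their own PRs
    required_pull_request_reviews: null,
    restrictions: null,
  });
}

protectMaster().catch(console.error);
```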

@polarathene
Contributor

> maybe there should be some sort of testing before allowing commits to land on master, to ensure that all tests pass

@SheetJSDev There is already quite extensive testing in place.

View the related PR for the commit you referenced, scroll down to the merge commit just before the original branch was deleted, and click "view details".

All PRs go through a fair number of tests. @DSchau is a core member of Gatsby, and it looks like he was the only one involved in that PR; all tests had passed, so he merged it. Oddly, unlike most PRs, the Windows unit tests aren't listed there, and I know that run is the one raising the prettier lint error on the file (in my own PR it's the only failing test). So it is technically covered, just somehow missed in that PR.

@polarathene
Contributor

I just looked at the CircleCI test report for a recent PR of mine that hasn't been merged yet. It's passing, but it fails the Windows test. I can also reproduce the failure on my Linux machine, so something is not right with the CircleCI test environment? Is it possible a cache is giving a false positive?

> @ prettier /home/circleci/project
> prettier "**/*.{md,css,scss,yaml,yml}" "--check"

Checking formatting...
All matched files use Prettier code style!

Whereas for the same test suite, the Windows Azure Pipelines run failed at this point:

> prettier "**/*.{md,css,scss,yaml,yml}" "--check"

Checking formatting...
docs\blog\2019-03-29-interview-with-david-eads\index.md
Code style issues found in the above file(s). Forgot to run Prettier?
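For anyone wanting to reproduce this outside of CI, the following is roughly what the `--check` run does for a single file, expressed through prettier's Node API. It's a sketch rather than the actual CI script, and the exact behaviour varies by prettier version (prettier 3 makes these calls async):

```ts
// Rough equivalent of the CI "--check" step for one markdown file,
// using prettier's Node API. A sketch, not the actual CI script.
import { promises as fs } from "fs";
import * as prettier from "prettier";

async function checkFile(path: string): Promise<boolean> {
  const source = await fs.readFile(path, "utf8");
  const options = (await prettier.resolveConfig(path)) ?? {};
  // prettier.check returns false when the file would be reformatted,
  // which is what makes the --check run report "Code style issues found".
  return prettier.check(source, { ...options, filepath: path });
}

checkFile("docs/blog/2019-03-29-interview-with-david-eads/index.md").then((ok) =>
  console.log(ok ? "File uses Prettier code style" : "Code style issues found")
);
```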

@polarathene
Contributor

CircleCI appears to be throwing the error now too, so that's resolved?

I've put up a PR for the other issue you mentioned regarding prettier.

@polarathene
Contributor

There was an earlier PR for the eslint rule meant to resolve this, but it was merged without waiting for tests to finish. That PR instead caused a warning to become an error in CI; I've addressed that as well in my PR.

> Of course this probably is not representative of the experience as a whole, but maybe there should be some sort of testing before allowing commits to land on master, to ensure that all tests pass

In both cases it happened because core devs have the rights to approve and merge their own PRs. From the look of it, they didn't wait for the tests to complete (Windows takes the longest, afaik), presumably on the understanding that their change was non-breaking and therefore safe to merge without waiting.

Perhaps a PR bot could react to some keyword and merge the PR once the tests pass. That way, if a core dev wants to merge without waiting on tests so they can move on, they could trigger it as a set-and-forget action that only completes if everything passes.

It would be safer, and would avoid these situations?
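For what it's worth, a bot like that could be a small Probot handler. A hedged sketch follows; the keyword, the permission model, the merge method, and the event wiring are all assumptions, not an existing Gatsby bot:

```ts
// Hedged Probot sketch: merge a PR when a maintainer comments a keyword,
// but only once every check on the PR head has succeeded.
// The keyword, permission check, and merge method are assumptions.
import { Probot } from "probot";

export default (app: Probot) => {
  app.on("issue_comment.created", async (context) => {
    const { comment, issue } = context.payload;
    if (!issue.pull_request || !comment.body.includes("/merge-when-green")) return;

    const pr = await context.octokit.pulls.get(
      context.repo({ pull_number: issue.number })
    );

    const checks = await context.octokit.checks.listForRef(
      context.repo({ ref: pr.data.head.sha })
    );
    const allGreen =
      checks.data.check_runs.length > 0 &&
      checks.data.check_runs.every((run) => run.conclusion === "success");

    if (allGreen) {
      await context.octokit.pulls.merge(
        context.repo({ pull_number: issue.number, merge_method: "squash" })
      );
    }
    // A real "set and forget" bot would also label the PR here and merge it
    // from a check_suite.completed handler once the remaining checks finish.
  });
};
```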
