-
-
Notifications
You must be signed in to change notification settings - Fork 753
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
@feathers/cli: introduce option to choose jest for tests instead of mocha #1057
Conversation
Small gripe about codeclimate. The configuration flagged the following two bits of code as "too similar" and in need of refactoring: // from /packages/generator-feathers/generators/app/index.js
module.exports = class AppGenerator extends Generator {
// ...
prompting() {
// ...
const prompts = [{
// ...
}, {
name: 'packager',
type: 'list',
message: 'Which package manager are you using (has to be installed globally)?',
default: 'npm@>= 3.0.0',
choices: [
{ name: 'npm', value: 'npm@>= 3.0.0' },
{ name: 'Yarn', value: 'yarn@>= 0.18.0' }
]
}, {
// ...
}, { // THIS block is apparently "too similar" to the 'packager' block above
name: 'tester',
type: 'list',
message: 'Which testing framework do you prefer?',
default: 'mocha',
choices: [
{ name: 'Mocha + assert', value: 'mocha' },
{ name: 'Jest', value: 'jest' }
]
}];
// ...
}
} It prevented my PR checks from passing until I had figured out a way to get codeclimate not to see them as similar. Really the similarity between these two blocks actually makes the code easier to read and maintain. In order to get codeclimate to leave me alone (notice that 4 out of the last 5 commits in this PR are tweaks to try to please codeclimate), I had to add an unnecessary property to the second block. It meant I had to dig into the Yeoman Generator code to discover that it uses Inquirer.js to generate its questions, which in turn has good documentation (but not excellent), so I had to dig into that code to discover that I could just add a I totally get the purpose of codeclimate, but in this case, it seems to have been a hindrance to using best practices rather than a help. 😎 |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
This is great, thank you! Just two small questions and then we can get it out.
@@ -20,51 +20,26 @@ module.exports = class ServiceGenerator extends Generator { | |||
message: 'What kind of service is it?', |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Are these formatting changes in this file from your linter?
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
No. The CodeClimate checks originally failed because "Function prompting has 91 lines of code (exceeds 80 allowed)." I actually thought the formatting was better as it was. I was looking for ways to reduce the number of lines without changing behavior. I would be happy to change it back to the way it was.
}); | ||
}; | ||
|
||
it('generic', () => testServiceGenerator('generic')); |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
We can probably remove the other database tests here and assume everything will work fine to not make the tests run a little quicker.
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Cool. Skipping all but generic
test reduced overall testing times on travis from over 40 minutes down to about 15 minutes. Not sure why the latest build failed, but the offending code was the authentication
package, not the generators. Also, you may already know this, but apparently you can tell lerna
to only run the tests for the packages that were most recently updated. (official docs) I had never heard of lerna
before I got into this project. It's pretty cool!
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
I did not, that looks really interesting.
Ah yes, sorry it's a flaky test that will be fixed soon. I think moving to the monorepo some of the Codeclimate settings got screwed up, I'll have to revisit those as well. Will merge and release this later this weekend. Thanks again for doing this, great work! |
Summary
Addresses #845. (Also mentioned in #1049)
This PR affects the
@feathers/cli
package (more specifically the generators ingenerators-feathers
). In particular, the three generators affected are:feathers generate app
Gives you the option to choose
jest
instead ofmocha
for testing.feathers generate service
The auto-generated test uses whatever testing framework was selected when the app was generated.
feathers generate hook
Same as for services.
Usage should be obvious for app creation and transparent for service/hook creation. In addition to small changes in the syntax of the tests themselves (e.g.
before()
becomesbeforeAll()
and asynchronous tests requireexpect.assertions([some number])
) it was necessary to modify two other files:.eslintrc.json
had to be renamedeslintrc.json.js
and moved from thegenerators/app/templates/static/
folder to thegenerators/app/configs
folder. That's becauseeslint
has to be told which testing framework will be used in theenv
property.generators/app/configs/package.json.js
has a number of changes:NODE_ENV=
to thescripts.test
property. This was to squelch some unnecessary warning messages that were being generated when tests were run described here. Please note the space after the=
sign. I admit this is a weird thing to put in yourtest
script, but the worst that will happen if you take it out is that you either have to put up with unnecessary warnings in the console, or you have to add an unnecessarytest.json
file in the./config
folder. Another option would be to actually use thetest.json
file for something.I added aAfter I read Generate Jest tests instead of Mocha? #845 more carefully, I saw that @daffl had a better solution for figuring out what testing framework is being used so I switched to that.config.tester
property. As best as i can tell, it is totally legitimate to put arbitrary properties underconfig
inpackage.json
as long as you don't conflict with pre-defined values. This StackOverflow thread recommends namespacing any custom config properties. I can see an argument for not pollutingpackage.json
with a lot of extra config values, but this just seemed cleaner and easier than checking the keys of thescripts
object to figure out which testing framework is in use.scripts
is now dependent upon user input. It defaults tomocha
but allows the user to selectjest
instead.I believe there are zero changes that would impact any other parts of either the CLI or the generated feathers apps. FYI, it won't hurt my feelings if this PR gets modified or rejected. This is my first time contributing to the project, and I tried as best to follow the style of the code upon which my edits were based. I think feathers is a pretty fantastic contribution to the world, and I'm happy to see all the organizational work that went into it this summer. Here's my small contribution to keeping it vital!
Other Information
I added new tests based on the existing ones. All tests pass. The new ones are in
test/generators.using.jest.test.js
. I set them up to automaticallyskip()
the tests forcassandra
andrethinkdb
since they both fail if the developer doesn't have those database servers installed and running on their systems. (BTW, I'd never heard of either server, so figuring out why my code wouldn't pass those tests got me hung up for quite a bit of time 😎).Note on change to
package.json
In order to get the CI tests to pass, I had to update the
test
script inpackage.json
to use the flag--no-timeouts
instead of--timeout 30000
. On my local system, it would also work with the timeout value set to 60000, but at that point, it seemed just easier to remove the timeouts altogether.In any case, running the full lerna test suite on Node 6 OVER 40 MINUTES!! I'd like to suggest that we need to figure out a way to expedite these tests. That is too long to have to wait to find out whether there's a bug in your code or not. When running locally, I could use
--grep [pattern]
to restrict my tests to just the ones I cared about, but that doesn't work with Travis.