
@feathers/cli: introduce option to choose jest for tests instead of mocha #1057

Merged
merged 17 commits into from
Oct 24, 2018

Conversation

morphatic
Contributor

@morphatic morphatic commented Oct 15, 2018

Summary

Addresses #845. (Also mentioned in #1049)

This PR affects the @feathers/cli package (more specifically, the generators in generator-feathers). In particular, the three generators affected are:

  1. feathers generate app
    Gives you the option to choose jest instead of mocha for testing.
  2. feathers generate service
    The auto-generated test uses whatever testing framework was selected when the app was generated.
  3. feathers generate hook
    Same as for services.

Usage should be obvious for app creation and transparent for service/hook creation. In addition to small changes in the syntax of the tests themselves (e.g. before() becomes beforeAll(), and asynchronous tests require expect.assertions([some number])), it was necessary to modify two other files:

  1. .eslintrc.json had to be renamed eslintrc.json.js and moved from the generators/app/templates/static/ folder to the generators/app/configs folder. That's because eslint has to be told which testing framework will be used in the env property.
  2. generators/app/configs/package.json.js has a number of changes:
    1. I added NODE_ENV= to the scripts.test property. This squelches some unnecessary warning messages that were generated when tests were run, as described here. Please note the space after the = sign. I admit this is a weird thing to put in your test script, but the worst that happens if you take it out is that you either have to put up with unnecessary warnings in the console, or you have to add an unnecessary test.json file in the ./config folder. Another option would be to actually use the test.json file for something.
    2. I added a config.tester property. As best as I can tell, it is totally legitimate to put arbitrary properties under config in package.json as long as you don't conflict with pre-defined values. This StackOverflow thread recommends namespacing any custom config properties. I can see an argument for not polluting package.json with a lot of extra config values, but this just seemed cleaner and easier than checking the keys of the scripts object to figure out which testing framework is in use. After I read Generate Jest tests instead of Mocha? #845 more carefully, I saw that @daffl had a better solution for figuring out which testing framework is being used, so I switched to that.
    3. The name of the testing property that gets added to the scripts is now dependent upon user input. It defaults to mocha but allows the user to select jest instead.

I believe there are zero changes that would impact any other parts of either the CLI or the generated feathers apps. FYI, it won't hurt my feelings if this PR gets modified or rejected. This is my first time contributing to the project, and I tried my best to follow the style of the code upon which my edits were based. I think feathers is a pretty fantastic contribution to the world, and I'm happy to see all the organizational work that went into it this summer. Here's my small contribution to keeping it vital!

Other Information

I added new tests based on the existing ones. All tests pass. The new ones are in test/generators.using.jest.test.js. I set them up to automatically skip() the tests for cassandra and rethinkdb since they both fail if the developer doesn't have those database servers installed and running on their systems. (BTW, I'd never heard of either server, so figuring out why my code wouldn't pass those tests got me hung up for quite a bit of time 😎).

Note on change to package.json

In order to get the CI tests to pass, I had to update the test script in package.json to use the flag --no-timeouts instead of --timeout 30000. On my local system, it would also work with the timeout value set to 60000, but at that point it seemed easier just to remove the timeouts altogether.
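As a sketch, the changed script in the root package.json might look like this; the mocha invocation around the flag is assumed, and only the --no-timeouts flag itself comes from this PR:

```json
{
  "scripts": {
    "test": "mocha --no-timeouts test/"
  }
}
```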

In any case, running the full lerna test suite on Node 6 took OVER 40 MINUTES!! I'd like to suggest that we figure out a way to expedite these tests; that is too long to wait to find out whether there's a bug in your code. When running locally, I could use --grep [pattern] to restrict my tests to just the ones I cared about, but that doesn't work with Travis.

@morphatic
Contributor Author

Small gripe about codeclimate. The configuration flagged the following two bits of code as "too similar" and in need of refactoring:

// from /packages/generator-feathers/generators/app/index.js
module.exports = class AppGenerator extends Generator {
  // ...
  prompting() {
    // ...
    const prompts = [{
      // ...
    }, {
      name: 'packager',
      type: 'list',
      message: 'Which package manager are you using (has to be installed globally)?',
      default: 'npm@>= 3.0.0',
      choices: [
        { name: 'npm',  value: 'npm@>= 3.0.0'   },
        { name: 'Yarn', value: 'yarn@>= 0.18.0' }
      ]
    }, {
      // ...
    }, { // THIS block is apparently "too similar" to the 'packager' block above
      name: 'tester',
      type: 'list',
      message: 'Which testing framework do you prefer?',
      default: 'mocha',
      choices: [
        { name: 'Mocha + assert', value: 'mocha' },
        { name: 'Jest',           value: 'jest'  }
      ]
    }];
    // ...
  }
}

It prevented my PR checks from passing until I figured out a way to get codeclimate not to see them as similar. In fact, the similarity between these two blocks makes the code easier to read and maintain. In order to get codeclimate to leave me alone (notice that 4 out of the last 5 commits in this PR are tweaks to try to please it), I had to add an unnecessary property to the second block. That meant digging into the Yeoman Generator code to discover that it uses Inquirer.js to generate its questions, and then into Inquirer's documentation (good, but not excellent) to discover that I could just add a pageSize: 7 property to the question object. That property does essentially nothing to the functionality, but it fools codeclimate into thinking the two blocks are no longer "too similar".

I totally get the purpose of codeclimate, but in this case, it seems to have been a hindrance to using best practices rather than a help. 😎

Member

@daffl daffl left a comment


This is great, thank you! Just two small questions and then we can get it out.

@@ -20,51 +20,26 @@ module.exports = class ServiceGenerator extends Generator {
message: 'What kind of service is it?',
Member


Are these formatting changes in this file from your linter?

Contributor Author


No. The CodeClimate checks originally failed because "Function prompting has 91 lines of code (exceeds 80 allowed)." I actually thought the formatting was better as it was. I was looking for ways to reduce the number of lines without changing behavior. I would be happy to change it back to the way it was.

});
};

it('generic', () => testServiceGenerator('generic'));
Member


We can probably remove the other database tests here and assume everything will work fine, to make the tests run a little quicker.

Contributor Author


Cool. Skipping all but the generic test reduced overall testing time on Travis from over 40 minutes down to about 15 minutes. Not sure why the latest build failed, but the offending code was in the authentication package, not the generators. Also, you may already know this, but apparently you can tell lerna to only run the tests for the packages that were most recently updated (official docs). I had never heard of lerna before I got into this project. It's pretty cool!

Member


I did not, that looks really interesting.

@daffl
Member

daffl commented Oct 19, 2018

Ah yes, sorry, it's a flaky test that will be fixed soon. I think some of the Codeclimate settings got screwed up when moving to the monorepo; I'll have to revisit those as well. Will merge and release this later this weekend. Thanks again for doing this, great work!

@daffl daffl merged commit 1356a1c into feathersjs:master Oct 24, 2018