
Fix duplicate rules #61

Merged: 2 commits merged into master on Nov 21, 2015
Conversation

joshwnj (Member) commented Nov 20, 2015

In certain cases we will get duplicates, and this causes issues with composes.

@joeybaker and @loklaan, can you please take a look here, as this affects some code in your recent pull requests?

I've also added a new test case, compose-from-shared, which previously failed but passes with this PR.
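For context, a compose-from-shared case might look something like the sketch below (these file names are hypothetical, not the actual fixture files): two modules both compose a class from the same shared file, and without deduplication the shared file's rules end up in the bundled output once per importer.

/* shared.css */
.base { color: red; }

/* a.css */
.foo { composes: base from "./shared.css"; }

/* b.css */
.bar { composes: base from "./shared.css"; }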

Object.keys(sources).forEach(function (key) {
  var hash = stringHash(sources[key]);
  if (foundHashes[hash]) {
    delete sources[key];
  } else {
    foundHashes[hash] = true;
  }
});

joeybaker (Contributor) commented on the diff:

Minor suggestion: This is silly, but using delete forces V8 to use un-optimized objects, resulting in a speed loss. It's a minor thing that we might not even notice, but an alternative way to write this might be:

function dedupSources (loader) {
  var dedupedSources = {}
  loader.sources = Object.keys(loader.sources).reduce(function (sources, key) {
    // reuses the same stringHash as above
    var hash = stringHash(loader.sources[key])
    if (dedupedSources[hash]) {
      // content already seen: skip this source
      return sources
    }
    else {
      dedupedSources[hash] = true
      sources[key] = loader.sources[key]
      return sources
    }
  }, {})
}

It's a small optimization, but might matter in large code bases.

joeybaker (Contributor) commented

@joshwnj one very minor suggestion, but seems like a great idea to me!

joshwnj (Member, Author) commented Nov 21, 2015

Hi @joeybaker, that's a good suggestion, I wasn't aware of the performance issue there, TIL :)

I'd like to tackle performance as a separate PR and look at the whole picture rather than piecemeal. I'll make a new issue for that; it'd be great to get your insight there on how we can make it faster.

joshwnj added a commit that referenced this pull request on Nov 21, 2015
joshwnj merged commit 69e79d4 into master on Nov 21, 2015
joshwnj deleted the fix-duplicate-rules branch on November 21, 2015 at 03:54
joshwnj (Member, Author) commented Nov 21, 2015

Published v0.15.0
