This repository has been archived by the owner on May 11, 2018. It is now read-only.

useBuiltIns: figure out how to remove polyfills that aren't actually used in the codebase #84

Closed
hzoo opened this issue Dec 12, 2016 · 50 comments

Comments

@hzoo
Member

hzoo commented Dec 12, 2016

Goal: only include necessary polyfills

Don't include if supported environments support it natively #20

import "babel-polyfill";

gets transformed to

import "core-js/modules/es6.map";
import "regenerator-runtime/runtime";
...
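The selection step behind this expansion can be sketched as a filter over a feature-support table. A minimal sketch, assuming made-up support data and a hypothetical `neededPolyfills` helper (real version numbers would come from compat-table):

```javascript
// Hypothetical sketch: keep only the core-js modules that at least one
// targeted browser lacks. The support data below is illustrative.
const polyfillSupport = {
  "es6.map": { chrome: 51, safari: 10 },
  "es6.promise": { chrome: 51, safari: 10 },
  "es7.array.includes": { chrome: 47, safari: 9 },
};

function neededPolyfills(targets) {
  return Object.keys(polyfillSupport).filter((name) => {
    const support = polyfillSupport[name];
    // Keep the polyfill if ANY targeted browser predates the first
    // version that ships the feature natively (or is unknown).
    return Object.keys(targets).some(
      (browser) =>
        support[browser] === undefined || targets[browser] < support[browser]
    );
  });
}
```

For example, targeting only `chrome: 50` keeps `es6.map` and `es6.promise` but drops `es7.array.includes`, which Chrome 47 already supports.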

Don't include if you have blacklisted it yourself (regenerator for example) #60

  • Be able to blacklist transforms or specific builtIns?
{
  "presets": [
    ["env", {
      "targets": {
        "browsers": ["last 2 versions", "safari >= 7"]
      },
      "blacklist": [
        "transform-regenerator" // won't include "regenerator-runtime/runtime"
      ]
    }]
  ]
}

Don't include if the codebase doesn't actually use the builtIn (this)

this would need to be a whole codebase transformation so it probably won't work in this preset/plugin

The problem with the first heading ^ is that even if you remove the polyfills supported natively in the environment, you might not actually use them in your codebase anyway.

Possible solution: run an analyzer (separate babel pass like transform-runtime) on the whole codebase and remove imports that aren't used anywhere.

EDIT: A big problem atm is that Babel runs per file. So unless we understand the whole codebase it sounds like something webpack could do? cc @TheLarkInn @keyanzhang

@TheLarkInn

So here is an idea (at least in terms of webpack). //cc @patrickkettner of Modernizr, for whom this could be highly relevant.

On demand async polyfill chunks

Background: ContextModules with webpack

Since webpack is a static build tool, there are no truly dynamic requires. However, there is a static solution to this dynamic nature.

Given the following builtins are statically located somewhere:

--/builtins
  --/somePoly1.js
  --/somePoly2.js
  --/somePoly3.js

One can pass an expression to a dependency statement, specifically System.import() or import() (for async chunking).

function asyncLoadPoly(builtInUniqueName) {
  // Passing an expression to import() makes webpack build a ContextModule
  // covering every file that could match './builtins/<name>'.
  return import('./builtins/' + builtInUniqueName)
}

webpack will create a separate chunk for each possible module that could be loaded from that folder.

This will then return a Promise and one can load that specific poly at runtime:

if (someLogicAboutSpecificJSFeature) {
  asyncLoadPoly('somePoly1.js').then(module => {
    return module.default;
  });
}

Idea

Use this pattern to "dynamically" require polyfills: a series of fast logic checks async-loads the polyfill chunks as they are needed throughout the web application, using the above technique.

Challenges

  • Slow builds: If you are using a crap ton of polys, then you are generating a huge context module with many async chunks. This could mean slower perceived build times.

Wins

  • Performance (in theory?)
    • you are technically not building as much up front
  • Don't have to create builds specifically for each browser

Really, this is just one idea we could leverage; it could be powerful if we determine the right logic cases up front and know what users are targeting.

@EnoahNetzach

@TheLarkInn on-demand chunks mean that the build step is faster, but at runtime loading has to block, so the client waits longer.

@hzoo how can you analyze for method usages (e.g. foo[bar()].includes())?

@TheLarkInn

Well, creating context modules may be slower, but I guess the blocking part really depends on where the polyfill is needed and for which parts of your code.

@EnoahNetzach

So the problem could be reduced to usage detection, then replacing the single top-of-file import of babel-polyfill with multiple imports where they are actually used?

I guess this would be possible with ponyfills.

Another solution could be to make the polyfills lazy chunk loaders, so that the first time one is used at runtime it fetches the actual polyfill and replaces itself.
It sounds convoluted, though.

@EnoahNetzach

EnoahNetzach commented Dec 12, 2016

Something like

featuresToPolyfill.forEach(({ object, feature }) => {
  // Install a stub that, on first call, replaces itself with the real
  // polyfill (syncFetchPolyfill is hypothetical) and then delegates to it.
  object.prototype[feature] = (...args) => {
    object.prototype[feature] = syncFetchPolyfill(object, feature)
    return object.prototype[feature](...args)
  }
})

@hzoo
Member Author

hzoo commented Dec 12, 2016

@EnoahNetzach So for prototype methods we assume any usage of it would count as the built-in.

arr.includes is using Array.prototype.includes but we don't know what arr is without type information (inferred or annotated). We just check the .includes part. So it won't have false negatives but it may have false positives?
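That conservative heuristic can be sketched as a lookup from a member expression's property name to every polyfill that might define it. A minimal sketch with an illustrative mapping and hypothetical helper names (real data would come from core-js's definitions):

```javascript
// Sketch of the heuristic: without type information, any `.includes`
// access counts as a use of EVERY prototype method by that name, so we
// may over-include (false positives) but never miss (no false negatives).
const methodToPolyfills = {
  includes: ["es7.array.includes", "es7.string.includes"],
  find: ["es6.array.find"],
};

// `propertyNames` stands in for the property names collected from
// MemberExpression nodes during an AST walk.
function polyfillsFor(propertyNames) {
  const needed = new Set();
  for (const name of propertyNames) {
    for (const mod of methodToPolyfills[name] || []) needed.add(mod);
  }
  return [...needed].sort();
}
```

So `arr.includes(x)` pulls in both the Array and String polyfills, because `arr` could be either.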

@EnoahNetzach

@hzoo exactly, that's the problem

@hzoo
Member Author

hzoo commented Dec 12, 2016

Is it really a problem though (it is if we want to do the best thing possible, but we're just talking about an incremental improvement here)? My point is that doing that is better than not doing the check at all. It just doesn't save as much if there are false positives.

@EnoahNetzach

Oh, I get it.
Well, in this case it's far better than always having them all; in the worst-case scenario it doesn't change anything.

But what if someone uses something like this?

function troll() { return 'includes'; }
obviousArray[troll()]();

@hzoo
Member Author

hzoo commented Dec 12, 2016

Exactly! Better than the worst case.

So that's different lol. It's just a straightforward AST check and we aren't going to support something like that - just add a disclaimer since it's not normal to do that.

Or more that people that are willing to use this will want to make sure they aren't doing anything like that.

@EnoahNetzach

Or maybe leave the option to forcefully import something, if known:

import 'babel-polyfill/a'
import 'babel-polyfill/b'

x[f()]()

where the programmer knows that f() could return either a or b.

@DrewML
Member

DrewML commented Dec 12, 2016

we aren't going to support something like that - just add a disclaimer since it's not normal to do that.

We do a similar thing (add a disclaimer to the docs) for the Object.assign transform

@hzoo
Member Author

hzoo commented Dec 12, 2016

We can include polyfills in the whitelist/blacklist options if necessary

@zloirock
Member

@TheLarkInn unrealistically.

@zloirock
Member

@hzoo as I wrote, the main problem is that we should also parse all used dependencies and know all required features at the moment of transformation.

@hzoo
Member Author

hzoo commented Dec 12, 2016

Yeah, I don't think we have a standalone way to do that yet, unless we do something like run the preset as 2 passes: write the used polyfills into a /tmp directory in the first pass, then read it in the 2nd pass (and pass it to useBuiltIns)

@mikesherov

mikesherov commented Dec 13, 2016

Would it be crazy to consider building out UA-specific chunks, and having a special loader that does a UA sniff to decide which polyfills to load? Kind of like a local polyfill.io?

@hzoo
Member Author

hzoo commented Dec 13, 2016

I don't think so - it seems like a lot of folks are interested?

@patrickkettner

to be clear, I think UA-specific builds are dangerous. Feature-specific builds (i.e. targeting es3, es5, etc.) are great, but saying here is a Chrome build, here is an IE build, etc. is a footgun waiting to be fired

@mikesherov

@patrickkettner not for IE. It's a terminal browser that is no longer being updated, with a static set of features.

@hzoo
Member Author

hzoo commented Dec 13, 2016

I think the main win is targeting non-evergreen browsers (IE 11, Safari 9)? Otherwise many would be supporting something like the last 2 versions, which we can target via browserslist

@Jessidhia
Member

Jessidhia commented Dec 13, 2016

Runtime dynamic loading of polyfills should never be done automatically.

The best case scenario would be to have a generated "polyfill-loader.js" file with an async function that does feature tests and dynamic imports for the ones that fail; but that would have to be explicitly called by the user, at runtime, with the knowledge that it will return a Promise. Polyfilling Promise in this case is an exercise for the reader 😉
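A minimal sketch of what such a generated loader could look like; `importFn` is injected here only so the sketch stays runnable outside a bundler (a real generated file would call import() directly), and the module names are illustrative:

```javascript
// Sketch of a generated "polyfill-loader.js": feature tests pick the
// modules to load, and the caller must await the returned Promise
// before running any other application code.
function detectMissingBuiltins() {
  const missing = [];
  if (typeof Promise === "undefined") missing.push("core-js/modules/es6.promise");
  if (typeof Array.prototype.includes !== "function") missing.push("core-js/modules/es7.array.includes");
  if (typeof Object.assign !== "function") missing.push("core-js/modules/es6.object.assign");
  return missing;
}

function loadPolyfills(importFn) {
  // One async chunk per failed feature test; resolves once all are installed.
  return Promise.all(detectMissingBuiltins().map(importFn));
}
```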

It is also not safe to execute any javascript until that Promise resolves. Any javascript that runs before all polyfills are loaded may be able to capture non-polyfilled objects, crash or misbehave due to lack of polyfill, do faulty feature checking, or even install / use their own shim or polyfill in response.

If your main JS is critical for rendering, you will also get a flash of unrendered content until the polyfills finish loading and it's safe to run JS. At least sync JS doesn't have the potential to show a broken page until it finishes loading (depending on where you place the script tag).

@EnoahNetzach

After all, I think Babel should just be able to eliminate dead imports, probably via a simple AST transform per file; then bundlers get the honor of deciding what's really needed, and when.

It's not a transpiler's job to figure out the shape of what the client needs (e.g. chunks), but to give the tools necessary for that decision to instruments further down the line; otherwise I see a duplication of concerns (and compatibility issues).

@hzoo
Member Author

hzoo commented Dec 13, 2016

be able to eliminate dead imports, probably by a simple ast transform per file

Not sure how that would work? Currently we would import the polyfill once, and the issue was that you might not use that method/polyfill anywhere in the whole app (so you can't do a per-file check)

@aickin

aickin commented Dec 13, 2016

Hey there, Babel folks!

I've been working on something that basically does this kind of static analysis, and I just made the code public since you all are discussing solving this problem. It's called babel-plugin-which-builtins, and it searches the AST for references to new ECMAScript globals, static methods, and instance methods and adds an import for the correct polyfill for each.

I think it could be used on individual modules pre-bundling, as the bundler should deal effectively with multiple files that all import the same polyfill.

Unfortunately, my plugin probably doesn't work well with this project's current strategy of expanding import 'babel-polyfill' to a bunch of import declarations that are needed by the env. It would, however, work with a (hypothetical) plugin that did the opposite: remove the polyfill imports that aren't needed by the preset. In this theoretical setup, babel-plugin-which-builtins would add the polyfill imports needed by the code, and then babel-preset-env would remove the imports not needed by the current env. Does that make sense?

It's still very new (just been working on it a week or two), but it's got some tests and I think covers many (most?) basic cases. One case I know that is probably worth covering that is not currently covered is destructuring as property access, like: <example snipped>

If y'all want, I'd be happy to offer it under a compatible license. Please let me know what you think, and thanks for all your great work!

@EnoahNetzach

@hzoo the transformer could remove the "global" babel-polyfill import and check for method usages that have to be polyfilled, then add the appropriate polyfill import in that module.

The imported module would just have to pollute the global scope with polyfills.

@aickin

aickin commented Dec 14, 2016

Sorry for the double post, but I've been thinking about this problem for the last day or so, and I think I don't understand something that was said above:

this would need to be a whole codebase transformation so it probably won't work in this preset/plugin
A big problem atm is that Babel runs per file. So unless we understand the whole codebase it sounds like something webpack could do?

I'm probably missing something, but it's not clear to me why this is the case. I think that the output of this preset should be adding the "exactly correct" set of imports, which is the intersection of (builtins used by the code) ∩ (builtins NOT supported natively by env).

It seems to me that as long as the preset adds this exact set of imports, it could be applied at either the file level or at the codebase level. Am I missing something here?

If I'm right, then the preset needs to do a subtraction step to make the intersection. This could be accomplished a few ways:

  1. Use babel-plugin-which-builtins, which adds imports for the set of (builtins used by the code), then write a plugin that removes any builtin import supported natively by the env.
  2. Use the current babel-preset-env with useBuiltins, which adds imports for the set of (builtins NOT supported natively by env), then write a plugin that removes any builtin import NOT used by the code.
  3. Combine the logic of these two plugins so that the intersection is done before the imports are added to the code.
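The subtraction step in option 3 can be sketched as a plain set intersection over core-js module names (function and argument names here are hypothetical):

```javascript
// Sketch of option 3: compute (builtins used by the code) ∩
// (builtins NOT supported natively by the env) before emitting imports.
// Inputs are plain arrays of core-js module names.
function polyfillImportsToAdd(usedByCode, notSupportedByEnv) {
  const unsupported = new Set(notSupportedByEnv);
  return usedByCode.filter((name) => unsupported.has(name));
}
```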

Am I on the right track here? Thanks for your consideration!

@kaisermann

I'm also looking for the same functionality as @romulof.

@romulof

romulof commented Feb 28, 2017

I ended up disabling useBuiltIns and reverting to babel-plugin-transform-runtime.
My bundle even shrunk a few kilobytes.

@kaisermann

@romulof Would you care to explain how you did it? Thanks!

@romulof

romulof commented Mar 2, 2017

I leave all my Babel config inside webpack config, and all build parameters are processed by env vars using https://github.com/af/envalid.

Here's its config object. I think it's pretty self-explanatory:

      {
        cacheDirectory: env.isDev,
        presets: [
          'react',
          ['env', {
            debug: true,
            modules: false, // Using webpack 2
            targets: {
              // browserlist var is shared with autoprefixer
              browsers: browserlist,
            },
            exclude: [
              // Leave regenerator to transform-runtime
              'transform-regenerator',
            ], 
          }],
        ],
        plugins: compact([
          'lodash',
          'transform-decorators-legacy',
          'transform-class-properties',
          'transform-export-extensions',
          'transform-object-rest-spread',
          env.isProduction && 'transform-react-remove-prop-types',
          env.isProduction && 'transform-react-constant-elements',
          env.isProduction && 'transform-react-inline-elements',
          env.isProduction && 'transform-dev-warning',
          ['transform-runtime', {
            // Don't use regenerator in dev builds. Hard to debug.
            regenerator: env.isProduction,
          }],
          env.isDev && 'react-hot-loader/babel',
          env.isDev && 'transform-react-jsx-self',
          env.isDev && 'transform-react-jsx-source',
        ]),
      }

Right now this is working for me, but might not apply to all cases.

@yavorsky
Member

yavorsky commented Mar 2, 2017

@romulof Nice! Why are you not using the env field for Babel configuration? https://babeljs.io/docs/usage/babelrc/#env-option

{
  "plugins": ["transform-decorators-legacy", "other common plugins"], // common plugins
  "env": {
    "production": {
      "plugins": ["regenerator"] // plugins for prod
    },
    "development": {
      "plugins": ["react-hot-loader/babel"] // plugins for dev
    }
  }
}

@romulof

romulof commented Mar 2, 2017

It's just to make the whole Webpack config uniform. In the end it works the same way.

PS: compact is from Lodash. It removes falsey entries (undefined/null/false, etc.) from arrays.

@kaisermann

kaisermann commented Mar 7, 2017

Ok, got it working more or less. But transform-runtime STILL does not polyfill only what is listed by babel-preset-env. I mean, it keeps, for example, polyfilling Object.assign() even if I'm targeting chrome: 52. Perhaps I understood something wrong?

My babel configuration:

{
  "presets": [
    ["env", {
      "modules": false,
      "loose": true,
      "debug": true,
      "targets": {
        "browsers": "chrome 52"
      }
    }]
  ],
  "plugins": [
    ["transform-runtime", {
      "helpers": false,
      "polyfill": true,
      "regenerator": true,
      "moduleName": "babel-runtime"
    }],
    ["external-helpers"]
  ]
}

If it is relevant in any way, I'm bundling with rollup

@romulof

romulof commented Mar 8, 2017

transform-runtime is a separate Babel plugin. It transforms ES6/7/next calls into references to babel-runtime modules. You can disable some of these features using its own configuration.

@hzoo
Member Author

hzoo commented Mar 30, 2017

Ok, I think I'm going to try what @aickin was talking about. It's actually just what transform-runtime already does, except we are now going to add the polyfill import instead of rewriting to reference babel-runtime. So we just add the used imports and then remove the ones provided by the env. Will post an example later; still a WIP.

@hzoo
Member Author

hzoo commented Mar 31, 2017

Ok I made #241

The only issue not covered in #84 (comment) is the rest of node_modules (3rd party). The workaround is a disclaimer on the option and to use includes instead. We can even warn if you include something that's natively supported.

Please check the tests and leave feedback!

@hzoo
Member Author

hzoo commented Apr 11, 2017

Merged ^, testing in v2.0.0-alpha.5

@hzoo
Member Author

hzoo commented Apr 13, 2017

Closing in favor of a new issue I made: #284

since we kinda got this working but need some more work for 3rd party modules
