Moving to a modular architecture #4776
Comments
One advantage is that this would enforce a modular architecture for the ongoing development of three.js. The common style in node/browserify has each file declare its dependencies at the top, and considers global variables an anti-pattern. Here is an example snippet:

```js
// src/geometry/BoxGeometry.js
var Geometry = require('./Geometry.js');
var Vector3 = require('../core/Vector3.js');

module.exports = BoxGeometry;

function BoxGeometry() {
  // ...
}

// Object.create gives BoxGeometry its own prototype object
// instead of aliasing (and mutating) Geometry's prototype
BoxGeometry.prototype = Object.create(Geometry.prototype);
```

Another advantage is that consumers of [...]. This could be done in a way that imposes no breaking changes. At the top level, we would export all classes we consider to be part of the standard package:

```js
// src/three.js
var THREE = { rev: 101 };

module.exports = THREE;

THREE.Geometry = require('./geometry/Geometry.js');
THREE.BoxGeometry = require('./geometry/BoxGeometry.js');
// ...
```

Note: I'm not exactly requiring the dependencies at the top in this example, because this file would almost exclusively be require statements. Finally, we would wrap that in a Universal Module Definition that detects if a module system (node/browserify, AMD) is in use, and if so exports to it, or otherwise appends to the global object. Let's review:
|
This would require replacing the build system, but the new one would be pretty straightforward. |
Some other advantages:
|
@shi-314 I guess I'm a little confused, I feel like |
One practice that three.js uses that makes it tricky to use in CommonJS environments is its `instanceof` checks. This is because in an application you often end up with different versions of the same library in your source tree, so checking `instanceof` doesn't work between different versions of the same library. In preparation for a move to a CommonJS module system, it would be good to replace those `instanceof` checks with feature-checking behind a [...] |
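As a sketch of the feature-checking idea: instead of `instanceof`, classes set a flag that any copy of the library can test for. (The `isVector3` name here is illustrative of the pattern, not a reference to the code as it stood at the time.)

```javascript
// A class marks itself with a prototype flag...
function Vector3(x, y, z) { this.x = x; this.y = y; this.z = z; }
Vector3.prototype.isVector3 = true;

// ...and consumers check the flag instead of the constructor identity,
// so objects created by a *different copy* of the library still pass.
function isVector3(v) {
  return !!v && v.isVector3 === true;
}

console.log(isVector3(new Vector3(1, 2, 3))); // true
console.log(isVector3({ x: 1, y: 2 }));       // false
```

Unlike `instanceof`, this works even when two bundled copies of the library each have their own `Vector3` constructor.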
@kumavis I am talking about things built on three.js. Let's say you want to create your own material with your own shaders etc. At the moment you need to extend the global THREE object to stay consistent with the rest of the three.js code:

```js
THREE.MeshMyCoolMaterial = function (...) { ... }
```

But if we had Browserify, then you could do:

```js
var MeshLambertMaterial = require('./../MeshLambertMaterial');

var MeshMyCoolMaterial = function (...) { ... }
```

So your namespace stays consistent and you don't need to use [...] And with [...] |
This is a simple way to make THREE support browserify at a global level. There is ongoing discussion on reworking the build system to directly take advantage of browserify for builds, but I don't think there is consensus on that yet. See: mrdoob#4776
@shi-314 thank you, that is more clear. That does impact my proposed general solution for deserializing consumer-defined classes:

```js
// given that `data` is a hash of a serialized object
var ObjectClass = THREE[ data.type ];
var object = ObjectClass.fromJSON( data );
```

This is from my proposed serialization / deserialization refactor |
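For illustration, the lookup-based deserialization proposed above could work as follows. The `fromJSON` factory and its shape are hypothetical stand-ins for the proposal, not an existing API:

```javascript
// Minimal registry: constructors live on the THREE namespace object,
// keyed by the same name that serialization writes into `data.type`.
var THREE = {};

THREE.BoxGeometry = function () { this.type = 'BoxGeometry'; };
THREE.BoxGeometry.fromJSON = function (data) {
  var geometry = new THREE.BoxGeometry();
  // ...copy serialized fields from `data` onto `geometry`...
  return geometry;
};

// Deserialization resolves the constructor by name, then delegates:
var data = { type: 'BoxGeometry' };
var ObjectClass = THREE[data.type];
var object = ObjectClass.fromJSON(data);

console.log(object instanceof THREE.BoxGeometry); // true
```

The concern in the thread is that this lookup only works for classes attached to the shared `THREE` object, which is exactly what per-file modules would change.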
Performance shouldn't be affected by a change like this. |
This is a pretty huge change but I'm also in favour of it. Some other major advantages:
To @mrdoob and the other authors; if you don't have much experience with NPM/Browserify I would suggest making a couple little projects with it and getting a feel for its "philosophy." It's very different from ThreeJS architecture; rather than big frameworks it encourages lots of small things. |
Another advantage of this approach is that there can be an ecosystem of open source, third party Three.JS modules, especially shaders, geometries, model loaders etc., published through NPM or Github/Component, which people can then easily just reference and use. At the moment stuff is shared by hosting a demo which people then 'view source' on. Three.JS deserves better! I think one of the problems I have with Three.JS is how quickly code becomes incompatible with the current version of Three.JS. Another advantage of switching to something like this is that being able to specify specific versions of bits of Three.JS would be very powerful and handy. +1 |
+1 for a CommonJS/browserify architecture, it would make the core more lightweight and extensions would fit even if they come from third-parties |
Fragmenting three.js into little modules has a lot of costs as well. The current system allows pretty simple third party addons (witness eg. jetienne's THREEx modules). There's a lot to be said about the simplicity of the current setup, as long as the JS module systems are just wrappers around build systems. Another way of minimizing build size is what ClojureScript does. They follow some conventions to allow Google's Closure compiler to do whole-program analysis and dead code elimination. |
+1 for the unappreciated, and often overlooked, elegance of simplicity |
+1 |
The idea here is that a UMD build would still be provided for non-Node environments. Plugins like THREEx would work the same way for those depending on ThreeJS with simple `<script>` tags. The tricky thing will be: how do we [...]
ThreeJS's current plugin/extension system is pretty awful to work with, and far from "simple" or easy. Most ThreeJS projects tend to use some form of plugin or extension, like EffectComposer, or FirstPersonControls, or a model loader, or one of the many other JS files floating around in the [...]
Now, imagine, with browserify you could do something like this:

```js
var FirstPersonControls = require('threejs-controls').FirstPersonControls;

// more granular, only requiring necessary files
var FirstPersonControls = require('threejs-controls/lib/FirstPersonControls');
```

Those plugins will [...] |
I've been using CommonJS for THREE.js projects for a bit now. It's a bit of a manual process, converting chunks of other people's code into modules, and I don't think there'll be an easy way to avoid that for legacy code that isn't converted by the authors or contributors. The important bit is that there's a module exporting the entire 'standard' THREE object, which can then be required by anything that wishes to extend it:

```js
var THREE = require('three');

THREE.EffectComposer = // ... etc, remembering to include copyright notices :)
```

This has worked pretty well for me, especially as the project grows and I start adding my own shaders and geometries into their own modules etc. As long as there's a 'threejs-full' or 'threejs-classic' npm package, this becomes a pretty viable way of working with old Three.js stuff in a CommonJS environment, but I suspect this is pretty niche! |
+1
|
It could also make the shaders modular as well, e.g. using glslify. Even things like making an Express middleware that generates shaders on demand become easier then. |
Some months ago I moved frame.js to require.js and I finally understood how this AMD stuff works. I still need to learn, however, how to "compile" this. What's the tool/workflow for generating a [...] |
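For reference, a minimal browserify build workflow of that era looked something like this (a sketch assuming a CommonJS entry point at `src/three.js`; the `--standalone` flag emits a UMD wrapper exposing the given global name):

```shell
# install the tools (uglify-js for the minified build)
npm install -g browserify uglify-js

# bundle the CommonJS entry point into a UMD build exposing `THREE`
browserify src/three.js --standalone THREE > build/three.js

# minify the bundle
uglifyjs build/three.js -c -m > build/three.min.js
```

The same steps are commonly wired into an npm script or a grunt/gulp task rather than run by hand.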
I prefer gulp.js as a build system with the gulp-browserify plugin. It's really easy to understand and the code looks cleaner than grunt in my opinion. Check this out: http://travismaynard.com/writing/no-need-to-grunt-take-a-gulp-of-fresh-air 😉 |
some thoughts: (based on my limited experience with node, npm, browserify of course)
that said, following the discussion on this thread, i'm not sure if everyone had the same understanding of browserify (browserify, commonjs, requirejs, amd, umd are somewhat related although they may not necessary be the same thing). now if you may follow my chain of thoughts a little.
That's where Browserify comes into the picture. Well, technically one can use RequireJS in the browser, but you wish to bundle js files together without making too many network calls (unlike file-system require()s, which are fast). That's where Browserify does some cool stuff, like static analysis to see which modules need to be imported, and creates builds which are more optimized for your application. (There are limitations of course; it probably can't parse require('bla' + variable).) It can even swap out parts which require an emulation layer for node.js-dependent stuff. Yeah, it generates a js build which I can now include in my browser. Here is some of the stuff browserify can do: https://github.com/substack/node-browserify#usage Sounds like everything's great so far... but there are a few points I thought worth considering before we move to a "browserify architecture"
So if we see this diversity and convenient module loading (mainly riding on the npm ecosystem) along with customized builds as a great thing, then it might be worth a shot having a change in paradigm, refactoring code and changing our current build system. |
@mrdoob some tools around browserify are listed here: https://github.com/substack/node-browserify/wiki/browserify-tools. Regarding the [...] I am currently wrapping three.js with [...] and [...] then I wrap extensions with [...] so I can require them like that: [...] So this is not optimal, but I personally can live with it and think it's not so bad - though @kumavis' proposal would be a huge advantage. But maybe it would make sense to fork three and put all the things in separate modules just to see how it would work out. Also check out http://modules.gl/ which is heavily based on browserify (though you can use every module on its own without browserify). |
@mrdoob @shi-314 gulp-browserify has been blacklisted in favour of just using browserify directly (i.e. via vinyl-source-stream). Tools like grunt/gulp/etc are constantly in flux, and you'll find lots of differing opinions. In the end it doesn't matter which you choose, or whether you just do it with a custom script. The more important questions are: how will users consume ThreeJS, and how much backward-compatibility do you want to maintain? After some more thought, I think it will be really hard to modularize everything without completely refactoring the framework and its architecture. Here are some problems:
My suggestion? We consider two things moving forward:
I'd love to see everything modularized, but I'm not sure of an approach that's realistic for ThreeJS. Maybe somebody should run some experiments in a fork to see how feasible things are. |
Thanks for the explanations guys! What I fear is complicating things to people that are just starting. Forcing them to learn this browserify/modules stuff may not be a good idea... |
Would have to agree with @mrdoob here. I, and a lot of colleagues, are not web programmers (rather VFX/animation TDs). Picking up WebGL and Three has certainly been enough work as is on top of our current workload (and in some cases some of us had to learn js on the spot). Much of what I have read in this thread, at times, makes me shudder thinking about how much more work would be added to my plate if Three moved to this structure. I could be wrong but that is certainly how it reads to me. |
With a precompiled UMD ([...]) |
+1 for moving to a modular architecture. |
+1 |
+1 |
@drcmda is right. ES6 modules have an initialization step and an execution step, which allows circular references. However, as soon as you have circular dependencies used directly in the module execution context (in the top-level area of a module), the first one loaded during runtime will see undefined values for its dependencies. As long as the references are only used in deferred contexts, where the runtime execution order no longer matters, circular dependencies are no problem. |
I suggest also considering webpack instead of browserify. |
@gionkunz we have circular references in the initialization step because of the pattern where there's a closure to generate scratch variables |
The beta of Webpack 2 has just been released (https://twitter.com/TheLarkInn/status/747955723003322368/photo/1), so es6 modules could also benefit from tree shaking when bundled. |
@mrdoob This is maybe my most favourite project on Github personally - I sincerely hope priorities will be re-considered.
|
Hmm, I would like to know more details about browser support. Which browsers do and which don't. For the browsers that don't, what are the workarounds and what are the performance penalties. |
Actually, browser support becomes a non-issue (perhaps even less so than it is now). The build systems take that ES6 code and transpile it to ES5 (sometimes taking up less space than the original ES5 would have). Certain kinds of transpiled things end up being large (primarily generators and async functions), but if you avoid those, you won't have that penalty. As @drcmda mentioned, the build system would still produce a monolithic output (and it would be very easy to customize exactly what is included in that output), but the individual modules could also be included in our own projects, thus only using the parts that we need. To take full advantage of [...] There are a few build systems. My vote would be webpack (which uses babel for the transpiling). With webpack, you can define custom loaders, so the chunking system you developed for shaders could be reduced to actual glsl code with a #include extension (I actually do my shaders this way, and would be happy to contribute it to the project). This gets the same benefits as your system (no code duplication), but is very simple to use. I would love to be part of the modularization project, but I do know that this will not be successful in any way without your support (and likely assistance). Many of us know how to use the library, but none of us know how it works internally to the extent that you do. |
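The #include idea can be sketched as a tiny preprocessing step that expands directives from a chunk map before the shader is compiled. (The chunk names and contents here are made up for illustration, not actual shader-chunk keys.)

```javascript
// Hypothetical map of reusable shader chunks, keyed by name.
var chunks = {
  common: 'precision highp float;'
};

// Replace each `#include <name>` directive with the named chunk's source.
// Unknown names are left untouched so errors surface at compile time.
function resolveIncludes(source) {
  return source.replace(/#include <(\w+)>/g, function (match, name) {
    return chunks.hasOwnProperty(name) ? chunks[name] : match;
  });
}

var fragment = '#include <common>\nvoid main() { gl_FragColor = vec4(1.0); }';
console.log(resolveIncludes(fragment));
```

A build-time loader doing this expansion gets the no-duplication benefit described above, while the source files stay plain GLSL.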
How large? |
Also, you didn't talk about performance penalty. Is that not an issue then? |
As far as I can see, ES6 Imports are still not supported by any browser, so this module refactor would mainly be for build systems, right? |
Don't forget about the benefits you get by using tools like rollupjs, this would automatically exclude all the exports a user doesn't use. (Which is default with JSPM) |
The babel-polyfill package, which is only necessary if you are using [...] As far as performance, it really depends on exactly which features you are [...] ES6 imports/exports are not supported by browsers, but since it goes [...] Another thing to note is final build size. Currently, things like Geometry, [...] Rollup (which eliminates unused code from a final build) is the default [...] On Thu, Jul 7, 2016 at 1:28 PM, Mr.doob notifications@github.com wrote:
|
Not sure if this is terribly helpful, but this is the discussion thread for D3 regarding the same issue: d3/d3#2220. D3 4.0 has adopted ES6 import/export to manage modules, but is still written in ES5 (d3/d3#2220 (comment)). |
Very interesting @jpweeks! So... with this import/export approach... how would stuff like [...] |
src/objects/Mesh.js:

```js
// Mesh class, stays the same as today (except the export part)
var Mesh = function ( geometry, material ) {
  // ...
};

export default Mesh;
```

src/Three.js:

```js
// Library entry point, exports all files using some bundling tech
// In a "THREE" namespace for browsers
// As `import three from 'three'` in node
import Mesh from './objects/Mesh';

export { Mesh }; // All three objects, such as Geometry, Material etc.
```

Application.js:

```js
// In node
import { Mesh } from 'three';

var mesh = new Mesh(geo, mat);
console.log(mesh instanceof Mesh); // true
```

Client.js:

```js
// In a browser
var mesh = new THREE.Mesh(geo, mat);
console.log(mesh instanceof THREE.Mesh); // true
```
|
That's super helpful @GGAlanSmithee! Thanks! I'm a visual person so pseudo-code examples convince me more than big chunks of text 😅 Right, so it will require a bit of refactoring... Does anyone know if closure compiler is planning on supporting this? |
I got you! Since this thread got lively over the last couple of days, I've been working a bit more on three-jsnext. It's a project that takes the existing Three.js codebase and turns it into ES modules automatically. Just wrangling with a couple of tricky cyclical dependencies (particularly around [...]) |
Ok, I've opened a pull request for this: #9310 |
@mrdoob You would ship this on npm as is, all modules included + a compiled global-namespace browser monolith (three.js). Whoever needs to use single parts of it uses tools to create bundles. Consider a structure like this: [...]

index.js simply re-exports all modules and functions in one file:

```js
export ClassA from './src/classA';
export ClassB from './src/classB';
export ClassC from './src/classC';
```

So the end-user can npm install the lib and just use it without any further ado:

```js
// all exports from index.js will be under: mylib.ClassA, etc.
import * as mylib from 'libname';

// selected exports from index.js
import { ClassA, ClassC } from 'libname';

// or, specific modules
import ClassB from 'libname/src/classB';
```

browser.js would be the only compiled part of the package, usually transpiled to ES5 via Babel and exported into a global namespace so it can be used as a script include. Rollup, Webpack, etc. can create this with ease. |
@mrdoob its been a wonderful ride 🚀 |
Browserify
Moving to this architecture has advantages and disadvantages. Please add your thoughts.
Note: this does not require three.js consumers to use browserify.