
Moving to a modular architecture #4776

Closed
kumavis opened this issue May 6, 2014 · 153 comments

@kumavis
Contributor

kumavis commented May 6, 2014

Browserify
Moving to this architecture has advantages and disadvantages. Please add your thoughts.

Note: this does not require three.js consumers to use browserify.

@kumavis
Contributor Author

kumavis commented May 6, 2014

One advantage is that this would enforce a modular architecture for the ongoing development of three.js.

The common style in node/browserify has each file declare its dependencies at the top, and considers global variables an anti-pattern.

Here is an example snippet:

// src/geometry/BoxGeometry.js
var Geometry = require('./Geometry.js');
var Vector3 = require('../core/Vector3.js');
module.exports = BoxGeometry;

function BoxGeometry() {
  // ...
}

BoxGeometry.prototype = Object.create( Geometry.prototype );
BoxGeometry.prototype.constructor = BoxGeometry;

Another advantage is that consumers of three.js using browserify would be able to pick and choose the parts they want. They could just import Scene, BoxGeometry, PerspectiveCamera, and WebGLRenderer, get the dependencies for all those automatically ( Object3D etc ), and have a small bundle of javascript that supports just the feature set they want.

This could be done in a way that imposes no breaking changes. At the top level, we would export all classes we consider to be part of the standard package:

// src/three.js
var THREE = { rev: 101 }
module.exports = THREE

THREE.Geometry = require('./geometry/Geometry.js')
THREE.BoxGeometry = require('./geometry/BoxGeometry.js')
// ...

note: I'm not exactly requiring the dependencies at the top in this example, because this file would almost exclusively be require statements.

Finally we would wrap that in a Universal Module Definition that detects if a module system (node/browserify, AMD) is in use, and if so exports it, or otherwise appends it to the global object ( window ).

Let's review:

  • enforces good modular style
  • allows three.js consumers using browserify to pick and choose functionality
  • no breaking changes

@kumavis
Contributor Author

kumavis commented May 6, 2014

This would require replacing the build system, but the new one would be pretty straightforward.

@shi-314

shi-314 commented May 6, 2014

Some other advantages:

  • You can structure your code
  • You can create / reuse modules without polluting the global namespace
  • You can build for production
  • You can debug more easily: since every module has its own file, you don't need to search for the corresponding module in a large three.js file

@kumavis
Contributor Author

kumavis commented May 6, 2014

@shi-314 I guess I'm a little confused; I feel like "You can structure your code" and "You can build for production" are things that you can do without the architectural shift? Are you talking about three.js source or things built using three.js?

@ghost

ghost commented May 6, 2014

One practice that three.js uses that makes it tricky to use in commonjs environments is the use of instanceof: https://github.com/mrdoob/three.js/blob/master/src/core/Geometry.js#L82

This is because in an application you often end up with different versions of the same library in your source tree, so checking instanceof doesn't work between different versions of the same library. It would be good in preparation for a move to a commonjs module system to replace those instanceof checks with feature-checking behind a Geometry.isGeometry(geom) style interface.
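A sketch of the suggested feature-checking style (the `isGeometry` names follow the comment's wording; the flag-property approach shown here is one common way to implement it, not necessarily the library's):

```javascript
// Illustrative sketch: a flag property survives having two copies of the
// library loaded, where `geom instanceof Geometry` would not.
function Geometry() {
  this.isGeometry = true; // set once in the constructor
  this.vertices = [];
}

// Feature check instead of instanceof.
Geometry.isGeometry = function (geom) {
  return !!(geom && geom.isGeometry === true);
};

var geom = new Geometry();
console.log(Geometry.isGeometry(geom)); // true
console.log(Geometry.isGeometry({}));   // false
```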

@shi-314

shi-314 commented May 6, 2014

@kumavis I am talking about things built in three.js. Let's say you want to create your own material with your shaders etc. At the moment you need to extend the global THREE object to stay consistent with the rest of the three.js code:

THREE.MeshMyCoolMaterial = function (...) { ... }

But if we had Browserify then you could do:

var MeshLambertMaterial = require('./../MeshLambertMaterial');
var MeshMyCoolMaterial = function (...) {...}

So your namespace stays consistent and you don't need to use THREE.MeshLambertMaterial and MeshMyCoolMaterial in your code.

And with "You can build for production" I basically meant the same thing you mentioned: allows three.js consumers using browserify to pick and choose functionality.

ghemingway added a commit to ghemingway/three.js that referenced this issue May 6, 2014
This is a simple way to make THREE support browserify at a global level. There is ongoing discussion on reworking the build system to directly take advantage of browserify for builds, but I don't think there is consensus on that yet. See:
mrdoob#4776
@kumavis
Contributor Author

kumavis commented May 6, 2014

@shi-314 thank you, that is more clear. That does impact my proposed general solution to deserializing consumer-defined classes:

// given that `data` is a hash of a serialized object
var ObjectClass = THREE[ data.type ]
var object = ObjectClass.fromJSON( data )

This is from my proposed serialization / deserialization refactor
#4621
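One way to keep that lookup working once classes are no longer globals is an explicit type registry. This is only a sketch of the idea; the `registerType`/`fromJSON` names are made up, not an API from the linked proposal:

```javascript
// Hypothetical registry so deserialization works without a global namespace.
var registry = {};

function registerType(name, ctor) {
  registry[name] = ctor;
}

function fromJSON(data) {
  var ObjectClass = registry[data.type];
  if (!ObjectClass) throw new Error('unknown type: ' + data.type);
  return ObjectClass.fromJSON(data);
}

// A consumer-defined class registers itself once:
function Box() {}
Box.fromJSON = function (data) {
  var box = new Box();
  box.size = data.size;
  return box;
};
registerType('Box', Box);

var box = fromJSON({ type: 'Box', size: 2 });
console.log(box instanceof Box, box.size); // true 2
```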

@gero3
Contributor

gero3 commented May 8, 2014

Performance shouldn't be affected by a change like this.

@mattdesl
Contributor

This is a pretty huge change but I'm also in favour of it.

Some other major advantages:

  • You can use browserify's standalone option to generate a UMD build for you. No need to manually tinker with UMD wrappers.
  • The package can easily be consumed by users of browserify/NPM
  • Pulling in dependencies for threejs (like poly2tri, color-string, etc) becomes much easier
  • Modules that "don't really belong" in a rendering library (like vector/math libraries) can be pulled out as separate NPM modules and re-used for many other types of projects. One major benefit of this is that the individual modules have their own repository for bugs/issues, PRs, etc (cleaning up ThreeJS issues).
  • NPM will handle semantic versioning for us. e.g. we can push a breaking change in the threejs-vecmath without worrying about everyone's code breaking. And on the flip side, if we make a patch or minor release in a particular module, people consuming those modules will be able to get the changes automatically.
  • It makes "extras" like EffectComposer and various shaders easy to package and consume (imagine npm install threejs-shader-bloom)
  • As modules are pulled out, the final distribution size will start to get smaller and more application-specific. There will eventually be no need for different "builds types" since we will just require() the modules that our app is actually using.

To @mrdoob and the other authors; if you don't have much experience with NPM/Browserify I would suggest making a couple little projects with it and getting a feel for its "philosophy." It's very different from ThreeJS architecture; rather than big frameworks it encourages lots of small things.

@CharlotteGore
Contributor

Another advantage of this approach is that there can be an ecosystem of open source, third party Three.JS modules, especially shaders, geometries, model loaders etc. Published through NPM or Github/Component which people can then easily just reference and use. At the moment stuff is shared by hosting a demo which people then 'view source' on. Three.JS deserves better!

I think one of the problems I have with Three.JS is how quickly code becomes incompatible with the current version of Three.JS. Another advantage of switching to something like this is being able to specify specific versions of bits of Three.JS would be very powerful and handy.

+1

@cecilemuller
Contributor

+1 for a CommonJS/browserify architecture, it would make the core more lightweight and extensions would fit even if they come from third-parties

@erno

erno commented Jun 5, 2014

Fragmenting three.js into little modules has a lot of costs as well. The current system allows pretty simple third party addons (witness eg. jetienne's THREEx modules). There's a lot to be said about the simplicity of the current setup, as long as the JS module systems are just wrappers around build systems.

Another way of minimizing build size is what ClojureScript does. They follow some conventions to allow Google's Closure compiler to do whole-program analysis and dead code elimination.

@repsac
Contributor

repsac commented Jun 5, 2014

+1 for the unappreciated, and often overlooked, elegance of simplicity

@JosephClay

+1

@mattdesl
Contributor

mattdesl commented Jun 5, 2014

Fragmenting three.js into little modules has a lot of costs as well. The current system allows pretty simple third party addons (witness eg. jetienne's THREEx modules).

The idea here is that a UMD build would still be provided for non-Node environments. Plugins like THREEx would work the same way for those depending on ThreeJS with simple <script> tags.

The tricky thing will be: how do we require() a particular plugin if we are in a CommonJS environment? Maybe browserify-shim could help.

There's a lot to be said about the simplicity of the current setup, as long as the JS module systems are just wrappers around build systems.

ThreeJS's current plugin/extension system is pretty awful to work with, and far from "simple" or easy. Most ThreeJS projects tend to use some form of plugin or extension, like EffectComposer, or FirstPersonControls, or a model loader, or one of the other many JS files floating around in the examples folder. Right now the only way to depend on these plugins:

  • Download the current build of ThreeJS
  • Copy-paste the necessary files into your vendor folder
  • Wire up gulp/grunt tasks to concat and minify all the plugins you need; making sure to concat them in the correct order otherwise things will break. Manually maintain this list as you add more plugins.
  • Repeat steps 1 and 2 every time ThreeJS is updated; and then pull your hair out when you realize the new code is not backward-compatible

Now, imagine, with browserify you could do something like this:

var FirstPersonControls = require('threejs-controls').FirstPersonControls;

//more granular, only requiring necessary files
var FirstPersonControls = require('threejs-controls/lib/FirstPersonControls');

Those plugins will require('threejs') and anything else that they may need (like GLSL snippets or text triangulation). The dependency/version management is all hidden to the user, and there is no need for manually maintained grunt/gulp concat tasks.

@CharlotteGore
Contributor

The tricky thing will be: how do we require() a particular plugin if we are in a CommonJS environment?

I've been using CommonJS for THREE.js projects for a bit now. It's a bit of a manual process, converting chunks of other people's code into modules and I don't think there'll be an easy way to avoid that for legacy code that isn't converted by the authors or contributors.

The important bit is that there's a module exporting the entire 'standard' THREE object, which can then be required by anything that wishes to extend it.

var THREE = require('three');

THREE.EffectComposer = // ... etc, remembering to include copyright notices :)

This has worked pretty well for me, especially as the project grows and I start adding my own shaders and geometries into their own modules etc.

As long as there's a 'threejs-full' or 'threejs-classic' npm package then this becomes a pretty viable way of working with old Three.js stuff in a CommonJS environment but I suspect this is pretty niche!

@smrjans

smrjans commented Jun 6, 2014

+1
I believe once fragmented threejs modules are available in npm, plugin
developers will love to migrate to CommonJS env.

@cecilemuller
Contributor

It could also make the shaders modular as well, e.g. using glslify. Even things like making an Express middleware that generates shaders on demand become easier then.

@mrdoob
Owner

mrdoob commented Jun 15, 2014

Some months ago I moved frame.js to require.js and I finally understood how this AMD stuff works.

I still need to learn, however, how to "compile" this. What's the tool/workflow for generating a three.min.js out of a list of modules?

@shi-314

shi-314 commented Jun 15, 2014

I prefer gulp.js as a build system with the gulp-browserify plugin. It's really easy to understand and the code looks cleaner than grunt in my opinion. Check this out: http://travismaynard.com/writing/no-need-to-grunt-take-a-gulp-of-fresh-air 😉

@zz85
Contributor

zz85 commented Jun 16, 2014

some thoughts: (based on my limited experience with node, npm, browserify of course)

  1. i think node.js modules are great (that's npm, modules, and require()s)
  2. i think browserify is also great

that said, following the discussion on this thread, i'm not sure if everyone had the same understanding of browserify (browserify, commonjs, requirejs, amd, umd are somewhat related although they may not necessarily be the same thing).

now if you may follow my chain of thoughts a little.

  1. JS is great, it runs fast across browsers.
  2. wow, now JS runs on the server side too.
  3. that's node.js, it's cool, so let's code stuff in node.js
  4. but I don't wish to write/ can't write everything/ find something to use.
  5. no worries, now run npm install modules
  6. now require these cool modules so we can use them.
  7. works great!
  8. now wait, we have just written a whole bunch of stuff in JS that runs on node.js
  9. isn't JS supposed to run in browsers? how do we make this code run there again?

That's where Browserify comes into the picture. Technically one can use requireJS in the browser, but you want to bundle js files together without making too many network calls (unlike file-system require()s, which are fast). So Browserify does some cool stuff like static analysis to see which modules need to be imported, and creates builds which are more optimized for your application. (There are limitations, of course; it probably can't parse require('bla' + variable).) It can even swap out parts which require an emulation layer for node.js-dependent stuff. Yeah, it generates a js build which I can now include in my browser.

Here are some of the stuff browserify can do https://github.com/substack/node-browserify#usage

Sounds like everything's great so far... but there are a few points worth considering if we move to a "browserify architecture":

  • there needs to be a shift in mindset for three.js developers (the require module system has to be used of course)
  • a compatibility layer can be built so three.js users can still use three.js the old way without reaping the modular benefits
  • to be able to produce optimized builds, three.js users would need to move over to the require system
  • the new build process would likely involve the browserify toolchain (currently we could use python, node.js, or simple copy and paste etc) or some requireJS tools.
  • if we would want three.js to be truly more modular, with versioning on each components, like say TrackballControls, we would need to split them out, and that may lead to fragmentation
  • that might lead to diversity too, however one strength of three.js currently seems like it is a centralized point of many extensions

So if we see this diversity and convenient module loading (mainly riding on the npm ecosystem), along with customized builds, as a great thing, then it might be worth a shot: having a change in paradigm, refactoring code, and changing our current build system.

@guybrush
Contributor

@mrdoob some tools around browserify are listed here: https://github.com/substack/node-browserify/wiki/browserify-tools.

regarding the three.min.js, you would not use the minified code in your project. all you do is var three = require('three') in your project.js and then run browserify project.js > bundle.js && uglifyjs bundle.js > bundle.min.js. note: you still can ship minified code for <script src="min.js">.

i am currently wrapping three.js with

if ('undefined' === typeof(window))
  var window = global && global.window ? global.window : this
var self = window

and

module.exports = THREE

then i wrap extensions with

module.exports = function(THREE) { /* extension-code here */ }

so i can require it like that:

var three = require('./wrapped-three.js')
require('./three-extension')(three)

so this is not optimal, but i personally can actually live with it and think it's not so bad - though @kumavis' proposal would be a huge advantage.

but maybe it would make sense to fork three and put all the things in separate modules just to see how it would work out.

also checkout http://modules.gl/ which is heavily based on browserify (though you can use every module on its own without browserify).

@mattdesl
Contributor

@mrdoob @shi-314 gulp-browserify has been blacklisted in favour of just using browserify directly (i.e. via vinyl-source-stream).

Tools like grunt/gulp/etc are constantly in flux, and you'll find lots of differing opinions. In the end it doesn't matter which you choose, or whether you just do it with a custom script. The more important questions are: how will users consume ThreeJS, and how much backward-compatibility do you want to maintain?

After some more thought, I think it will be really hard to modularize everything without completely refactoring the framework and its architecture. Here are some problems:

  • All of the namespace code has to change to CommonJS exports/require. This is a pretty huge undertaking and there would be a lot of ugly ../../../math/Vector2 etc.
  • In an ideal world, the library would be fragmented, so three-scene would be decoupled from three-lights etc. Then you can version each package separately. This kind of fragmentation seems unrealistic for a framework as large as ThreeJS, and would be a pain in the ass to maintain.
  • If we aren't fragmenting the framework into tiny components, then semantic versioning will be a nightmare. A tiny breaking change anywhere in the framework would need a major version bump for the whole thing. And consuming the API would be pretty ugly: require('three/src/math/Vector2')

My suggestion? We consider two things moving forward:

  1. Start small; pull out a few essential and reusable features like Vector/Quaternion, color conversions, triangulation, etc. These things are good candidates for NPM since they are useful outside of the scope of ThreeJS. They can also have their own test suite, versioning, and issue tracking.
  2. When new code needs to be added to ThreeJS, like a new feature, or a dependency (e.g. poly2tri/Tess2), consider pulling it out as a separate module and depending on it via NPM.

I'd love to see everything modularized, but I'm not sure of an approach that's realistic for ThreeJS. Maybe somebody should run some experiments in a fork to see how feasible things are.

@mrdoob
Owner

mrdoob commented Jun 17, 2014

Thanks for the explanations guys!

What I fear is complicating things to people that are just starting. Forcing them to learn this browserify/modules stuff may not be a good idea...

@repsac
Contributor

repsac commented Jun 17, 2014

Would have to agree with @mrdoob here. I, and a lot of colleagues, are not web programmers (rather VFX/animation TDs). Picking up WebGL and Three has certainly been enough work as is on top of our current workload (and in some cases some of us had to learn js on the spot). Much of what I have read in this thread, at times, makes me shudder thinking about how much more work would be added to my plate if Three moved to this structure. I could be wrong but that is certainly how it reads to me.

@domenic

domenic commented Jun 17, 2014

With a precompiled UMD (browserify --standalone) build in the repo, there's no change to the workflow for existing developers.

@danieldelcore
Contributor

+1 for moving to a modular architecture.

@langovoi

+1

@BrunoFenzl

+1

@gionkunz

@drcmda is right. ES6 modules have an initialization step and an execute step, which allows circular references. However, as soon as you have circular dependencies used directly in the module execution context (the top level of a module), the first one loaded during runtime will see undefined values for its dependencies. As long as the references are only used in contexts that run after module evaluation, where the runtime execution order no longer matters, circular dependencies are no problem.

@fibo

fibo commented Jun 28, 2016

I suggest to consider also webpack instead of browserify.

@kumavis
Contributor Author

kumavis commented Jun 28, 2016

@gionkunz we have circular references in the initialization step because of the pattern where there's a closure to generate scratch variables

@cecilemuller
Contributor

The beta of Webpack 2 has just been released (https://twitter.com/TheLarkInn/status/747955723003322368/photo/1), so es6 modules could also benefit from tree shaking when bundled.

@drcmda
Contributor

drcmda commented Jul 7, 2016

@mrdoob
Has there been an official statement more recently? Like many, we abandoned ES5 and glue-concats a long time ago, and it is quite bad how much THREE falls out of line in a modern build system. We use maybe 10% of what it can do, yet it is the biggest dependency we ship.

This is maybe my favourite project on GitHub personally - i sincerely hope priorities will be re-considered.

  1. ES6 runs in all major browsers now
  2. Babel would still pack this into a namespace-monolith
  3. The rest of us could simply npm install three and use the parts we need
  4. Even import * as THREE from 'three' (index.js would contain re-exports of all modules) would work efficiently, since both webpack and rollup support tree-shaking
  5. Re-structuring this into modules will have a tremendous beneficial effect for the code base as a whole

@mrdoob
Owner

mrdoob commented Jul 7, 2016

Hmm, I would like to know more details about browser support. Which browsers do and which don't. For the browsers that don't, what are the workarounds and what are the performance penalties.

@christopherjbaker

christopherjbaker commented Jul 7, 2016

Actually, browser support becomes a non-issue (perhaps even less so than it is now). The build systems take that ES6 code and transpile it to ES5 (sometimes taking up less space than the original ES5 would have). Certain kinds of transpiled constructs end up being large (primarily generators and async functions), but if you avoid those, you won't have that penalty.

As @drcmda mentioned, the build system would still produce a monolithic output (and it would be very easy to customize exactly what is included in that output), but the individual modules could also be included in our own projects, thus only using the parts that we need. To take full advantage of that, inter-dependencies need to be adjusted, but that can happen over time. I think the main feature we want is to have it modularized with import/export. From your point of view, it would enable the use of classes over prototypes (they still use prototypes under the hood, so you can still mess with them as necessary).

There are a few build systems. My vote would be webpack (which uses babel for the transpiling). With webpack, you can define custom loaders, so the chunking system you developed for shaders could be reduced to actual glsl code with a #include extension (I actually do my shaders this way, and would be happy to contribute it to the project). This gets the same benefits as your system (no code duplication), but is very simple to use.
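The #include idea can be sketched as a toy expander over an in-memory chunk map (the real loader described above would read files through webpack; the function name, regex, and snippet names here are all made up for illustration):

```javascript
// Toy sketch: recursively expand `#include <name>` lines in GLSL source
// using a map of named snippets, similar in spirit to a shader-chunk loader.
function expandIncludes(source, snippets) {
  return source.replace(/^[ \t]*#include <(.+)>[ \t]*$/gm, function (match, name) {
    var chunk = snippets[name];
    if (chunk === undefined) throw new Error('unknown chunk: ' + name);
    return expandIncludes(chunk, snippets); // allow nested includes
  });
}

var snippets = {
  fog_fragment: 'gl_FragColor.rgb = mix( gl_FragColor.rgb, fogColor, fogFactor );'
};

var shader = 'void main() {\n#include <fog_fragment>\n}';
console.log(expandIncludes(shader, snippets));
```

A real webpack loader would wrap the expanded string in `module.exports = "..."` so shaders can be `require()`d like any other module.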

I would love to be part of the modularization project, but I do know that this will not be successful in any way without your support (and likely assistance). Many of us know how to use the library, but none of us know how it works internally to the extent that you do.

@mrdoob
Owner

mrdoob commented Jul 7, 2016

Certain kinds of transpiled things end up being large (primarily: generators and async functions), but if you avoid those, you won't have that penalty.

How large?

@mrdoob
Owner

mrdoob commented Jul 7, 2016

Also, you didn't talk about performance penalty. Is that not an issue then?

@mrdoob
Owner

mrdoob commented Jul 7, 2016

As far as I can see, ES6 Imports are still not supported by any browser, so this module refactor would mainly be for build systems, right?

@peteruithoven

Don't forget about the benefits you get by using tools like rollupjs: this would automatically exclude all the exports a user doesn't use (which is the default with JSPM).

@christopherjbaker

The babel-polyfill package, which is only necessary if you are using generators (which probably don't even make sense in this project) or async functions (which I don't really think would change much in the project either), adds around 50k to the final build. But again, this is optional.

As far as performance, it really depends on exactly which features you are using. For instance, arrow functions are a tiny bit slower, due to the underlying bind, and classes are a bit slower to create, though the instantiation time is the same. https://kpdecker.github.io/six-speed/

ES6 imports/exports are not supported by browsers, but since it goes through a build system, that is a non-issue. The product output would be usable exactly as it currently is (even being backwards compatible), but it would allow the library to be integrated into our build systems, and make the internal components reusable to us.

Another thing to note is final build size. Currently, things like Geometry, Material, Mesh, etc. are part of the THREE namespace. When minified, references to THREE.Geometry, THREE.Material, THREE.Mesh etc. remain in the code. With a modular system, each of those files would get something like var Geometry = require('./geometry'); and then reference the variable Geometry later. Then at minification, Geometry and require are both switched to single characters and './geometry' is replaced with a number, resulting in quite a bit of savings. Napkin math: the minified build is 511,794 bytes and contains 2942 references to THREE\.[A-Z][a-zA-Z]+. Replacing all of these with a single character results in an almost 10% file size reduction (down to 464,782). (The gzipped sizes are 117,278 and 110,460 respectively, a 6% reduction.) The build could likely be tuned to reduce this even further.

Rollup (which eliminates unused code from a final build) is the default with jspm, and will be the default with webpack2 (and I believe it can be used with webpack). If things are written modularly, I don't think this will be helpful, though. In any case, as long as the code can be transpiled with babel, it can be used in any build system (the glsl loader I mentioned before can also be made to work with webpack).
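The napkin math above is easy to sanity-check with the byte counts quoted in the comment:

```javascript
// Verify the quoted file-size reductions (numbers taken from the comment).
function reductionPercent(before, after) {
  return ((1 - after / before) * 100).toFixed(1);
}

console.log(reductionPercent(511794, 464782) + '%'); // 9.2% -- the "almost 10%" minified saving
console.log(reductionPercent(117278, 110460) + '%'); // 5.8% -- the "~6%" gzipped saving
```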


@milcktoast
Contributor

Not sure if this is terribly helpful, but this is the discussion thread for D3 regarding the same issue: d3/d3#2220. D3 4.0 has adopted ES6 import/export to manage modules, but is still written in ES5 (d3/d3#2220 (comment)).

@mrdoob
Owner

mrdoob commented Jul 8, 2016

Very interesting @jpweeks!

So... with this import/export approach... how would stuff like object instanceof THREE.Mesh look?

@GGAlanSmithee
Contributor

@mrdoob

import/export is just the way modules are declared and required. It will not affect/change the code defined within the modules at all:

src/Objects/Mesh.js

// Mesh class, stays the same as today (except the export part)
var Mesh = function ( geometry, material ) {
    // ...
}

export default Mesh

src/Three.js

// Library entry point, exports all files using some bundling tech
// In a "THREE" namespace for browsers
// As import three from 'three' in node
import Mesh from './objects/Mesh'

export {Mesh} // All three objects, such as Geometry, Material etc..

Application.js

// In node
import {Mesh} from 'three'

var mesh = new Mesh(geo, mat)

console.log(mesh instanceof Mesh) // true

Client.js

// In a browser
var mesh = new THREE.Mesh(geo, mat)

console.log(mesh instanceof THREE.Mesh) // true

@mrdoob
Owner

mrdoob commented Jul 8, 2016

That's super helpful @GGAlanSmithee! Thanks!

I'm a visual person so pseudo-code examples convince me more than big chunks of text 😅

Right, so it will require a bit of refactoring...

Does anyone know if closure compiler is planning on supporting this?

@Rich-Harris
Contributor

Right, so it will require a bit of refactoring...

I got you! Since this thread got lively over the last couple of days I've been working a bit more on three-jsnext. It's a project that takes the existing Three.js codebase and turns it into ES modules automatically. Just wrangling with a couple of tricky cyclical dependencies (particularly around KeyframeTrack), but should have something to share real soon. As far as I can tell, all the examples continue to work, and the minified build is smaller than the current one (using Rollup to generate a UMD file), so it's all good news.

@Rich-Harris Rich-Harris mentioned this issue Jul 9, 2016
@Rich-Harris
Contributor

Ok, I've opened a pull request for this: #9310

@drcmda
Contributor

drcmda commented Jul 9, 2016

@mrdoob
We have a library in production that more or less is structured like THREE. It works in browsers and modular environments. The codebase is ES6 but browsers are not your concern at all.

You would ship this on npm as is, all modules included + a compiled global-namespace browser monolith (three.js). Whoever needs to use single parts of it uses tools to create bundles.

Consider a structure like this:

/src
    classA.js
    classB.js
    classC.js
/index.js
/browser.js

index.js simply re-exports all modules and functions in one file:

export ClassA from './src/classA';
export ClassB from './src/classB';
export ClassC from './src/classC';

So the end-user can npm install the lib and just use it without any further ado:

// all exports from index.js will be under: mylib.ClassA, etc.
import * as mylib from 'libname';

// selected exports from index.js
import { ClassA, ClassC } from 'libname';

// or, specific modules
import ClassB from 'libname/src/classB'

browser.js would be the only compiled part of the package. Usually transpiled to ES5 via Babel and exported into a global-namespace so it can be used as a script include. Rollup, Webpack, etc. can create this with ease.

@mrdoob
Owner

mrdoob commented Jul 22, 2016

#9310

@mrdoob mrdoob closed this as completed Jul 22, 2016
@kumavis
Contributor Author

kumavis commented Aug 14, 2016

@mrdoob its been a wonderful ride 🚀
