Adding a true LightProbe Object3D type? #6251

Closed
bhouston opened this issue Mar 18, 2015 · 27 comments

Comments

@bhouston
Contributor

Right now people can sort of create a light probe by creating a cube camera and then manually rendering it. I was wondering if it may be worthwhile at some point to create a light probe node type based on Object3D, like UE4 has. It would contain, I guess, both a cube camera and an HDR cube map texture, as well as some logic for an update strategy (either once, or at some other frequency).

I think that for a light probe to be truly useful, it will need something like an exclusion list -- a set of objects that should not be visible when updating the light probe. This is useful for the case of a light probe situated exactly at the location of a key reflective object.

I am a bit confused about how light probes are used by materials in UE4 in the end. Does each object just look for the nearest LightProbe and use that one as its environment map, does it interpolate between the nearest few light probes, or do you manually assign light probes to a material?

I can see manual assignment of light probes to a material being useful in the case of something like a moving car: put a light probe on the car, located exactly where the car is.

@mrdoob
Owner

mrdoob commented Mar 18, 2015

As far as I understand light probes, the renderer finds the closest ones and interpolates. The exclusion list is something that I would like to implement in this cycle. There is a half-done PR waiting somewhere...

@pailhead
Contributor

It interpolates, but that in itself could be a bit of a nightmare to set up?

http://gdcvault.com/play/1015312/Light-Probe-Interpolation-Using-Tetrahedral

@bhouston
Contributor Author

Neat! That makes a lot of sense.

I've written tetrahedral interpolation before and have some code for that left over -- https://cs.uwaterloo.ca/~c2batty/papers/Batty10/ . It isn't that hard to do, actually. But then I guess we need to have 4 light probe cube maps available to each shader -- I wonder if that is too many uniforms? I'll have to check the WebGL capabilities on low-end devices to see if that is feasible. The main thing is that we would need a 3D tetrahedral or Delaunay mesh generator.
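The tetrahedral interpolation from the GDC talk boils down to barycentric weights. A minimal sketch in plain JavaScript (hypothetical helper names, not a three.js API): given the tetrahedron enclosing the sample point, solve a 3x3 system for the weights of its four corner probes.

```javascript
// Barycentric weights of point p inside tetrahedron ( a, b, c, d ).
// Solves [ b-a, c-a, d-a ] * [ w1, w2, w3 ]^T = p - a by Cramer's rule;
// the fourth weight is w0 = 1 - w1 - w2 - w3.
function tetrahedralWeights( p, a, b, c, d ) {
  const e1 = sub( b, a ), e2 = sub( c, a ), e3 = sub( d, a ), ep = sub( p, a );
  const det = dot( e1, cross( e2, e3 ) );
  const w1 = dot( ep, cross( e2, e3 ) ) / det;
  const w2 = dot( e1, cross( ep, e3 ) ) / det;
  const w3 = dot( e1, cross( e2, ep ) ) / det;
  return [ 1 - w1 - w2 - w3, w1, w2, w3 ];
}

function sub( u, v ) { return [ u[ 0 ] - v[ 0 ], u[ 1 ] - v[ 1 ], u[ 2 ] - v[ 2 ] ]; }
function dot( u, v ) { return u[ 0 ] * v[ 0 ] + u[ 1 ] * v[ 1 ] + u[ 2 ] * v[ 2 ]; }
function cross( u, v ) {
  return [
    u[ 1 ] * v[ 2 ] - u[ 2 ] * v[ 1 ],
    u[ 2 ] * v[ 0 ] - u[ 0 ] * v[ 2 ],
    u[ 0 ] * v[ 1 ] - u[ 1 ] * v[ 0 ]
  ];
}
```

The four weights always sum to 1, so blending the four corner probes with them yields a smoothly varying result as the point moves through the tetrahedral mesh.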

@WestLangley
Collaborator

@bhouston If you represent the light probe with spherical harmonics, then interpolation results in another set of spherical harmonic weights. The interpolation is performed on the CPU side. The shader only needs to support a single SH probe.

http://blogs.unity3d.com/2011/03/09/light-probes/
http://blogs.unity3d.com/2011/06/08/advanced-shading-and-lighting-for-mobile/
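To make the CPU-side idea concrete, a sketch assuming order-2 SH stored as a flat array of 27 floats (9 RGB coefficients; the flat-array layout is an assumption for illustration, not a three.js API): interpolating two probes is just a component-wise blend of coefficient arrays, and only the blended result needs to reach the shader as a single set of SH uniforms.

```javascript
// Linearly interpolate two sets of SH coefficients on the CPU.
// shA and shB are flat Float32Arrays of the same length (27 for order-2 RGB).
function lerpSH( shA, shB, t ) {
  const out = new Float32Array( shA.length );
  for ( let i = 0; i < shA.length; i ++ ) {
    out[ i ] = shA[ i ] * ( 1 - t ) + shB[ i ] * t;
  }
  return out;
}
```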


On a related note, a quote from the previous link reads:

The light probes can be used to store the full incoming lighting or just the indirect lighting

In our case, if they represent indirect lighting only, then they would be subject to an aoMap -- just like ambientLight is.

@donmccurdy
Collaborator

donmccurdy commented Nov 19, 2018

I'm interested in this proposal. Would an incremental path to supporting light probes and spherical harmonics be something like this?

1. Implement material.sphericalHarmonics or mesh.sphericalHarmonics

A SphericalHarmonics class might have a few methods such as fromArray, toArray, addLight, clear, and interpolate.

mesh.sphericalHarmonics = new THREE.SphericalHarmonics().fromArray( [ ... ] ); // (a)
material.sphericalHarmonics = new THREE.SphericalHarmonics().fromArray( [ ... ] ); // (b)

2. Implement THREE.LightProbe and THREE.LightProbeGroup

Enable light probes for a scene to be grouped as a spatial structure, so that spherical harmonics for each mesh can be efficiently computed with tetrahedral interpolation or some other method.

var probe1 = new THREE.LightProbe( sh1 );
probe1.position.set( ... );
var probe2 = new THREE.LightProbe( sh2 );
probe2.position.set( ... );
// ...

var probeGroup = new THREE.LightProbeGroup()
  .addProbe( probe1 )
  .addProbe( probe2 )
  // ...

renderer.setAnimationLoop( function () {

  probeGroup.update( mesh1 );
  probeGroup.update( mesh2 );
  renderer.render( scene, camera );

} );

3. Realtime calculation of spherical harmonics for light probes

???


With only (1) implemented, SH lighting can be used via tools like octagen by @TimvanScherpenzeel to compute SH coefficients from environment maps, or perhaps generate them in Blender. With the addition of (2), baked SH lighting is responsive as dynamic objects move throughout a scene. I don't know whether (3) is actually feasible, but there appears to be justifiable value in (1) and (2) alone.

A nice illustration from the GDC vault:

[Screenshot from the GDC talk illustrating tetrahedral light probe interpolation]

@bhouston
Contributor Author

@WestLangley and I wrote an SH implementation that was never merged. It is in the PRs.

@bhouston
Contributor Author

See:

#6405
#13115 -- probably just needs to be cleaned up a little bit.

@bhouston
Contributor Author

It is possible to compute an SH just by adding light contributions from an envmap using the addAt function, I believe, if you weight your samples properly. I think @WestLangley wrote code for this already in a PR, but I cannot find it. Basically it should be possible to do this on the GPU; thus cubemap -> SH should be possible purely on the GPU. Maybe @richardmonette could help with this, as he wrote the latlong->cubemap converter that ran on the GPU -- this is actually quite similar.

@WestLangley
Collaborator

I can do this if you make clear what API you want and whether it is CPU-based or GPU-based.

@bhouston
Contributor Author

CubeMap-to-SH must happen on the GPU for sure. I swear someone wrote this already. Or maybe I am just remembering that we discussed this: #6405 (comment)

But we do need SH in three.js proper, and @WestLangley would be a hero for pushing this to completion.

@WestLangley
Collaborator

@bhouston I have something somewhere... but there needs to be agreement on the feature set and API.

@donmccurdy Sorry, mesh.sphericalHarmonics and material.sphericalHarmonics as concepts do not make sense to me.

It seems to me what you are looking for is THREE.SHLight, which is specified by a location and its SH coefficients. It could also have a GPU-based method .setFromCubeMap().

An SHLight is just a generalization of AmbientLight and HemisphereLight. It would be indirect-diffuse-only.

You could name it THREE.LightProbe if you want, I guess, but I am not sure if that nomenclature is accurate.

@richardmonette
Contributor

Looking quickly, I found a JS implementation, but it appears to be CPU-based (whereas I think we want to do this on the GPU): https://github.com/nicknikolov/cubemap-sh

@donmccurdy
Collaborator

donmccurdy commented Nov 19, 2018

I'm not sure that thinking of SH as a "light", e.g. THREE.SphericalHarmonicsLight, is the right API here. If I have 10 meshes in various parts of a scene, I almost certainly want different SH values for each. As @bhouston says in #13115:

I think SH is more useful when you calculate a lot of them as a GI solution and then interpolate between them as an object moves around an environment. SH in that context is very useful because it is very data efficient to store 100 of them on the GPU, whereas storing 100 cubemaps is just impossible.

I suggested [mesh|material].sphericalHarmonics because it seemed analogous to the .envMap API, and leaves us the option to (later) have higher-level abstractions for interpolating SH values around a scene (i.e. THREE.LightProbe), or leave that to userland code. But if SH is required to be global for the scene, we lose that option. I was hoping to design the API such that the three.js library understands SH, but the LightProbe abstraction could be external in examples/js. But other naming ideas would be welcome.

Thank you @WestLangley and @bhouston for the quick feedback! 🙂

@richardmonette
Contributor

If I recall, one of the main virtues of SH is their ability to, in a sense, act as data compression on cube maps (at least when those cube maps represent indirect illumination.)

In practice I've seen this used where say 100 light probes (in the form of SH) are used across the level/map of a game. The advantage is that 100 light probes in SH is tractable to store, whereas 100 full cubemaps would simply be too much data storage.
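The back-of-envelope arithmetic bears this out; assuming order-2 SH (9 RGB coefficients per probe) and 256-pixel RGBA8 cubemap faces (both figures chosen purely for illustration):

```javascript
// Rough storage comparison for 100 light probes.
const probes = 100;
const shFloats = 9 * 3;                        // order-2 SH, RGB: 27 floats per probe
const shBytes = probes * shFloats * 4;         // 10,800 bytes (~10.5 KB total)
const cubeBytes = probes * 6 * 256 * 256 * 4;  // RGBA8, six 256px faces: ~150 MB total
console.log( shBytes, cubeBytes );
```

The SH representation is roughly four orders of magnitude smaller, which is why storing hundreds of probes is practical.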

At rendering time, the nearest N light probes are sampled and some distance-to-probe weighted average is used to compute the indirect illumination (and this is added to the direct illumination.)

Assuming my recollections hold, this would suggest the API for this (i.e. SH) should allow many SH lights/probes and the shader system should take into account some number of nearby probes when rendering.

In terms of where the SH probe data would come from -- one could imagine pre-computing it in a level editor or other tool, but it would certainly be convenient if three.js also had some facilities for doing the cubemap-to-SH conversion.
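The nearest-N weighted average described above might be sketched like this (hypothetical plain-object probes with `position` and `sh` fields, not a three.js API; the epsilon avoids division by zero when the query point lands exactly on a probe):

```javascript
// Blend SH coefficients from the nearest N probes using inverse-distance weights.
function blendNearestProbes( probes, pos, n = 4 ) {
  const nearest = probes
    .map( p => ( { probe: p, d: dist( p.position, pos ) } ) )
    .sort( ( a, b ) => a.d - b.d )
    .slice( 0, n );
  const out = new Float32Array( nearest[ 0 ].probe.sh.length );
  let total = 0;
  for ( const { probe, d } of nearest ) {
    const w = 1 / ( d + 1e-6 ); // epsilon guards against a zero distance
    total += w;
    for ( let i = 0; i < out.length; i ++ ) out[ i ] += probe.sh[ i ] * w;
  }
  for ( let i = 0; i < out.length; i ++ ) out[ i ] /= total;
  return out;
}

function dist( a, b ) {
  return Math.hypot( a[ 0 ] - b[ 0 ], a[ 1 ] - b[ 1 ], a[ 2 ] - b[ 2 ] );
}
```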

@bhouston
Contributor Author

bhouston commented Nov 19, 2018

I agree with @donmccurdy and @richardmonette. We still need the basic functionality @WestLangley was talking about, but we need to realize it in a light probe fashion. Although, just like our current singular envMap solution, one could have a singular material.envMapDiffuseSH value on a material for the simple case when testing.

@WestLangley
Collaborator

If I have 10 meshes in various parts of a scene, I almost certainly want different SH values for each [mesh].

And what happens when the mesh moves or rotates if the SH coefficients are associated with the mesh or material?

Honestly, SHLight[Probe] is the only concept that makes sense to me. A scene can have many of them, as you described. Each probe records the irradiance or illuminance at a particular scene location.

We could start by supporting just one probe, and ignoring the location for now -- treating it like ambient light.

@bhouston
Contributor Author

And what happens when the mesh moves or rotates if the SH coefficients are associated with the mesh or material?

The same thing that happens with material.envMap: basically, it is assumed to be in world space. material.envMap is a weird one, just like material.diffuseSH would be. But yeah, moving towards LightProbe is the best approach. In fact, material.envMap should probably be removed at some point and replaced by a singular LightProbe that has specular and diffuse. material.envMap should be viewed as a poor man's light probe, and really a stopgap until we have something proper.

@donmccurdy
Collaborator

I do agree that THREE.LightProbe makes more conceptual sense than material.diffuseSH. Assuming multiple probes are supported (unlike #13115), either one should provide enough functionality. The main difference, as far as I can tell, is that if the user can't set SH values on objects manually then all the complexity of light probe interpolation must happen inside the renderer. By contrast, with material.diffuseSH the light probe abstraction remains outside the renderer, is more accessible to customization, and doesn't necessarily need to be in the core library:

lightProbeGroup
  .addProbe( probe1 )
  .addProbe( probe2 )
  .addProbe( probe3 )
  // ...
  .setInterpolation( new THREE.SHTetrahedralInterpolant() );

renderer.setAnimationLoop( () => {

  lightProbeGroup.updateDiffuseSH( mesh1 );
  lightProbeGroup.updateDiffuseSH( mesh2 );
  // ...

  renderer.render( scene, camera );

} );

If we think tetrahedral interpolation will be simple and robust enough, and that alternative interpolation probably isn't needed, then perhaps it's OK for that to be in a black box within the renderer. Or some configuration API could also be exposed on the renderer? I'm not sure if that's an important concern or not, as we add separate WebGL / WebGL2 renderers moving forward.

@bhouston
Contributor Author

bhouston commented Nov 19, 2018

Tetrahedral interpolation is the best case. I think you should start with radial basis interpolation with exponential falloff from the nearest N light probes -- simple and likely effective, especially if you throw the light probes into a simple KD tree or similar access structure.
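A sketch of that radial basis weight with exponential falloff; `sigma` is an assumed tuning parameter (the falloff radius), and in practice the weights of the nearest N probes would be normalized to sum to 1 before blending:

```javascript
// Exponential-falloff radial basis weight for a probe at the given distance.
function rbfWeight( distance, sigma ) {
  return Math.exp( - ( distance * distance ) / ( sigma * sigma ) );
}

// Normalize the weights over a set of probe distances so they sum to 1.
function normalizedWeights( distances, sigma ) {
  const raw = distances.map( d => rbfWeight( d, sigma ) );
  const total = raw.reduce( ( a, b ) => a + b, 0 );
  return raw.map( w => w / total );
}
```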

@bhouston
Contributor Author

LightProbes should technically be abstract enough to be SH- or CubeMap-based. This is a configuration one can specify: whether or not to convert LightProbe cubemaps into SH and lose the specular information.

I think that the interface should be something like:

var lightProbeGroup = new THREE.LightProbeGroup();

// add some light probes to the scene
var diffuseLightProbe = new THREE.LightProbe( worldLocation ); 
lightProbeGroup.addProbe( diffuseLightProbe );

// pick a random probe to update each frame, rather than updating all on each frame.
// recomputes the cubemap from the scene and then optionally compresses it to a SH.
lightProbeGroup.updateProbe( probeIndex, renderer, scene );

// if an object moves this frame, update its applicable light probes and weights.
// returns the indices of the applicable light probes and their weights.
object.probeIndicesAndWeights = lightProbeGroup.getProbeWeights( object.getWorldPosition( new THREE.Vector3() ) );

Then I would upload to the GPU a data texture of all the SHs in a scene. Each object would reference a few of them and weight them appropriately to calculate their local diffuse GI lighting.
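That packing step might look like the following (plain typed arrays; uploading the result via a THREE.DataTexture is not shown). Each probe contributes 27 floats, i.e. nine RGB texels:

```javascript
// Pack per-probe SH coefficients (27 floats each) into one flat Float32Array,
// laid out as a 9-texel-wide RGB row per probe.
function packSH( shArrays ) {
  const data = new Float32Array( shArrays.length * 27 );
  shArrays.forEach( ( sh, p ) => data.set( sh, p * 27 ) );
  return data; // texture dimensions: width 9 (RGB texels), height shArrays.length
}
```

A shader can then fetch a probe's coefficients by row index, so each object only needs to carry a few probe indices and weights as uniforms.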

@mrdoob
Owner

mrdoob commented Nov 19, 2018

@WestLangley

We could start by supporting just one probe, and ignoring the location for now -- treating it like ambient light.

Sounds good to me 👌

@donmccurdy
Collaborator

donmccurdy commented Mar 12, 2019

I read through this thread again, and realized my last comment was a bit contrarian. I'm happy with the suggestion of LightProbe or LightProbeGroup as the API, instead of .sphericalHarmonics or .diffuseSH. Support for one light probe (or eventually multiple) with SH would be an excellent feature.

@richardmonette
Contributor

I've been doing a bit more thinking around how LightProbe could be implemented (specifically, the precursor work of an SHGenerator; see #16152).

The tricky part seems to be finding the correct place in the render architecture to select the closest LightProbe and the lifecycle for updating the uniforms (and/or the texture samplers.)

We can actually think of AmbientLight as a 0th-order SH. We then just need to change which AmbientLight is active depending on the position of the object being rendered.

If the scene has a single AmbientLight, this would essentially be the status quo, but if there were two (or more) ambient lights, then the ambient term would start to come from the closest light.

This would be a relatively easy change as, if I recall correctly, there is a global set of uniforms we could change per renderObject call. (This is a bit trickier when having to deal with materials not necessarily being 1:1 with objects, so I'm keen to try to make it work for AmbientLight, in the simplest case, first.)
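The "closest AmbientLight" selection described above could be sketched as follows (illustrative plain objects with `position` arrays rather than actual THREE.AmbientLight instances; the renderer-side uniform swap is not shown):

```javascript
// Pick the closest "ambient probe" to an object's world position,
// mirroring the idea of treating AmbientLight as a 0th-order SH term.
function closestAmbient( ambients, objectPos ) {
  let best = null, bestD = Infinity;
  for ( const light of ambients ) {
    const d = Math.hypot(
      light.position[ 0 ] - objectPos[ 0 ],
      light.position[ 1 ] - objectPos[ 1 ],
      light.position[ 2 ] - objectPos[ 2 ]
    );
    if ( d < bestD ) { bestD = d; best = light; }
  }
  return best;
}
```

With a single ambient light this reproduces the status quo; with several, each object's ambient term comes from the nearest one.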

@donmccurdy
Collaborator

donmccurdy commented Apr 4, 2019

Could I argue that either:

(a) The renderer should not choose the nearest probe, but should instead blend the nearest N probes.
(b) The renderer should expose some mechanism for the user to update SH coefficients for a particular mesh, if the above is too complex.

If we're going to support multiple light probes, I think it's essential that moving between them should give smooth transitions.

If a multi-probe abstraction exists, then perhaps updates should occur there (invoked by the user) rather than in the renderer:

function animate () {

  lightProbeGroup.update( [ mesh1, mesh2 ] );

  renderer.render( scene, camera );

}

@richardmonette
Contributor

@donmccurdy What do you think of the implementation I am working on in #16199 ?

@donmccurdy
Collaborator

@richardmonette thank you! Added some comments there.

@Mugen87
Collaborator

Mugen87 commented Dec 28, 2019

The LightProbe class was added via #16191, SphericalHarmonics3 via #16187, LightProbeGenerator via #16295 and support in the renderer via #16223. Please discuss light probe interpolation in #16228.
