Adding a true LightProbe Object3D type? #6251
As far as I understand light probes, the renderer finds the closest ones and interpolates. The exclusion list is something that I would like to implement in this cycle. There is a half-done PR waiting somewhere...
It interpolates, but that in itself could be a bit of a nightmare to set up? http://gdcvault.com/play/1015312/Light-Probe-Interpolation-Using-Tetrahedral
Neat! That makes a lot of sense. I've written tetrahedral interpolation before and have some leftover code from https://cs.uwaterloo.ca/~c2batty/papers/Batty10/. It isn't that hard to do, actually. But then I guess we need to have 4 light probe cube maps available to each shader; I wonder if that is too many uniforms? I'll have to check the WebGL capabilities on low-end devices to see whether that is feasible. The main thing is that we would need a 3D tetrahedral or Delaunay mesh generator.
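To make the interpolation step concrete, here is a minimal sketch of the barycentric weighting a tetrahedral scheme relies on. It uses plain `[x, y, z]` arrays rather than three.js vectors, and `barycentricWeights` is just an illustrative name:

```javascript
// Barycentric weights of point p inside tetrahedron (a, b, c, d).
// Solves the 3x3 system via scalar triple products (Cramer's rule);
// the fourth weight follows from the weights summing to 1.
function barycentricWeights( p, a, b, c, d ) {
  const sub = ( u, v ) => [ u[0] - v[0], u[1] - v[1], u[2] - v[2] ];
  const cross = ( u, v ) => [
    u[1] * v[2] - u[2] * v[1],
    u[2] * v[0] - u[0] * v[2],
    u[0] * v[1] - u[1] * v[0]
  ];
  const dot = ( u, v ) => u[0] * v[0] + u[1] * v[1] + u[2] * v[2];

  const ad = sub( a, d ), bd = sub( b, d ), cd = sub( c, d ), pd = sub( p, d );
  const det = dot( ad, cross( bd, cd ) );
  const w1 = dot( pd, cross( bd, cd ) ) / det;
  const w2 = dot( ad, cross( pd, cd ) ) / det;
  const w3 = dot( ad, cross( bd, pd ) ) / det;
  return [ w1, w2, w3, 1 - w1 - w2 - w3 ];
}
```

An object inside a tetrahedron of four probes would blend the probes' SH coefficients with these four weights, which sum to 1.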
@bhouston If you represent the light probe with spherical harmonics, then interpolation results in another set of spherical harmonic weights. The interpolation is performed on the CPU side; the shader only needs to support a single SH probe. http://blogs.unity3d.com/2011/03/09/light-probes/ On a related note, a quote from the previous link reads:

> In our case, if they represent indirect lighting only, then they would be subject to an…
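Because SH is a linear basis, the CPU-side interpolation described above reduces to a weighted sum of coefficient arrays. A sketch, assuming each probe stores 27 floats (9 coefficients × RGB) and that the weights sum to 1; `blendSH` is an illustrative name:

```javascript
// Blend N spherical-harmonics coefficient sets with the given weights.
// Interpolating the coefficients interpolates the reconstructed lighting,
// so the shader only ever sees a single blended SH probe.
function blendSH( shSets, weights ) {
  const out = new Float32Array( shSets[0].length );
  for ( let i = 0; i < shSets.length; i++ ) {
    const sh = shSets[i], w = weights[i];
    for ( let j = 0; j < sh.length; j++ ) out[j] += w * sh[j];
  }
  return out;
}
```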
I'm interested in this proposal. Would an incremental path to supporting light probes and spherical harmonics be something like this?

1. Implement a `THREE.SphericalHarmonics` type that can be assigned per mesh or per material:

   ```js
   mesh.sphericalHarmonics = new THREE.SphericalHarmonics().fromArray( [ ... ] ); // (a)
   material.sphericalHarmonics = new THREE.SphericalHarmonics().fromArray( [ ... ] ); // (b)
   ```

2. Enable light probes for a scene to be grouped as a spatial structure, so that spherical harmonics for each mesh can be efficiently computed with tetrahedral interpolation or some other method:

   ```js
   var probe1 = new THREE.LightProbe( sh1 );
   probe1.position.set( ... );

   var probe2 = new THREE.LightProbe( sh2 );
   probe2.position.set( ... );

   // ...

   var probeGroup = new THREE.LightProbeGroup()
     .addProbe( probe1 )
     .addProbe( probe2 );
   // ...

   renderer.setAnimationLoop( function () {
     probeGroup.update( mesh1 );
     probeGroup.update( mesh2 );
     renderer.render( scene, camera );
   } );
   ```

3. Realtime calculation of spherical harmonics for light probes ???

With only (1) implemented, SH lighting can be used via tools like octagen by @TimvanScherpenzeel to compute SH coefficients from environment maps, or perhaps generated in Blender. With the addition of (2), baked SH lighting stays responsive as dynamic objects move through a scene. I don't know whether (3) is actually feasible, but there appears to be justifiable value in (1) and (2) alone.

A nice illustration from the GDC vault:
@WestLangley and I wrote an SH implementation that was never merged. It is in the PRs.
It is possible to compute an SH from an envmap by just adding light contributions using the addAt function, I believe, if you weight your samples properly. I think that @WestLangley wrote code for this already in a PR, but I cannot find it. Basically it should be possible to do this on the GPU; thus cubemap -> SH should be possible purely on the GPU. Maybe @richardmonette could help with this, as he wrote the latlong->cubemap converter that ran on the GPU — this is actually quite similar.
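A CPU sketch of that weighted-accumulation idea: each radiance sample is projected onto the 9 second-order SH basis functions and accumulated with its solid-angle weight. The names here are illustrative, not the API from the unmerged PR; a full cubemap projection would call `addSample` once per texel with that texel's direction and solid angle:

```javascript
// Real SH basis constants for bands l = 0, 1, 2 (9 functions).
const SH_C = [
  0.282095,                                          // l = 0
  0.488603, 0.488603, 0.488603,                      // l = 1
  1.092548, 1.092548, 0.315392, 1.092548, 0.546274   // l = 2
];

// Evaluate the 9 basis functions for a unit direction (x, y, z).
function shBasis( x, y, z ) {
  return [
    SH_C[0],
    SH_C[1] * y, SH_C[2] * z, SH_C[3] * x,
    SH_C[4] * x * y, SH_C[5] * y * z,
    SH_C[6] * ( 3 * z * z - 1 ),
    SH_C[7] * x * z, SH_C[8] * ( x * x - y * y )
  ];
}

// Accumulate one radiance sample into sh (Float32Array(27), 9 RGB
// coefficients), weighted by the sample's solid angle.
function addSample( sh, dir, color, solidAngle ) {
  const basis = shBasis( dir[0], dir[1], dir[2] );
  for ( let i = 0; i < 9; i++ ) {
    for ( let c = 0; c < 3; c++ ) {
      sh[ i * 3 + c ] += basis[i] * color[c] * solidAngle;
    }
  }
}
```

The same accumulation maps naturally to a GPU reduction pass, since each texel's contribution is independent.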
I can do this if you make clear what API you want and whether it is CPU-based or GPU-based.
CubeMap-to-SH must happen on the GPU for sure. I swear someone wrote this already. Or maybe I am just remembering that we discussed this: #6405 (comment). But we do need SH in three.js proper, and @WestLangley would be a hero for pushing this to completion.
@bhouston I have something somewhere... but there needs to be agreement on the feature set and API. @donmccurdy Sorry, it seems to me what you are looking for is … You could name it …
Looking quickly, I found a JS implementation, but it appears to be CPU-based (whereas I think we want to do this on the GPU): https://github.com/nicknikolov/cubemap-sh
I'm not sure that thinking of SH as a "light", e.g. …, is right. I suggested … Thank you @WestLangley and @bhouston for the quick feedback! 🙂
If I recall, one of the main virtues of SH is their ability to act, in a sense, as data compression for cube maps (at least when those cube maps represent indirect illumination). In practice I've seen this used where, say, 100 light probes (in the form of SH) are used across the level/map of a game. The advantage is that 100 light probes in SH are tractable to store, whereas 100 full cubemaps would simply be too much data.

At rendering time, the nearest N light probes are sampled and some distance-to-probe weighted average is used to compute the indirect illumination (which is added to the direct illumination). Assuming my recollections hold, this suggests the API for SH should allow many SH lights/probes, and the shader system should take into account some number of nearby probes when rendering.

In terms of where the SH probe data would come from: one could imagine pre-computing it in a level editor or other tool, but it would certainly be convenient if three.js also had some facilities for doing the cubemap-to-SH conversion.
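A sketch of that nearest-N, distance-weighted selection (names and the inverse-distance weighting are illustrative choices, not a settled API):

```javascript
// Pick the nearest n probes to a point and return normalized
// inverse-distance weights for blending their SH coefficients.
// probes: [ { position: [x, y, z], ... } ], point: [x, y, z].
function nearestProbeWeights( probes, point, n ) {
  const dist = ( a, b ) => Math.hypot( a[0] - b[0], a[1] - b[1], a[2] - b[2] );
  const scored = probes
    .map( ( p, index ) => ( { index, d: dist( p.position, point ) } ) )
    .sort( ( a, b ) => a.d - b.d )
    .slice( 0, n );

  // Inverse-distance weights, normalized to sum to 1; the epsilon
  // guards against division by zero when the point sits on a probe.
  let total = 0;
  for ( const s of scored ) { s.w = 1 / Math.max( s.d, 1e-6 ); total += s.w; }
  for ( const s of scored ) s.w /= total;
  return scored.map( ( { index, w } ) => ( { index, weight: w } ) );
}
```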
I agree with @donmccurdy and @richardmonette. We still need the basic functionality @WestLangley was talking about, but we need to realize it in a light-probe fashion. Although, just like our current singular envMap solution, one could have a singular material.envMapDiffuseSH value on a material for the simple case when testing.
And what happens when the mesh moves or rotates, if the SH coefficients are associated with the mesh or material? Honestly, we could start by supporting just one probe and ignoring its location for now, treating it like ambient light.
The same thing that happens with material.envMap: basically, it is assumed to be in world space. material.envMap is a weird one, just like material.diffuseSH would be. But yeah, moving towards LightProbe is the best approach. In fact, material.envMap should probably be removed at some point and replaced by a singular LightProbe that has specular and diffuse. material.envMap should be viewed as a poor man's light probe, really a stop gap until we have something proper.
I do agree that … For example:

```js
lightProbeGroup
  .addProbe( probe1 )
  .addProbe( probe2 )
  .addProbe( probe3 )
  // ...
  .setInterpolation( new THREE.SHTetrahedralInterpolant() );

renderer.setAnimationLoop( () => {
  lightProbeGroup.updateDiffuseSH( mesh1 );
  lightProbeGroup.updateDiffuseSH( mesh2 );
  // ...
  renderer.render( scene, camera );
} );
```

If we think tetrahedral interpolation will be simple and robust enough, and that alternative interpolation probably isn't needed, then perhaps it's OK for that to be a black box within the renderer. Or some configuration API could also be exposed on the renderer? I'm not sure whether that's an important concern, as we add separate WebGL / WebGL2 renderers moving forward.
Tetrahedral interpolation is the best case. I think you should start with radial basis interpolation with exponential falloff from the nearest N light probes: simple and likely effective, especially if you throw the light probes into a simple k-d tree or similar access structure.
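A sketch of that simpler fallback: Gaussian radial-basis weights, exp(-d²/σ²), normalized so the blend is affine. The function name and the choice of a single shared sigma are illustrative assumptions:

```javascript
// Radial-basis weights with exponential falloff for a set of probe
// positions. sigma controls how quickly a probe's influence decays
// with distance from the query point.
function rbfWeights( probePositions, point, sigma ) {
  const d2 = ( a, b ) =>
    ( a[0] - b[0] ) ** 2 + ( a[1] - b[1] ) ** 2 + ( a[2] - b[2] ) ** 2;

  const raw = probePositions.map( p => Math.exp( - d2( p, point ) / ( sigma * sigma ) ) );
  const total = raw.reduce( ( s, w ) => s + w, 0 );
  return raw.map( w => w / total ); // normalize so the weights sum to 1
}
```

In practice this would run only over the nearest N probes returned by the k-d tree query, not over every probe in the scene.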
LightProbes should technically be abstract enough to be SH- or CubeMap-based. This is a configuration one can specify: whether or not to convert LightProbe cubemaps into SH and lose the specular information. I think that the interface should be something like:

```js
var lightProbeGroup = new THREE.LightProbeGroup();

// add some light probes to the scene
var diffuseLightProbe = new THREE.LightProbe( worldLocation );
lightProbeGroup.addProbe( diffuseLightProbe );

// pick a random probe to update each frame, rather than updating all on each frame.
// recomputes the cubemap from the scene and then optionally compresses it to a SH.
lightProbeGroup.updateProbe( probeIndex, renderer, scene );

// if an object moves this frame, update its applicable light probes and weights.
// gets the indices for the applicable light probes and their weights.
lightProbeGroup.getProbeWeights( object.getWorldLocation(), out object.probeIndicesAndWeights );
```

Then I would upload to the GPU a data texture of all the SHs in a scene. Each object would reference a few of them and weight them appropriately to calculate its local diffuse GI lighting.
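As a sketch of that "data texture of all the SHs" idea: pack each probe's 27 coefficients into one row of an RGBA float buffer (9 texels per probe, one coefficient in RGB per texel), which could then back a THREE.DataTexture. The layout and function name are illustrative assumptions:

```javascript
// Pack per-probe SH coefficient sets (Float32Array(27) each) into one
// flat RGBA float array: one row per probe, 9 texels per row, with the
// RGB channels holding a coefficient and alpha used as padding.
function packSHTexture( shSets ) {
  const data = new Float32Array( shSets.length * 9 * 4 );
  shSets.forEach( ( sh, row ) => {
    for ( let i = 0; i < 9; i++ ) {
      const t = ( row * 9 + i ) * 4;
      data[ t ] = sh[ i * 3 ];
      data[ t + 1 ] = sh[ i * 3 + 1 ];
      data[ t + 2 ] = sh[ i * 3 + 2 ];
      data[ t + 3 ] = 1; // alpha padding
    }
  } );
  return data;
}
```

The result could be uploaded as, e.g., a 9 × numProbes float texture, with each object's shader fetching and weighting only its few referenced rows.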
Sounds good to me 👌
I read through this thread again, and realized my last comment was a bit contrarian. I'm happy with the suggestion of …
I've been doing a bit more thinking about how LightProbe could be implemented (specifically, the precursor work of an SHGenerator; see #16152). The tricky part seems to be finding the correct place in the render architecture to select the closest LightProbe, and the lifecycle for updating the uniforms (and/or the texture samplers).

We can actually think of AmbientLight as a 0th-order SH. We then just need to change which AmbientLight is active depending on the position of the object being rendered. If the scene has a single AmbientLight, this would essentially be the status quo, but if there were two (or more) ambient lights, then the ambient term would start to come from the closest light.

This would be a relatively easy change, as, if I recall correctly, there is a global set of uniforms we could change per renderObject call. (This is a bit trickier when materials are not necessarily 1:1 with objects, so I'm keen to try to make it work for AmbientLight, the simplest case, first.)
Could I argue for either of the following? (a) The renderer should not choose the nearest probe, but should instead blend the nearest N probes; if we're going to support multiple light probes, I think it's essential that moving between them gives smooth transitions. (b) If a multi-probe abstraction exists, then perhaps updates should occur there (invoked by the user) rather than in the renderer:

```js
function animate() {
  lightProbeGroup.update( [ mesh1, mesh2 ] );
  renderer.render( scene, camera );
}
```
@donmccurdy What do you think of the implementation I am working on in #16199?

@richardmonette thank you! Added some comments there.
Right now people can sort of create a light probe node by creating a cube camera and then manually rendering it. I was wondering if it may be worthwhile at some point to create a light probe node type based on Object3D, like UE4 has. It would contain, I guess, both a cube camera and an HDR cube map texture, as well as some logic for an update strategy (either once, or at some other frequency).

I think that for a light probe to be truly useful, it will need something like an exclusion list: a set of objects that should not be visible when updating the light probe. This is useful for the case of a light probe situated exactly at the location of a key reflective object.

I am a bit confused about how light probes are used by materials in UE4 in the end. Does each object just look for the nearest LightProbe and use that one as its environment map, or does it interpolate between the nearest few light probes? Or do you manually assign light probes to a material?

I can see manual assignment of light probes to a material being useful in the case of something like a moving car: have a light probe on the car, located exactly where the car is.