Ability to Sort Faces / Handle Transparency Sorting #4724
No. I don't think

Some comments:

Btw, in our project we render leaves with
Pretty good points brought up. However, it was probably the artefacts in large alpha-tested planes which had caused me to go down this rabbit hole. Also, @tuan_kuranes on Twitter had brought up the possible use of Weighted, Blended Order-Independent Transparency, which has also been implemented in WebGL.
The weighted blended technique is nice because it scales well and is not so difficult to implement. In the end, like most computer graphics techniques, it's just a hack/approximation, and whether it works depends on the scene. As mentioned above, for us, essentially using a 1-bit alpha channel (leaf texels are either fully transparent or fully opaque) and then using a simple alpha test worked fine. I'd be surprised if you saw any artifacts in the procedural tree demo with this approach. I tried to test it, but the demo uses three.js r40, which does not support `alphaTest` on materials. All that being said, I'd be interested if you get better results with the weighted blended technique. And even though I'm not sure it is the best approach for rendering vegetation, it might be useful for rendering other transparent objects, like particles.
By the way, if you're interested in rendering trees, you might find this interesting: http://http.developer.nvidia.com/GPUGems3/gpugems3_ch04.html
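The 1-bit-alpha-plus-alpha-test idea mentioned above can be sketched on the CPU. This is an illustrative model, not three.js internals; the only assumption taken from three.js is its convention that `material.alphaTest` discards fragments whose alpha falls below the threshold, so they never write color or depth:

```javascript
// CPU model of an alpha test: a fragment passes only when its texel alpha
// reaches the threshold; failing fragments are discarded entirely, so no
// blending (and therefore no sorting) is needed.
function alphaTestPass(texelAlpha, threshold) {
  return texelAlpha >= threshold;
}

// With a 1-bit alpha channel, leaf texels are either 0.0 or 1.0, so a 0.5
// threshold cleanly separates the two cases:
const texels = [0.0, 1.0, 0.0, 1.0];
const kept = texels.filter((a) => alphaTestPass(a, 0.5)); // only opaque texels survive
```

In three.js terms this corresponds to setting `material.alphaTest = 0.5` and leaving the material opaque, which is what made the leaves render correctly without any sorting.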
Yeah, you're right. Actually, alpha-testing opaque planes works well for the simple tree examples. (Not sure how I opened the can of transparency worms.) I probably agree with @mrdoob's decision not to sort faces in the renderer, even though it could be a simple approach to handling transparency order. It could be done by the user, although it probably requires some matrix calculations, and that's where a util class would be handy (if this becomes a popular use case). As for the weighted blended technique, I might try implementing it when I have the time. It's 4 lines of shader code, but it would be a good chance to test multiple render targets in the WebGLRenderer (even though a multi-pass approach should also work). We won't know how well it works until we really try it, but the vegetation demo video using weighted blended OIT seems to be pretty nice: http://www.youtube.com/watch?v=41dD2OsUagI
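For reference, the compositing step of weighted blended OIT can be sketched on the CPU. This is a sketch of the McGuire/Bavoil formulation under a simplifying assumption (a constant per-fragment weight instead of the paper's depth-based one); on the GPU the two accumulators would live in render targets, which is where the multi-render-target question comes in:

```javascript
// CPU reference of weighted, blended OIT compositing (illustrative sketch,
// not three.js code). Every transparent fragment adds to two accumulators,
// so the result is the same no matter what order fragments arrive in.
function compositeWBOIT(fragments, background, weight = () => 1) {
  let accumR = 0, accumG = 0, accumB = 0; // sum of w * a * color
  let accumA = 0;                         // sum of w * a
  let revealage = 1;                      // product of (1 - a)
  for (const f of fragments) {
    const w = weight(f);
    accumR += w * f.a * f.r;
    accumG += w * f.a * f.g;
    accumB += w * f.a * f.b;
    accumA += w * f.a;
    revealage *= 1 - f.a;
  }
  const denom = Math.max(accumA, 1e-5);
  // Weighted-average transparent color, composited over the background.
  return {
    r: (accumR / denom) * (1 - revealage) + background.r * revealage,
    g: (accumG / denom) * (1 - revealage) + background.g * revealage,
    b: (accumB / denom) * (1 - revealage) + background.b * revealage,
  };
}
```

Because the accumulators are sums and products, which commute, drawing fragments in any order yields the same pixel; that is exactly what makes it "order-independent".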
Some days ago I thought it shouldn't be difficult to render a tree with three.js (and yes, btw, it wasn't easy to search for trees with three.js, because Google thinks you are searching for three.js anyway :)
Turns out, handling transparency isn't trivial at all, as the Z-buffer doesn't play well with transparent textures/materials (not just in WebGL/three.js, but in OpenGL too). Many such questions surface on Stack Overflow, and the usual solutions are along the lines of `material.depthWrite = false`, `material.depthTest = false`, `material.alphaTest = 0.5`, `material.blending = bla`, etc.

What the three.js WebGLRenderer does behind the scenes is render the opaque objects first, followed by the transparent objects (`material.transparent = true`). This seems to be the correct approach, and a few simple transparent objects render well together with the opaque objects. However, with many (and intersecting) transparent planes (like the leaves), more problems start to surface.

If depth and alpha tests are enabled, the transparent portions of front leaves override leaves at the back. If depth tests are disabled, some leaves from the back are rendered above the leaves in front. An example of this problem can also be found in inear's procedural three.js tree example.
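Both failure modes can be reproduced with a tiny one-pixel model of the rasterizer. This is a hypothetical model for illustration (scalar colors, black background), not three.js internals:

```javascript
// One-pixel rasterizer model: each fragment has a depth and an alpha,
// and blending is the standard "over" operator.
function drawPixel(fragments, { depthTest = true, depthWrite = true } = {}) {
  let color = 0;        // scalar "color" over a black background, for simplicity
  let depth = Infinity; // depth buffer value; smaller = closer to the camera
  for (const f of fragments) {
    if (depthTest && f.z >= depth) continue;   // fragment rejected by depth test
    color = f.color * f.a + color * (1 - f.a); // "over" blending
    if (depthWrite) depth = f.z;
  }
  return color;
}

const front = { z: 1, color: 1.0, a: 0.5 }; // semi-transparent front leaf
const back  = { z: 2, color: 0.8, a: 0.5 }; // semi-transparent back leaf

// Correct back-to-front order: 0.8*0.5 = 0.4, then 1.0*0.5 + 0.4*0.5 = 0.7
const correct = drawPixel([back, front]);

// Front drawn first with depth test/write on: the back leaf is rejected even
// where the front leaf is semi-transparent, so its contribution is lost (0.5).
const backLost = drawPixel([front, back]);

// Front drawn first with depth test off: the back leaf blends ON TOP of the
// front leaf, appearing in front even though it is further away (0.65).
const backOnTop = drawPixel([front, back], { depthTest: false });
```

The only draw order that produces the correct 0.7 is back-to-front, which is exactly why the question of sorting comes up.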
So I conclude that disabling depth writes & tests for transparent materials is only useful for order-independent blending modes (which are limited to multiplicative and additive blending).
So back to fixing the problem of rendering the transparent leaves correctly: there could be CPU-side or shader-side approaches. One shader approach is called "Depth Peeling" (or a newer variant, Dual Depth Peeling), which basically writes a special depth buffer that takes alpha values into account, then references both the original z-buffer and the new depth buffer when doing the shading.
A simpler approach is to render the transparent faces from furthest to nearest. At the moment, there doesn't seem to be a way to do that. So I basically split each face into a new geometry + mesh, but the painterly sort only sorts objects by their `position` property. So I normalize each new single-face geometry by offsetting it by the average of its 3 vertices, and set the position of that mesh to that average. The material settings are `transparent: true` and `depthWrite: false`. Well, that kind of works (apart from a tiny bit of z-fighting), but it creates a whole ton of new geometries, meshes and objects (which probably defeats the purpose of the new geometry classes).

So what I'm wondering is this: does it make sense for the WebGLRenderer to have the ability to sort faces (perhaps based on the centroid of their vertices), which would be the equivalent of `.sortParticles = true` for `ParticleSystem`? Or to implement a depth peeling algorithm? Or to just leave three.js as it is for now?

Ok, looks like I've chattered too much now (this reads more like a blog post than an issue, but hopefully it brings up some points that could be useful for others in similar situations :P)