Transparent objects brighten when using GammaCorrectionShader #23019
The proper workflow is to render the scene in linear-sRGB color space and convert to sRGB color space as a final post-processing step. However, currently, three.js converts to sRGB color space in-line in the shader, prior to blending with the drawing buffer. This is fine if the material is opaque. But if the material is transparent, it is not correct. I would expect three.js to integrate a proper post-processing workflow into the engine at some point. /ping @bhouston
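For reference, a minimal sketch of that post-processing workflow using the stock EffectComposer and GammaCorrectionShader from the examples directory (assumes `renderer`, `scene`, and `camera` already exist):

```js
import { EffectComposer } from 'three/examples/jsm/postprocessing/EffectComposer.js';
import { RenderPass } from 'three/examples/jsm/postprocessing/RenderPass.js';
import { ShaderPass } from 'three/examples/jsm/postprocessing/ShaderPass.js';
import { GammaCorrectionShader } from 'three/examples/jsm/shaders/GammaCorrectionShader.js';

// Render in linear-sRGB, then encode to sRGB once, as the final step.
// Transparent objects are therefore blended in linear space.
const composer = new EffectComposer(renderer);
composer.addPass(new RenderPass(scene, camera));
composer.addPass(new ShaderPass(GammaCorrectionShader)); // final linear -> sRGB

function animate() {
  requestAnimationFrame(animate);
  composer.render(); // instead of renderer.render(scene, camera)
}
animate();
```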
We can also add a post-processing step to convert a linear HDR RGB buffer to an LDR gamma-corrected buffer. You need at least an FP16 render target for this to work well. It shouldn't really be hard -- I think all the code is in Three.js. This is actually how Clara.io worked back in 2014: https://clara.io/view/9e2c46ce-6ada-450c-8acd-c44b84de2a38

I guess when you say we need to integrate a proper post-processing workflow, @WestLangley, you mean out of the box. That makes sense now.

I actually switched away from the proper HDR FP16/FP32 render buffer approach in 2016, and I contributed that back to Three.js here: #8250. The reason was that many mobile devices simply didn't have enough memory to handle it properly, especially Apple iPads of the era. The approach we switched to is what Three.js currently uses as the default workflow -- it was actually awesome and fast. It does have a few limitations though, as pointed out in this bug report. Thus we could go back to this more traditional approach now that mobile hardware has caught up.
And interestingly, @WestLangley and I had this exact same discussion back then: #8250 (comment)
So to clarify, the upper part of the fiddle is actually too dark whereas the lower part (using post-processing) is the correct image.
@Mugen87 did you delete the comment where you mentioned the framebuffer/texture format "SRGB8_ALPHA8_EXT"? Could we use this as the render target format? From a quick reading, assuming the browser does the blending correctly between two separate SRGB8_ALPHA8_EXT buffers, it would solve the issue, no?
Well, I was not sure about the details of my comment so I've deleted it^^. I think the goal should be to avoid all decodes/encodes and tone mapping in the material shaders. Avoiding the decodes should be easier to implement via changes like #23039. As mentioned in this thread, avoiding the output encoding and tone mapping in the materials means the renderer has to implement some sort of post-processing. I'm not yet sure what this would look like (implementation-wise), though.
Note this has been brought up a few times over on the forum as well, so the current behavior does confuse people. The difference is also stronger the more transparent an object is, so there's not much difference with …
Not sure if I missed something, but I tried updating the fiddle to use UnsignedByte, which should have enabled SRGB8_ALPHA8, but it seems to make no difference:
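For context, the composer accepts an optional render target as its second argument, so the experiment presumably looked something like the following sketch (the exact fiddle code isn't shown above):

```js
import * as THREE from 'three';
import { EffectComposer } from 'three/examples/jsm/postprocessing/EffectComposer.js';

// Hand the composer an 8-bit target instead of letting it allocate its own.
const size = renderer.getDrawingBufferSize(new THREE.Vector2());
const target = new THREE.WebGLRenderTarget(size.x, size.y, {
  format: THREE.RGBAFormat,
  type: THREE.UnsignedByteType,
});
const composer = new EffectComposer(renderer, target);
```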
So when I've used things like a transparent vase, the post-processing makes it look like there's a light brightening the vase, and it makes seeing through the vase difficult. I think I'd actually prefer the upper picture, although I can see how the example I gave might look better.
Yes. And according to KhronosGroup/WebGL#2474 (comment), blending will indeed work correctly when using SRGB8_ALPHA8_EXT.

However, I want to make clear that WebGL does not allow defining the color space of the default framebuffer. That means we have to perform the final linear-to-sRGB encode ourselves in a custom pass. I've hacked around and verified that the components of the framebuffer's color attachment are actually …: https://raw.githack.com/Mugen87/three.js/dev19/examples/webgl_geometry_cube.html

As you can see, the scene is too dark since I have removed the inline sRGB encode from the material shaders.
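In raw WebGL 2 terms (where `SRGB8_ALPHA8` is a core, color-renderable internal format), the idea looks roughly like this sketch; the draw to the non-configurable default framebuffer still needs a pass of its own:

```js
const gl = canvas.getContext('webgl2');

// sRGB color attachment: the GPU decodes stored values to linear when reading
// for blending and re-encodes on write, so blending happens in linear space.
const tex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, tex);
gl.texStorage2D(gl.TEXTURE_2D, 1, gl.SRGB8_ALPHA8, width, height);

const fbo = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, tex, 0);

// ...render the scene into `fbo`, then run a final fullscreen pass that
// samples `tex` (the sampler returns decoded, linear values) and applies the
// linear-to-sRGB encode manually, because the default framebuffer is linear.
```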
@Kangz Is it possible with WebGPU to configure the canvas context with an sRGB format? We are currently using …
Tested with Chrome Canary 99.0.4819.0 on macOS. According to the spec, the idea is to write linear color values into the buffer and let the format perform the linear-to-sRGB encoding automatically. As mentioned in #23019 (comment), it's not possible to configure the drawing buffer's color space in WebGL.
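Presumably the test amounted to something like this sketch (as the following comments explain, the spec later disallowed `*-srgb` formats here in favor of a view-format mechanism):

```js
const context = canvas.getContext('webgpu');
context.configure({
  device,
  format: 'bgra8unorm-srgb', // accepted by that Canary build
});
```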
CCing @kainino0x. The group wants to have a way to configure the WebGPU canvas with sRGB, but it's not clear whether it will be by having …
I am pretty sure that we will require applications to configure the canvas with, for example, …
Unfortunately 2336 is still blocked in standardization, but I'm certain we'll have a way to do this reinterpretation, so canvas can rely on it. (I don't want to add …)

In the meantime, you should be able to create an intermediate …
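A sketch of that intermediate-texture workaround (assuming `device` and `context` are already set up, and that the canvas is configured as `bgra8unorm` with `COPY_DST` usage):

```js
// Render into an sRGB texture so fixed-function blending happens in linear
// space, then copy the result into the canvas texture each frame.
const intermediate = device.createTexture({
  size: [canvas.width, canvas.height],
  format: 'bgra8unorm-srgb',
  usage: GPUTextureUsage.RENDER_ATTACHMENT | GPUTextureUsage.COPY_SRC,
});

function frame() {
  const encoder = device.createCommandEncoder();
  // 1. Scene pass: colorAttachment = intermediate.createView(), as usual.
  // 2. Copy: formats differing only in their -srgb suffix are copy-compatible,
  //    so the raw bytes can be copied straight into the canvas texture.
  encoder.copyTextureToTexture(
    { texture: intermediate },
    { texture: context.getCurrentTexture() },
    [canvas.width, canvas.height],
  );
  device.queue.submit([encoder.finish()]);
  requestAnimationFrame(frame);
}
requestAnimationFrame(frame);
```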
I've updated … Unfortunately, a separate render pass noticeably downgrades the performance. Examples that used to run at 60 FPS on my iMac now run at approx. 40 FPS. Here are live links for comparison:

https://raw.githack.com/Mugen87/three.js/dev64/examples/index.html

If there is no way to improve the performance, it seems we have to keep using the inline approach for tone mapping and the sRGB encoding for the drawing framebuffer, at least in the context of WebGL.
The slowdown is to be expected. I believe you now have an FP16 or FP32 linear render target, right? FP16 may lead to some speed improvements over FP32; you could try that. Just using that much memory could slow things down a lot, even without considering the separate pass. And then a separate pass that reads this and writes to an sRGB output is another slowdown.

I would suggest testing what the frame rate would be if you wrote to the main linear render target as RGBA8 and kept the separate pass the same as it is. I'm trying to separate out the cost of FP16/FP32 on the render time versus just the secondary-pass architecture.
I think it is reasonable to offer this separate-pass mode as an option that people can enable. Most people do not need it, thus most people can live with the high-speed all-in-one mode, and only those who need the higher quality can enable that mode after the fact.
Yes, it's FP16. Testing with …
Tested on an iMac with Chrome. |
So adding in the extra pass reduces it from 60 FPS to 47 FPS. That is pretty significant, more than I thought it would be. I assume this isn't a new iMac, but rather an older one, right? How old? What specific GPU/CPU?
Yes, it is almost eight years old. Details: …
Here are some thoughts and a possible solution for this issue.

UPDATE: This is not a solution, mainly because it is based on the idea of blending sRGB components, while the proper 3D workflow is to blend linear-sRGB components.

So the problem is that when rendering to the … As far as I understand, the problem is related to blending and is described as follows: …
Well done! But when the renderer renders the image onto the …

This workflow gives the correct result if all materials are opaque (no blending occurs between objects), but if there are some transparent objects, then obviously this will not give the same (correct) result. Is that right?

Anyway, there is one way to solve this problem: https://codesandbox.io/s/issue-23019-0wjd8n?file=/src/index.js

So the current solution is to blend objects in sRGB and then decode back to linear. It takes an extra pass for the decoding, but this is hardly critical when using an EffectComposer.

@Mugen87 how bad is this solution?)

Thanks!
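If I'm reading the sandbox right, the decode step would be something like the following custom pass; this is a guess at the implementation, using a cheap gamma-2.2 approximation rather than the exact sRGB curve:

```js
import { ShaderPass } from 'three/examples/jsm/postprocessing/ShaderPass.js';

const SRGBToLinearShader = {
  uniforms: { tDiffuse: { value: null } },
  vertexShader: /* glsl */ `
    varying vec2 vUv;
    void main() {
      vUv = uv;
      gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
    }
  `,
  fragmentShader: /* glsl */ `
    uniform sampler2D tDiffuse;
    varying vec2 vUv;
    void main() {
      vec4 texel = texture2D( tDiffuse, vUv );
      // decode the sRGB-blended result back to linear
      gl_FragColor = vec4( pow( texel.rgb, vec3( 2.2 ) ), texel.a );
    }
  `,
};

composer.addPass(new ShaderPass(SRGBToLinearShader));
```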
@iuriiiurevich I believe you've understood why EffectComposer works differently, but may have missed @WestLangley's comment above: the behavior with EffectComposer is correct, and it's the default behavior in three.js that is (for most purposes) wrong.

I'm not sure there's anything we can change here... the 'inline' approach is dramatically cheaper, and it would be hard to justify removing it entirely. EffectComposer behaves correctly and is available when the cost is acceptable.

I would also point out that pmndrs/postprocessing has done a good thing in combining post-processing passes: color space transforms, tone mapping, and other effects can be done in a single additional pass rather than two, which would be cheaper than implementing this with three.js's own EffectComposer today. I see @Mugen87 already combined two passes in his demo above.

tl;dr I would vote to close this. Per #23614 (comment) there might be value in an 'unlit' color management workflow that doesn't go through Linear-sRGB, which would mean alpha blending happens in sRGB as a side effect. Beyond that (which I'm not even sure about) there doesn't seem to be anything we can change.
@donmccurdy As far as I understand, when a user is faced with the fact that the colors in Three.js (when using a Render Target) and, for example, in Photoshop are blended differently, there are two main options: …
Is it correct? |
To clarify one more thing: what I'm referring to as "correct" here is the behavior of blending Linear-sRGB components, instead of blending sRGB components. Photoshop has an option to enable/disable that ("Blend RGB Colors Using Gamma"); the preferred behavior may be different when working with 2D graphics, photography, 3D rendering, etc.

In a WebGL 3D renderer our goal is to blend colors in Linear-sRGB, and not sRGB [1]. Doing that requires post-processing, which can be expensive. Two cases:

1. With post-processing (e.g. EffectComposer + GammaCorrectionShader), the sRGB encode happens in a final pass, so blending happens in Linear-sRGB.
2. Without post-processing, the sRGB encode happens inline in each material's shader, so blending happens in sRGB.
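To make the difference concrete, here is the arithmetic for a 50%-opaque mid-grey (sRGB 0.5) drawn over black in both cases, in plain JS, using the 2.2-gamma approximation of the sRGB transfer functions:

```js
const srgbToLinear = (c) => Math.pow(c, 2.2);
const linearToSrgb = (c) => Math.pow(c, 1 / 2.2);

const src = 0.5, dst = 0.0, alpha = 0.5;

// Case 1: blend in Linear-sRGB, encode once at the end (post-processing).
const correct = linearToSrgb(srgbToLinear(src) * alpha + srgbToLinear(dst) * (1 - alpha));

// Case 2: encode inline, then blend the already-encoded values (default).
const inline = src * alpha + dst * (1 - alpha);

console.log(correct.toFixed(3)); // ~0.365: the post-processed result is brighter
console.log(inline.toFixed(3));  // 0.250
```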
We cannot change (2), and we can't force all users to choose post-processing, but either way no hacks are required. Users can choose EffectComposer + GammaCorrectionShader if the performance cost is acceptable. Some applications will need post-processing for other reasons.

[1] To be fair, some developers do use three.js for working with 2D graphics and unlit workflows, and they might prefer blending in sRGB. That's fine, and I wish we could offer a simple toggle for this, but technically our hands are tied.
Yes, looks like I really missed it and got a bit confused :) So, if the current behaviour does confuse people, maybe this should be explained somehow in the Color Management documentation?
Yes, it's unfortunate that the Color Management documentation does not currently discuss post-processing at all. Personally I'm using https://github.com/pmndrs/postprocessing since it handles color management better, IMHO (and it does handle blending the same way). So I've been delaying documenting the built-in post-processing color management.
Minor update: This is how the OP's fiddle should look now that sRGB output and color management are enabled by default since …

Next step: #26129
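For anyone landing here now, a quick sketch of the current defaults in recent releases (shown explicitly here for clarity; neither line is required):

```js
import * as THREE from 'three';

// Both of these are the defaults in recent releases:
THREE.ColorManagement.enabled = true;
const renderer = new THREE.WebGLRenderer();
renderer.outputColorSpace = THREE.SRGBColorSpace;
```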
As pointed out in #23019 (comment) and #23019 (comment), this issue can be closed. We might want to note the usage of the new …
Describe the bug
When using sRGB textures (such as in 3DS models), the GammaCorrectionShader seems to brighten the texture more than when the EffectComposer is not used. The GammaCorrectionShader is needed with the EffectComposer in order to produce colors similar to rendering without the EffectComposer. I have not been able to upgrade past r111 because of this issue.
As a side note, a similar problem was seen with generic colored shapes, but it appears that setting the clearColor on the RenderPass can help in that case; it does not help with textured objects.
To Reproduce
Live example
The following fiddle demonstrates the issue (top is no postprocessing, bottom is using postprocessing):
https://jsfiddle.net/s5tpxq1g/1/
You can see how the bottom is brighter for the texture, and a lower opacity makes the issue more apparent.
Expected behavior
Colors match when using GammaCorrectionShader in an EffectComposer compared to not using EffectComposer.
Screenshots
Snapshot from the fiddle (top half does not use EffectComposer, bottom half does). The left object shows that a normal object (with clearColor set correctly) does not have an issue, but the right object, having textures, differs in brightness:
Platform: