
MeshPhysicalMaterial transmission doesn't work correctly in WebXR immersive mode #21911

Closed
takahirox opened this issue May 28, 2021 · 5 comments

Comments

@takahirox
Collaborator

Describe the bug

We improved MeshPhysicalMaterial transmission support in #21884, but transmission doesn't work correctly in WebXR immersive mode yet.

For transmissive objects, we use multiple passes: first we render the opaque objects to a render target, and then we render the transmissive objects, sampling that render target.
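
Roughly, the idea looks like the following (a conceptual sketch of the two-pass approach, assuming a renderer, a camera, and a scene already split into opaque and transmissive parts; this is not the actual WebGLRenderer internals and the names are illustrative):

// Conceptual sketch only; the real render target and object sorting live inside WebGLRenderer.
const size = renderer.getDrawingBufferSize( new THREE.Vector2() );
const transmissionTarget = new THREE.WebGLRenderTarget( size.x, size.y );

// Pass 1: draw the opaque objects into the render target.
renderer.setRenderTarget( transmissionTarget );
renderer.render( opaqueScene, camera );

// Pass 2: draw the transmissive objects to the framebuffer; their shader
// samples transmissionTarget.texture to compute what is seen "through" them.
renderer.setRenderTarget( null );
renderer.render( transmissiveScene, camera );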

The problem in VR immersive mode is that the current transmission shader code doesn't account for stereoscopic rendering, so the coordinates the shader calculates for sampling the render target are wrong.
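
For context, in immersive VR the renderer draws with an ArrayCamera holding one sub-camera per eye, each with its own view and projection matrices, while the transmission pass has only a single render target to sample. A small sketch, assuming an active immersive session (the exact getCamera signature has varied between releases):

// Sketch assuming an active immersive VR session.
const xrCamera = renderer.xr.getCamera( camera ); // a THREE.ArrayCamera
console.log( xrCamera.cameras.length ); // 2 in stereo, one sub-camera per eye
// Each sub-camera has its own viewport and projection matrix, but the
// transmission shader derives its sampling coordinates as if there were a
// single camera, so the computed UVs don't line up for either eye.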

The problem in AR immersive mode is that we render the objects on top of the camera feed, but the camera feed isn't rendered into the render target, so transmissive objects can't show the camera feed through them (especially with transparent = false on MeshPhysicalMaterial).

To Reproduce

Steps to reproduce the behavior:

  1. Edit one of the WebXR examples to use MeshPhysicalMaterial with transmission = 1 (see the sketch after this list)
  2. Open the example
  3. Enter immersive mode
  4. See that transmission doesn't work correctly
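
A minimal material change for step 1 could look like this (illustrative sketch; mesh stands for any object already present in the chosen example's scene):

// Illustrative material for reproducing the issue in a WebXR example;
// `mesh` is assumed to be an object already in the example's scene.
mesh.material = new THREE.MeshPhysicalMaterial( {
	color: 0xffffff,
	metalness: 0,
	roughness: 0,
	transmission: 1 // fully transmissive
} );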

Expected behavior

Transmission works correctly even in WebXR immersive mode.

Screenshots

(screenshot attached in the original issue)

Platform:

  • Device: Any WebXR device
  • OS: Any
  • Browser: Any
  • Three.js version: dev
@takahirox
Collaborator Author

takahirox commented Jun 6, 2021

For AR, I read through the WebXR AR spec again and realized that there doesn't seem to be a way to capture the camera stream used in AR immersive mode and render it to a texture (render target).

The WebXR Raw Camera Access API has been proposed, but it is still under discussion.

For VR, performance is very important for a good experience, but transmission is costly.

So, for AR/VR immersive mode, how about using the stub we had until r128?

https://github.com/mrdoob/three.js/blob/r128/src/renderers/shaders/ShaderLib/meshphysical_frag.glsl.js#L113-L116

#ifdef TRANSMISSION
	diffuseColor.a *= mix( saturate( 1. - totalTransmission + linearToRelativeLuminance( reflectedLight.directSpecular + reflectedLight.indirectSpecular ) ), 1.0, metalness );
#endif

It would resolve the problems I mentioned above. Of course, it's just alpha-based transparency; there is no refraction support, for example. But it's probably better than broken.
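
To illustrate what that amounts to for users (a rough, illustrative approximation rather than the actual shader behavior): transmission would degrade to plain alpha blending, roughly like a transparent material with reduced opacity.

// Rough user-facing approximation of the r128 stub, for illustration only:
// transmission falls back to simple alpha blending, so the material behaves
// roughly like a transparent one with a lowered opacity.
const approx = new THREE.MeshPhysicalMaterial( {
	transparent: true,
	opacity: 0.25, // stands in for the alpha derived from transmission, specular and metalness
	metalness: 0,
	roughness: 0
} );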

@elalish
Contributor

elalish commented Jun 7, 2021

Agreed, better than broken. We can iterate on it from there.

@takahirox
Collaborator Author

@mrdoob What do you think of this temporary workaround?

@mrdoob mrdoob modified the milestones: r130, r131 Jun 30, 2021
@mrdoob mrdoob modified the milestones: r131, r132 Jul 28, 2021
@mrdoob mrdoob removed this from the r132 milestone Aug 26, 2021
@mrdoob
Owner

mrdoob commented Aug 26, 2021

@mrdoob What do you think of this temporary workaround?

I've just tried it and it doesn't seem to produce good results...

(screenshot attached: Screen Shot 2021-08-26 at 9.32.51 AM)

@mrdoob
Owner

mrdoob commented Aug 26, 2021

#22426 fixed VR and #22425 kind of fixed AR

@mrdoob mrdoob closed this as completed Aug 26, 2021