Expose transmission texture creation to enable improved quality #22731

Closed
wants to merge 2 commits

Conversation

haxiomic
Contributor

@haxiomic commented Oct 25, 2021

Related issues: #22729 #22009

Description

Currently the transmission render target can have quality issues because:

  • its dimensions are uncoupled from the current viewport (fixed to 1024x1024), leading to asymmetrical warping
  • it uses NearestFilter for magnification, leading to aliasing artefacts

A fix for the dimension issue isn't trivial because WebGL1 requires power-of-two dimensions for mipmap sampling. Additionally, if multiple passes use different viewports, we need transmission targets with different dimensions. While it is possible to fully resolve these issues, the solutions introduce complexity.

In this PR I've moved creation of the transmission render target to a separate function exposed on WebGLRenderer. This allows users to override the function and implement their own transmission texture handling to best suit their application.

An added benefit is that users could customize the transmission roughness levels, which are currently implemented with a simple box filter (generateMipmaps). For example, users may want to replace this with a Gaussian filter for improved realism.
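For illustration, overriding the exposed function might look something like the sketch below. The hook name createTransmissionRenderTarget and its (width, height) signature are assumptions made for the example (the PR doesn't pin down the final API); the render target settings shown are just one possible configuration.

// Hypothetical user override of the creation hook this PR exposes.
// The method name and signature are assumptions, not the actual API;
// the target options are one possible quality-oriented setup.
import * as THREE from 'three';

const renderer = new THREE.WebGLRenderer();

renderer.createTransmissionRenderTarget = function ( width, height ) {

	// Match the current viewport instead of a fixed 1024x1024 target,
	// and use LinearFilter magnification to reduce aliasing.
	return new THREE.WebGLRenderTarget( width, height, {
		generateMipmaps: true,
		type: THREE.HalfFloatType,
		minFilter: THREE.LinearMipmapLinearFilter,
		magFilter: THREE.LinearFilter
	} );

};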

@WestLangley
Collaborator

@Mugen87 Can you comment on the advisability of this?

(I added a milestone so this does not get lost again.)

@Mugen87
Collaborator

Mugen87 commented Feb 9, 2022

I don't think we should expose the render target creation. I suggest providing a solution in the engine that should satisfy most use cases:

  • Set the size of the transmission render target to half of the drawing buffer's size. That should be a good approximation and solve the aspect-ratio issue. Adding something like the snippet below in renderTransmissionPass() should also handle resize scenarios (see the fuller sketch after this list):
_this.getDrawingBufferSize( _vector2 ).multiplyScalar( 0.5 ).floor();
_transmissionRenderTarget.setSize( _vector2.x, _vector2.y );
  • Stop WebGL 1 support for MeshPhysicalMaterial.transmission to make the above solution easier to implement.
  • Using LinearFilter for magFilter should be safe, too.
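A minimal sketch of how those three points could fit together inside renderTransmissionPass(), assuming the internal names from the snippet above (_this, _vector2, _transmissionRenderTarget); this is illustrative only, not the final implementation:

// Sketch only: WebGL 2 path, half drawing-buffer size, LinearFilter
// magnification. Names follow the renderer internals quoted above.
_this.getDrawingBufferSize( _vector2 ).multiplyScalar( 0.5 ).floor();

if ( _transmissionRenderTarget === null ) {

	_transmissionRenderTarget = new WebGLRenderTarget( _vector2.x, _vector2.y, {
		generateMipmaps: true,
		type: HalfFloatType,
		minFilter: LinearMipmapLinearFilter,
		magFilter: LinearFilter
	} );

} else {

	// setSize() is a cheap no-op when the size hasn't changed, so this
	// also covers resize scenarios on subsequent frames.
	_transmissionRenderTarget.setSize( _vector2.x, _vector2.y );

}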

@mrdoob
Owner

mrdoob commented Feb 9, 2022

@Mugen87 Your suggestion sounds good to me! 👍

@Mugen87
Collaborator

Mugen87 commented Feb 10, 2022

Closing in favor of #23450.

@Mugen87 closed this Feb 10, 2022
@Mugen87 removed this from the r138 milestone Feb 10, 2022
@haxiomic
Contributor Author

Going against the grain here, but that would be a step backward from my point of view. WebGL1 is the primary target we work with, so it would be a shame to lose transmission there. Getting the aspect ratio to match the drawing buffer helps quality in the simple case, but we're often making multiple passes of a scene at different resolutions.

I'd rather have a 1024 x 1024 buffer that worked everywhere than something bound to the canvas size and limited to WebGL2.

But better still would be some way to access the transmission buffer; it doesn't need to be documented or explicitly public, however.

@mrdoob
Owner

mrdoob commented Feb 10, 2022

> WebGL1 is the primary target we work with, so it would be a shame to lose transmission there.

Do you mind sharing some details of why WebGL1 is your primary target?

And what's your approach for getting good results with WebGL1? Maybe that's something we can add to the renderer instead.

> But better still would be some way to access the transmission buffer; it doesn't need to be documented or explicitly public, however.

We prefer not to add new methods whenever we can avoid it, to limit maintenance work.

@haxiomic
Contributor Author

> Do you mind sharing some details of why WebGL1 is your primary target?

Projects are usually mobile-first, and clients often focus on iOS for minimum-spec requirements, where WebGL2 support is sadly still lagging.
(Screenshot attached: 2022-02-10 at 16:29:04)

> And what's your approach for getting good results with WebGL1? Maybe that's something we can add to the renderer instead.

Locally I've used:

var transmissionSize = renderer.capabilities.isWebGL2 ? targetSize : floorPOT(targetSize);
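floorPOT isn't shown in the thread; a minimal sketch of such a helper, assuming a scalar size in pixels, might be:

// Hypothetical helper reconstructed from context: floor a size down to the
// nearest power of two so WebGL 1 can generate mipmaps for the target.
function floorPOT( size ) {

	return Math.pow( 2, Math.floor( Math.log2( size ) ) );

}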

The problem of separate camera passes requiring separate transmission sizes is what motivates exposing this, so the user can fix these issues themselves, but I totally get what you're saying about reducing the maintenance surface.

The perfect solution would be for three.js to allocate transmission buffers as needed, with dimensions corresponding to the render target's dimensions (and clearing them if they're not used in the next frame, for example). I use a renderTargetStore approach to manage this.
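The renderTargetStore approach isn't spelled out here; as a rough illustration (all names are hypothetical, not part of three.js or this PR), it could cache targets keyed by size and dispose any that went unused since the last frame:

// Rough illustration of a render-target store keyed by size.
import * as THREE from 'three';

class RenderTargetStore {

	constructor() {

		this.targets = new Map();
		this.usedThisFrame = new Set();

	}

	// Return a cached target of the requested size, creating it on first use.
	get( width, height, options ) {

		const key = width + 'x' + height;
		let target = this.targets.get( key );

		if ( target === undefined ) {

			target = new THREE.WebGLRenderTarget( width, height, options );
			this.targets.set( key, target );

		}

		this.usedThisFrame.add( key );
		return target;

	}

	// Call once per frame: dispose targets that were not requested this frame.
	endFrame() {

		for ( const [ key, target ] of this.targets ) {

			if ( ! this.usedThisFrame.has( key ) ) {

				target.dispose();
				this.targets.delete( key );

			}

		}

		this.usedThisFrame.clear();

	}

}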
