WebGL: RenderTexture frames do not render correctly #6057
Comments
May be #6017
Have spent quite a while on this and tested and ruled out the following approaches so far:

1 - Using the WebGL
2 - Manipulating the UVs during rendering (i.e. in the pipelines). This is possible, but in testing it conflicted with the 'crop' feature (essentially breaking it entirely). So it could maybe be used as an edge case, but it also feels wrong because of the cost of doing this literally every single frame, for every single sprite using the RT as its texture.
3 - Inverting the Frame coordinates when adding a frame to a Texture. I'm in two minds over this approach. It works quite well, but it's illogical and also suffers from being inconsistent between Canvas and WebGL (i.e. reading a Frame's coordinates back would give opposite values between the two). On the plus side, it's a one-time solution, so you don't need to do it during rendering.

Another approach I'm considering, but haven't yet tested, is simply letting you flag a RenderTexture as "not used for display", i.e. you cannot add the Render Texture Game Object itself to the display list. If we do this, then anything drawn to the RT can be drawn inverted, which solves everything and doesn't require inverted Frame coordinates. If the base Render Texture is 'upside down', then any Sprite using a Frame from it will just work perfectly from the get-go. I think I'll have to try this next.

Thinking about it more, as I write this, it could be possible to still display the RT in the game, but have it flipped when it is rendered.
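A minimal sketch of what approach 3 could look like, assuming a hypothetical helper around Texture.add; only the coordinate math is being illustrated, and none of this is actual Phaser source:

```js
// Hypothetical illustration of approach 3: when registering a frame on a
// WebGL-backed texture (such as a RenderTexture), invert the Y coordinate
// once at add time, instead of adjusting UVs on every render.
function addFrameFlipped (texture, name, x, y, width, height)
{
    const sourceHeight = texture.source[0].height;

    // Measure the frame from the bottom of the texture rather than the top.
    return texture.add(name, 0, x, sourceHeight - y - height, width, height);
}
```

The drawback noted above still applies: the stored Frame coordinates would then read back differently between Canvas and WebGL.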
@photonstorm Do you have any update on the timeline for a fix for this issue? Thanks!
Spent some more time on this today - and getting close to a fix. Just trying to make it universal, which isn't easy.
Ok, a fix for this is now in the master branch. Use it as follows:

```js
const origin = this.add.renderTexture(0, 0, 256, 256).setIsSpriteTexture(true);
origin.draw('mario', 0, 0);
origin.saveTexture('test-texture');
```
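For completeness, a hedged sketch of how the saved texture might then be consumed; the frame name, sizes and coordinates here are illustrative and not from the original thread:

```js
// Add a frame to the saved texture and display it on a regular Image.
// With setIsSpriteTexture(true), the frame should no longer render flipped in WebGL.
const tex = this.textures.get('test-texture');

tex.add('topLeft', 0, 0, 0, 128, 128);

this.add.image(400, 300, 'test-texture', 'topLeft');
```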
Version
Description
Adding and referencing frames on a RenderTexture produces visual errors, depending on how that RT frame is rendered. This only occurs in WebGL mode.

The following demonstrates the expected behavior, using Canvas mode. Each quadrant/square is a separate frame on the original render texture.
The following demonstrates the exact same code, but in WebGL mode:
With some fiddling, it's apparent that there is some issue with Y-flipping; if I adjust the Y-coords of the frames*, I'm able to get some frames to render, albeit incorrectly flipped:

* ex: rt.texture.add('my-frame', 0, x, y, frameWidth, textureHeight - frameHeight); - note the 'textureHeight -' adjustment.

I believe there are two issues at hand here (based on my current understanding of the Phaser codebase):
Example Test Code
This codepen reproduces the issue - note the USE_WEBGL flag up top to easily change the current rendering mode: https://codepen.io/andymikulski/pen/oNoyBGy
The gist, in pseudocode, is really:
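A minimal sketch of that gist, based on the description above; the 'mario' key, frame names and sizes are illustrative assumptions rather than the exact codepen contents:

```js
// Draw something into a RenderTexture, save it as a texture,
// carve it into quadrant frames, then display one of those frames.
const rt = this.add.renderTexture(0, 0, 256, 256);

rt.draw('mario', 0, 0);
rt.saveTexture('rt-texture');

// Each quadrant of the 256x256 RT becomes its own frame.
rt.texture.add('top-left', 0, 0, 0, 128, 128);
rt.texture.add('top-right', 0, 128, 0, 128, 128);
rt.texture.add('bottom-left', 0, 0, 128, 128, 128);
rt.texture.add('bottom-right', 0, 128, 128, 128, 128);

// Renders correctly in Canvas mode; flipped/garbled in WebGL (the bug reported here).
this.add.image(400, 300, 'rt-texture', 'top-left');
```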
Additional Information
I have dug around the code and noticed that checking if a texture is a glTexture and y-flipping accordingly seems to work, but this requires touching MultiPipeline, Blitter, etc., and I wonder if there is a more 'central' fix available.

I'm happy to work on these solutions, but I wanted to get some sort of input before putting in the effort! Thank you!
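As a rough, hypothetical illustration of the kind of check described above (the isGLTexture flag usage and this helper are assumptions, not actual Phaser pipeline code):

```js
// Hypothetical helper: if the frame's source is backed by a WebGL texture
// (e.g. a RenderTexture), swap its V coordinates so the frame samples
// right-side up, matching what Canvas mode already produces.
function getFrameUVs (frame)
{
    if (frame.source.isGLTexture)
    {
        return { u0: frame.u0, v0: frame.v1, u1: frame.u1, v1: frame.v0 };
    }

    return { u0: frame.u0, v0: frame.v0, u1: frame.u1, v1: frame.v1 };
}
```

Each pipeline (MultiPipeline, Blitter, etc.) would need to route its UV lookups through something like this, which is why a more central fix - like the setIsSpriteTexture() approach that eventually landed - is appealing.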