Implement external texture support for XR plugins that require render… #51179
Conversation
@m4gr3d just tagging you here, still early days but this is going to be important for Quest support.
Glad to see the beginning of this!! Btw, just a thought, but I'm wondering if we should come up with a better name than
Totally agree with this, the name has been confusing for some time. I'll give it some thought but am very open to suggestions, because so far I can't think of any good alternative either :)
How do you envision supporting WebXR once we do have OpenGL/WebGL? The methods still use "texture" in the name, and, of course, in OpenGL/WebGL you can render things to a texture, so that would probably be the expectation for those methods. However, there is no way to convert a framebuffer to a texture (other than making a new texture and then copying it to the framebuffer, which is what we're doing in 3.x now, and it's slow), so with WebXR we'd really need to refer to a framebuffer.
Seeing it will be a completely separate renderer, I think we'll have specific entry points for this, so I would say we'd add something specific there. It's too early to tell, because we have no idea what this renderer will look like and whether we'll also support multiview here or ask for framebuffers for each eye :)
Ok, so it'll be alright for us to add new virtual methods to the interface later on?
WebXR can do multiview, and it uses a separate viewport on the same framebuffer for each eye. But I guess it'll depend on whether the OpenGL renderer will support multiview.
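To make the texture-versus-framebuffer distinction in this thread concrete, here is a purely illustrative C++ sketch; the class and method names are hypothetical and are not the actual Godot XR interface API:

```cpp
#include <cstdint>

// Hypothetical shapes for future virtual entry points; names invented for illustration only.
class XRRenderTargetHooksExample {
public:
	virtual ~XRRenderTargetHooksExample() {}

	// Vulkan-style runtimes (OpenXR, Oculus) hand the engine textures/images to render into,
	// so a texture-oriented hook fits them naturally.
	virtual uint64_t get_external_color_texture(uint32_t p_view) { return 0; }

	// WebXR (without the Layers API) only exposes an opaque WebGL framebuffer, so an
	// OpenGL/WebGL renderer might instead need a framebuffer-oriented hook like this.
	virtual uint64_t get_external_framebuffer() { return 0; }
};
```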
Godot 4 already breaks all compatibility with Godot 3, so this is the time to do so. My worry is that the OpenGL renderer won't be in before 4.1, but even then I think we can get away with adding things as long as we don't take anything away.
Exactly, we'll get there when we get there and then make the right decision :)
This needs to be either integrated into #52003 or adjusted once that PR is merged.
I'm sure this PR is super out-of-date at this point and will probably need to be rebuilt in a new PR, but I just wanted to post some research notes regarding WebXR.

I've been looking into how multiview can be implemented with WebXR, and it looks like using the new-ish WebXR Layers spec (which is supported in Chrome and the Oculus Browser, plus there is a polyfill that emulates the API on browsers that don't yet support it) we can get WebXR to give us two 2D texture arrays (one for the color buffer and another for the depth-stencil buffer) along with the texture index for each "view" (which in most cases means a left and a right eye). Of course, that's assuming I'm understanding it correctly; I've only read the spec, I haven't actually tried to use it yet.

What I think we need from a WebXR / OpenGL perspective is a way to set up a render target that points at these external 2D texture arrays, and possibly a way to say which view maps to which index. WebXR gives us the texture index for each view, and since the spec doesn't dictate that these line up with the view number, assuming they are the same could lead to the left and right eye being swapped.
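As a rough illustration of what "point a render target at the external texture arrays" could mean, here is a minimal GLES 3 sketch; the framebuffer, texture IDs, and index variable are assumptions standing in for what the WebXR Layers XRWebGLSubImage would supply, and none of this is actual Godot code:

```cpp
#include <GLES3/gl3.h>

// Assumed inputs: xr_fbo is a framebuffer we created, the two texture IDs are the
// external 2D texture arrays handed to us by the runtime, and webxr_image_index is
// the index WebXR reported for the view we are about to render.
void bind_external_view_target(GLuint xr_fbo,
		GLuint external_color_tex_array,
		GLuint external_depth_stencil_tex_array,
		GLint webxr_image_index) {
	glBindFramebuffer(GL_DRAW_FRAMEBUFFER, xr_fbo);

	// Attach the array layer WebXR reported for this view. The spec doesn't promise this
	// index matches Godot's view number, so the mapping must come from WebXR itself.
	glFramebufferTextureLayer(GL_DRAW_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
			external_color_tex_array, 0, webxr_image_index);
	glFramebufferTextureLayer(GL_DRAW_FRAMEBUFFER, GL_DEPTH_STENCIL_ATTACHMENT,
			external_depth_stencil_tex_array, 0, webxr_image_index);
}
```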
This is superseded by #65227.
For a number of XR plugins (namely WebXR, Oculus and OpenXR), the XR runtime supplies us with textures that we need to render into. Especially on mobile platforms such as the Quest, we'll want to do so as directly as possible.
This PR makes it possible to supply an external texture for the viewport, and an external depth texture for 3D rendering. When these are supplied, Godot's own texture objects are not created.
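As a minimal sketch of what "not creating Godot's own texture objects" implies on the Vulkan side (function and variable names are assumptions, not the PR's actual code): the renderer only creates a view over the runtime-owned image; it never allocates or binds memory for it.

```cpp
#include <vulkan/vulkan.h>

// Assumed inputs: `device` is our VkDevice, `external_image` is the VkImage the XR
// plugin gave us, and the format/layer count must match what the runtime allocated.
VkImageView wrap_external_image(VkDevice device, VkImage external_image) {
	VkImageViewCreateInfo view_info = {};
	view_info.sType = VK_STRUCTURE_TYPE_IMAGE_VIEW_CREATE_INFO;
	view_info.image = external_image;                 // owned by the XR runtime, not by us
	view_info.viewType = VK_IMAGE_VIEW_TYPE_2D_ARRAY; // one array layer per view for multiview
	view_info.format = VK_FORMAT_R8G8B8A8_SRGB;       // must match the runtime's swapchain format
	view_info.subresourceRange.aspectMask = VK_IMAGE_ASPECT_COLOR_BIT;
	view_info.subresourceRange.levelCount = 1;
	view_info.subresourceRange.layerCount = 2;

	VkImageView view = VK_NULL_HANDLE;
	vkCreateImageView(device, &view_info, nullptr, &view); // no VkDeviceMemory is allocated here
	return view;
}
```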
Do note that there is still a copy involved, as the 3D render buffer is still created by Godot. However, in combination with the new subpass logic this should handle things really nicely on a device like the Quest (or so we hope). Fully nullifying the performance penalty will require implementing VRS, as the render buffers we create do not support foveated rendering.
Also note that for WebXR we're not actually getting textures but a WebGL framebuffer. This is something we probably can't support until we have stereoscopic OpenGL/WebGL support in Godot 4, so it is not part of this PR.
OpenXR has its own API for creating swapchains to render into; these do not just set up the Vulkan VkImage to render into, but also enable various other XR-specific features. As a result it can't use images created by Godot.

This now also applies to 2D viewports. OpenXR has a feature for compositing layers, including 2D layers in 3D space. This bypasses the lens distortion used for the 3D render, resulting in more legible text. I still need to add support for setting an alternative destination for this.
In many of these use cases, having Godot render into its own framebuffers and then copying the end result would void most of the optimizations required for XR, so it is not a viable path forward.
There is one outstanding TODO in this PR: we're often using a swapchain managed by the XR runtime, meaning each frame is rendered into a different texture in a 1-2-3-1-2-3 round-robin fashion. Right now this means updating the framebuffers and recreating the shared texture objects every frame. This needs a better implementation, either by updating the existing framebuffers in place or by adding a map so that our shared texture objects, once created for each swapchain texture, are re-used.
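One possible shape for that map, as a sketch only (types and names are assumptions, this is not the PR's implementation): key the cached wrapper objects on the swapchain's VkImage so each image in the round-robin is wrapped exactly once.

```cpp
#include <vulkan/vulkan.h>
#include <unordered_map>

struct ExternalTargetEntry {
	VkImageView view = VK_NULL_HANDLE;           // shared texture wrapper for the runtime's image
	VkFramebuffer framebuffer = VK_NULL_HANDLE;  // framebuffer built on top of that view
};

class ExternalTargetCache {
public:
	ExternalTargetEntry &get_or_create(VkDevice device, VkImage image) {
		auto it = entries.find(image);
		if (it != entries.end()) {
			return it->second; // re-use the wrappers created the first time we saw this image
		}
		ExternalTargetEntry entry;
		// ... create the image view and framebuffer for `image` here (see the earlier sketch) ...
		(void)device;
		return entries.emplace(image, entry).first->second;
	}

private:
	std::unordered_map<VkImage, ExternalTargetEntry> entries;
};
```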