usability of swapchain in a native context #88
Comments
It would help to be more precise. Can you describe what parts are a problem for you? Is it the explicit present? The extra present mode argument? The view vs texture difference?
My bad. The present mode isn't really a problem, since we would just go with fifo as that's the safest option.
@kvark there seem to be a bunch of wgpu-isms there: there's no "swapchainstatus" in this header, and the view isn't optional, unless I'm missing something.
Right, so that's what the issue is about: agreeing on the API between wgpu and Dawn. Unlike WebGPU, in webgpu-native the presentation is expected to go through the actual native API presentation primitives. This means it can fail. Presenting in Vulkan can result in a number of errors, and even when it succeeds, it may carry a hint for the user to re-create the swapchain. If this isn't exposed to the user of the webgpu header, how is the implementation supposed to handle these situations?
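For concreteness, here is a minimal C sketch of the situation described above: the underlying Vulkan present call returns results that the header currently gives the implementation no way to report. The `VK_*` values are real Vulkan results; the `PresentStatus` enum is purely hypothetical and not part of webgpu.h.

```c
#include <vulkan/vulkan.h>

// Hypothetical status an implementation might want to surface to the app.
typedef enum {
    PRESENT_STATUS_OK,          // presented as requested
    PRESENT_STATUS_SUBOPTIMAL,  // presented, but recreating the swapchain is recommended
    PRESENT_STATUS_OUTDATED,    // not presented; the swapchain must be recreated
    PRESENT_STATUS_LOST,        // surface or device lost; rebuild from scratch or give up
} PresentStatus;

static PresentStatus present(VkQueue queue, const VkPresentInfoKHR* info) {
    VkResult result = vkQueuePresentKHR(queue, info);
    switch (result) {
        case VK_SUCCESS:               return PRESENT_STATUS_OK;
        case VK_SUBOPTIMAL_KHR:        return PRESENT_STATUS_SUBOPTIMAL;
        case VK_ERROR_OUT_OF_DATE_KHR: return PRESENT_STATUS_OUTDATED;
        case VK_ERROR_SURFACE_LOST_KHR:
        case VK_ERROR_DEVICE_LOST:
        default:                       return PRESENT_STATUS_LOST;
    }
}
```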
We discussed the shape of the swapchain API in #13, but that was a while ago, and Dawn hasn't even completely caught up to that proposal yet (we don't have it for the GL backend). My understanding is that the implementation should do whatever it takes to present the swapchain to the screen, including resizing or a copy. That's inefficient, but it makes things robust. Let's take advantage of this issue to redesign the swapchain API in light of the experience you have in wgpu (we had almost no feedback on the swapchain in Dawn). What would be the ideal shape in your opinion? (It could be in Rust and we'll figure out how to lower it to C.)
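As a rough illustration of the "do whatever it takes" approach described above (a hypothetical sketch, not Dawn's actual implementation; every type and helper here is made up), the present path would silently absorb any mismatch between the window and the swapchain:

```c
#include <stdint.h>

// Hypothetical types and helpers, for illustration only.
typedef struct { uint32_t width, height; } Extent2D;
typedef struct SwapChainImpl SwapChainImpl;
Extent2D query_window_size(const SwapChainImpl* sc);            // hypothetical
Extent2D swapchain_size(const SwapChainImpl* sc);               // hypothetical
void     recreate_swapchain(SwapChainImpl* sc, Extent2D size);  // hypothetical
void     native_present(SwapChainImpl* sc);                     // hypothetical

// "Robust" present: never fail, even if the window changed under us.
void robust_present(SwapChainImpl* sc) {
    Extent2D window  = query_window_size(sc);
    Extent2D current = swapchain_size(sc);
    if (window.width != current.width || window.height != current.height) {
        // Transparently recreate (or blit into a correctly sized image).
        // The application never sees an error, but pays a hidden cost.
        recreate_swapchain(sc, window);
    }
    native_present(sc);
}
```

That hidden recreate/copy is exactly the cost the next comment objects to.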
I vehemently disagree with this. WebGPU on native needs to be able to expose the portable subset of native swapchains for presentation. Doing an implicit copy/resize kills the performance expectations (and perf portability) for the user.
I agree that portable performance is really important. But portable behavior in the face of various underlying APIs, compositor architectures, etc. is also really important. Too much developer time in the world is spent figuring out how to avoid flickering content on window resize, or how to cope with baroque Linux window managers (I've had to handle that at least 3 different times for ANGLE). This is where I think a "status" flag that says "OK / SUBOPTIMAL / LOST" on the swapchain (or along with the current texture) makes sense: applications should make sure they have an OK flag in the regular case, but it's ok for them to be SUBOPTIMAL and still work during things like resize, or on weird configurations if they don't want to adapt.
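A rough sketch of how that status could appear in the header (all names here are illustrative, not an agreed API; only `WGPUTextureView` and `wgpuSwapChainPresent` come from the existing webgpu.h):

```c
#include "webgpu.h"

// Illustrative only: a status returned alongside the current texture.
typedef enum WGPUSwapChainStatus {
    WGPUSwapChainStatus_OK,          // everything matches; the common case
    WGPUSwapChainStatus_Suboptimal,  // usable, but recreating is recommended (e.g. mid-resize)
    WGPUSwapChainStatus_Lost,        // unusable; the swapchain must be rebuilt
} WGPUSwapChainStatus;

typedef struct WGPUSwapChainFrame {
    WGPUTextureView view;            // texture view to render into this frame
    WGPUSwapChainStatus status;      // OK / Suboptimal / Lost, as proposed above
} WGPUSwapChainFrame;

// A typical application loop under this shape (the getter is illustrative):
//
//   WGPUSwapChainFrame frame = wgpuSwapChainGetCurrentFrame(swapChain);
//   if (frame.status == WGPUSwapChainStatus_Lost) { /* recreate and retry */ }
//   /* render to frame.view; Suboptimal can be tolerated during a resize */
//   wgpuSwapChainPresent(swapChain);
```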
Recent WebGPU IDL seems to have dropped references to swapchain entirely. Have I misunderstood this? If this is correct, does webgpu.h plan to mirror the standard API? I guess I'm trying to understand the relationship, and I'm very new to looking at webgpu.
Yes, "webgpu.h" needs to catch up to this change and remove swap chains as a concept. |
I'd like to wait until we're sure of the shape of the swapchain stuff in WebGPU before re-designing it for webgpu.h.
What about just encoding the image buffers into a WebRTC video stream via the Chrome canvas element?
This seems completely unrelated? The WSI already gives a way to render to a canvas, and WebRTC can source from a canvas.
@Kangz I'm confused. Isn't video encode a GPU function these days, like vertex shaders, fragment shaders, and GPU compute? Video encode is also a part of frameworks like Unity and Unreal Engine, right? Why is video encode unrelated? As for @kvark's point about removing swap chains, I agree. And video encode could be a step in this direction?
WebGPU doesn't expose video encoding capabilities, and doing so is not part of the medium-term roadmap. Encoding videos is not related to swapchains either, because in all native APIs you pass any texture (with the correct format) to the encoder, and it's orthogonal to swapchains. On the Web specifically you can put the WebGPU canvas in WebRTC and it should work, but no such capability is designed for native.
Is the canvas implementation efficient? Will it offload to hardware? What about the long term roadmap?
That's browser-specific and not linked to webgpu-headers. For Chromium specifically I think video is hardware encoded, for some formats, on some OSes, but I don't know the details. The long-term roadmap for webgpu-headers is ???
Long-term roadmap for WebRTC video encode as part of webgpu-native? It seems to me that games made for streaming from a GPU server would find this useful.
(If I'm wrong, please comment or reopen; if there are other things in this thread that I missed, please open a new issue)
At Deno, we are currently working on adding swapchain support, which would allow rendering to a window. The problem is that Deno strives to be browser compatible, but that is difficult because swapchain usage wasn't designed for a native context.