
usability of swapchain in a native context #88

Closed
crowlKats opened this issue May 20, 2021 · 19 comments
Labels
presentation Presenting images to surfaces like windows and canvases

Comments

@crowlKats

At Deno, we are currently working on adding swapchain support, which would allow rendering to a window. The problem is that Deno strives to be browser-compatible, but that is difficult because swapchain usage wasn't designed with a native context in mind.

@Kangz
Collaborator

Kangz commented May 20, 2021

It would help to be more precise. Can you describe what parts are a problem for you? Is it the explicit present? The extra present mode argument? The view vs texture difference?

@crowlKats
Author

My bad. So the issues are:
a) the view vs texture difference, and on top of that the view is optional, whereas in the spec the texture is always returned;
b) the whole SwapChainStatus thing.

Present mode is not really a problem, as we would just go with FIFO since that's the safest option.
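
For clarity, here is a rough sketch of the mismatch being described. The wgpu-native shape below is an approximation for illustration, not the exact declarations from either project:

```c
#include <webgpu/webgpu.h>

// webgpu.h at the time: always hands back a view, with no status channel.
//   WGPUTextureView view = wgpuSwapChainGetCurrentTextureView(swapChain);
//
// The JS spec of that era instead always returns a texture:
//   const texture = swapChain.getCurrentTexture();
//
// wgpu-native additionally reported a status and could yield no view at all,
// roughly along these lines (illustrative names only):
typedef enum {
    SketchSwapChainStatus_Good,
    SketchSwapChainStatus_Suboptimal,
    SketchSwapChainStatus_Timeout,
    SketchSwapChainStatus_Outdated,
    SketchSwapChainStatus_Lost,
} SketchSwapChainStatus;

typedef struct {
    SketchSwapChainStatus status;
    WGPUTextureView view; // may be NULL when status is not Good
} SketchSwapChainOutput;
```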

@Kangz
Collaborator

Kangz commented May 25, 2021

@kvark there seem to be a bunch of wgpu-isms there, as there's no "SwapChainStatus" in this header and the view isn't optional, unless I'm missing something.

@kvark
Collaborator

kvark commented May 31, 2021

Right, so that's what this issue is about: agreeing on the API between wgpu and Dawn. Unlike WebGPU, in webgpu-native presentation is expected to go through the actual native API presentation primitives. This means it can fail. Presenting in Vulkan can result in a number of errors, and even when it succeeds, it may carry a hint that the user should re-create the swapchain. If this isn't exposed to the user of the webgpu header, how is the implementation supposed to handle these situations?
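
To make the failure modes concrete, here is a minimal Vulkan sketch of what the implementation sees when presenting (error handling trimmed to the cases relevant to this discussion):

```c
#include <vulkan/vulkan.h>

// Present, and surface the hints/failures that a WebGPU implementation has to
// do *something* with, whether or not the header exposes them.
VkResult present_and_check(VkQueue queue, const VkPresentInfoKHR *presentInfo) {
    VkResult result = vkQueuePresentKHR(queue, presentInfo);
    switch (result) {
    case VK_SUCCESS:
        return result; // presented normally
    case VK_SUBOPTIMAL_KHR:
        // Presentation succeeded, but the swapchain no longer matches the
        // surface exactly: a hint that it should be re-created.
        return result;
    case VK_ERROR_OUT_OF_DATE_KHR:
        // The swapchain is unusable; it must be re-created before the next
        // acquire/present.
        return result;
    default:
        return result; // device lost, out of memory, surface lost, ...
    }
}
```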

@Kangz
Collaborator

Kangz commented Jun 1, 2021

We discussed the shape of the swapchain API in #13, but that was a while ago, and Dawn hasn't even completely caught up to that proposal yet (we don't have it for the GL backend). My understanding is that the implementation should do whatever it takes to present the swapchain to the screen, including resizing or a copy. That's inefficient, but it makes things robust.

Let's take advantage of this issue to redesign the swapchain API in light of the experience you have in wgpu (we had almost no feedback on the swapchain in Dawn). What would be the ideal shape in your opinion? (It could be in Rust, and we'll figure out how to lower it to C.)

@kvark
Collaborator

kvark commented Jun 9, 2021

> My understanding is that the implementation should do whatever it takes to present the swapchain to the screen, including resizing or a copy.

I vehemently disagree with this. WebGPU on native needs to be able to expose the portable subset of native swapchains for presentation. Doing an implicit copy/resize kills the performance expectations (and perf portability) for the user.

@Kangz
Collaborator

Kangz commented Jun 10, 2021

I agree that portable performance is really important. But portable behavior in the face of various underlying APIs, compositor architectures, etc. is also really important. Too much developer time in the world is spent figuring out how to avoid flickering content on window resize, or working around baroque Linux window managers (I've had to handle that at least 3 different times for ANGLE).

This is where I think a "status" flag that says OK / SUBOPTIMAL / LOST on the swapchain (or along with the current texture) makes sense: applications should make sure they get an OK status in the regular case, but it's fine for them to be SUBOPTIMAL and still work during things like resizes, or on weird configurations they don't want to adapt to.
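
As a purely illustrative sketch of that idea (these names don't exist in webgpu.h), the per-frame handling would look like:

```c
// Hypothetical status values mirroring the description above.
typedef enum {
    FrameStatus_OK,
    FrameStatus_Suboptimal,
    FrameStatus_Lost,
} FrameStatus;

// How an application would be expected to react each frame.
void handle_frame_status(FrameStatus status) {
    switch (status) {
    case FrameStatus_OK:
        // Normal case: render and present.
        break;
    case FrameStatus_Suboptimal:
        // Still works (e.g. mid-resize, or on an odd compositor): render this
        // frame anyway, and reconfigure the swapchain when convenient.
        break;
    case FrameStatus_Lost:
        // Must re-create the swapchain before rendering again.
        break;
    }
}
```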

@tjpalmer

Recent WebGPU IDL seems to have dropped references to swapchains entirely. Have I misunderstood this? If this is correct, does webgpu.h plan to mirror the standard API? I guess I'm trying to understand the relationship; I'm very new to looking at WebGPU.

@kvark
Collaborator

kvark commented Aug 11, 2021

Yes, "webgpu.h" needs to catch up to this change and remove swap chains as a concept.

@Kangz
Collaborator

Kangz commented Aug 17, 2021

I'd like to wait until we're sure of the shape of the swapchain stuff in WebGPU before re-designing it for webgpu.h. This is one of the areas with the most churn at the moment.

@unicomp21

unicomp21 commented Mar 19, 2022

What about just encoding the image buffers into a WebRTC video stream via the Chrome canvas element?

@Kangz
Collaborator

Kangz commented Mar 21, 2022

This seems completely unrelated? The WSI already gives a way to render to a canvas, and WebRTC can source from a canvas.

@unicomp21

@Kangz I'm confused. Isn't video encode a GPU function these days? Like vertex shaders, fragment shaders, and GPU compute? Video encode is also part of frameworks like Unity and Unreal Engine, right? Why is video encode unrelated?

Per @kvark's comment about removing swap chains, I agree. And video encode could be a step in that direction?
#88 (comment)

@Kangz
Collaborator

Kangz commented Mar 21, 2022

WebGPU doesn't expose video encoding capabilities, and doing so is not part of the medium-term roadmap. Encoding videos is not related to swapchains either, because in all native APIs, when you encode a video, you pass in any texture (with the correct format); it's orthogonal to swapchains.

On the Web specifically you can feed the WebGPU canvas into WebRTC and it should work, but in native no such capability is designed for that.

@unicomp21

Is the canvas implementation efficient? Will it offload to hardware?

What about the long term roadmap?

@Kangz
Collaborator

Kangz commented Mar 21, 2022

That's browser-specific, and not linked to webgpu-headers. For Chromium specifically, I think video is hardware-encoded for some formats on some OSes, but I don't know the details. The long-term roadmap for webgpu-headers is ???

@unicomp21

unicomp21 commented Mar 21, 2022

A long-term roadmap for WebRTC video encode as part of webgpu-native? It seems to me that games made for streaming from a GPU server would find this useful?

@kainino0x
Collaborator

This was fixed in #203, according to #197.
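
For anyone landing here later, a usage sketch of the surface-based API that replaced swapchains, as it looked around the #203 era (exact field and enum names have continued to shift between header revisions):

```c
#include <webgpu/webgpu.h>

// Render one frame to an already-configured WGPUSurface.
void render_frame(WGPUSurface surface) {
    WGPUSurfaceTexture surfaceTexture;
    wgpuSurfaceGetCurrentTexture(surface, &surfaceTexture);

    // The status discussed in this thread is now part of the result: the
    // application sees suboptimal/outdated surfaces instead of the
    // implementation papering over them with copies.
    if (surfaceTexture.status != WGPUSurfaceGetCurrentTextureStatus_Success) {
        // e.g. reconfigure the surface after a resize, then try again
        return;
    }

    // ... create a view of surfaceTexture.texture, record and submit work ...

    wgpuSurfacePresent(surface); // explicit present, as on the native APIs
}
```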

@kainino0x
Collaborator

(If I'm wrong, please comment or reopen; if there are other things in this thread that I missed, please open a new issue)
