One thing that seems to be missing in Iced is the ability to draw a custom 3d viewport. Even when the user creates their own integration (like in the integration_wgpu example), there doesn't seem to be a way to render an image that is already on the GPU and for which the user already has a texture handle.
This would enable UIs where part of the window renders a 3d scene of some kind.
The difference between this and the integration_wgpu example is that this allows the 3d viewport to be integrated as simply another widget: it handles its own layout, and it can have widgets both above and below it as normal. The same mechanism would also cover drawing the output of some shader inside a widget.

Looking at the image pipeline, it seems like it should be pretty easy to do. The code does a bunch of extra steps to upload image data and create a TextureView, and from there, images are rendered as textures: https://github.com/iced-rs/iced/blob/master/wgpu/src/image.rs#L281-L446
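For illustration, here is roughly what that upload path looks like against a recent wgpu API (a simplified sketch, not iced's actual code; `upload_rgba` and the label are my own shorthand):

```rust
// Simplified sketch of the upload path: copy CPU-side pixels into a
// fresh texture, then create a view so the quad pipeline can sample it.
fn upload_rgba(
    device: &wgpu::Device,
    queue: &wgpu::Queue,
    width: u32,
    height: u32,
    rgba: &[u8],
) -> wgpu::TextureView {
    let size = wgpu::Extent3d { width, height, depth_or_array_layers: 1 };
    let texture = device.create_texture(&wgpu::TextureDescriptor {
        label: Some("iced image"),
        size,
        mip_level_count: 1,
        sample_count: 1,
        dimension: wgpu::TextureDimension::D2,
        format: wgpu::TextureFormat::Rgba8UnormSrgb,
        usage: wgpu::TextureUsages::TEXTURE_BINDING | wgpu::TextureUsages::COPY_DST,
        view_formats: &[],
    });
    queue.write_texture(
        wgpu::ImageCopyTexture {
            texture: &texture,
            mip_level: 0,
            origin: wgpu::Origin3d::ZERO,
            aspect: wgpu::TextureAspect::All,
        },
        rgba,
        wgpu::ImageDataLayout {
            offset: 0,
            bytes_per_row: Some(4 * width),
            rows_per_image: Some(height),
        },
        size,
    );
    texture.create_view(&wgpu::TextureViewDescriptor::default())
}
```

For a texture the user already owns, all of this work is unnecessary: the TextureView already exists and could be handed straight to the renderer.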
My suggestion here would be to add an additional primitive type, Primitive::NativeTexture, that simply stores a texture handle in whatever format the rendering backend currently expects, and lets the user do their own rendering.
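As a minimal sketch of the idea, assuming the wgpu backend and borrowing the Rectangle type the other primitives already carry (everything here is illustrative, not a settled API):

```rust
use std::sync::Arc;

use iced_native::Rectangle;

// Hypothetical extension of the renderer's primitive enum; the variant
// name and fields are illustrative.
pub enum Primitive {
    // ...the existing variants (Quad, Image, Text, Clip, ...) stay as-is...

    /// A texture that already lives on the GPU, in whatever format the
    /// active backend expects.
    NativeTexture {
        /// Where to draw the texture, in layout coordinates.
        bounds: Rectangle,
        /// The user's existing GPU handle, reference-counted so the
        /// primitive stays cheap to clone.
        handle: Arc<wgpu::TextureView>,
    },
}
```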
This would enable powerful workflows where the 3d content is integrated into the GUI, and would be a good solution to issues like #343, at least for advanced users that don't mind messing with the native rendering API.
This looks like it would be quite easy to implement for wgpu, so I'd be happy to contribute something that works for that first, and then maybe we can look into doing a similar thing for OpenGL later?
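To make the wgpu side concrete, here is a rough sketch of what handling the new primitive might look like, assuming the same texture-plus-sampler bind group layout the image pipeline already uses (function and label names are my own):

```rust
// Rough sketch of the backend handling: instead of uploading pixels and
// creating a view like the image pipeline does, bind the view the user
// handed us and reuse the existing textured-quad draw.
fn native_texture_bind_group(
    device: &wgpu::Device,
    layout: &wgpu::BindGroupLayout,
    sampler: &wgpu::Sampler,
    view: &wgpu::TextureView,
) -> wgpu::BindGroup {
    device.create_bind_group(&wgpu::BindGroupDescriptor {
        label: Some("iced native texture"),
        layout,
        entries: &[
            wgpu::BindGroupEntry {
                binding: 0,
                resource: wgpu::BindingResource::TextureView(view),
            },
            wgpu::BindGroupEntry {
                binding: 1,
                resource: wgpu::BindingResource::Sampler(sampler),
            },
        ],
    })
}
```

From there, the same shaders used for Primitive::Image could draw a quad at the primitive's bounds. The main open question is synchronization: the user's texture must not be written while the GUI pass is sampling it.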