Post-processing with WebXR #18846
…fferSize() in EffectComposer for VR
* Update EffectComposer to use WebGLRenderer.xr instead of WebGLRenderer.vr
* Leave render target unchanged if some passes swap buffers
* Update three.js build
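The sizing issue behind this change can be sketched as follows. In an XR session the drawing buffer typically holds both eye viewports side by side, so a composer that sizes its render targets from the canvas dimensions will be too small. This is an illustrative sketch only: `composerSize` and the mock renderer are invented for demonstration, and the real `WebGLRenderer.getDrawingBufferSize()` writes into a `Vector2` argument rather than returning a plain object.

```javascript
// Hypothetical helper: size composer render targets from the drawing buffer,
// not from the CSS/canvas size, so both eye viewports fit in XR.
function composerSize(renderer) {
  // Mocked signature; the real three.js method takes a target Vector2.
  const { width, height } = renderer.getDrawingBufferSize();
  return { width, height };
}

// Mock renderer mimicking an XR session with two 1440x1600 eye buffers:
const mockXRRenderer = {
  getDrawingBufferSize() { return { width: 2880, height: 1600 }; },
};

console.log(composerSize(mockXRRenderer)); // { width: 2880, height: 1600 }
```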
Please remove all build files from your PR. Also undo the changes to … It seems #15840 can be closed?
I guess one reason why #15840 was never merged is the performance aspect of post-processing with WebXR. Has somebody actually verified the performance of …? Right now, I feel making …
At Samsung, post-processing in XR is something we are actively working on, and we currently have to maintain our own versions of THREE to support it; it's more than a technical showcase for us. @wizgrav has been doing the most work with it, mostly on the Oculus Quest.
@Mugen87 here's a demo of the home-brew post-processing suite @AdaRoseCannon mentioned: https://samsunginter.net/three-effects/examples/madrid/ Even though it coalesces the blending parts of the effects into a single ubershader, it is not running at full frame rate on the Quest; but it does have several effects and complex geometry running at once. Check it out and judge for yourself. I agree that SSAO doesn't make sense, but basic effects like bloom are reasonable.

I also have to point out that EffectComposer is inefficient and should not be used in production, even on higher-spec platforms than the Quest, and even in non-VR scenarios. It wastes a full pass per effect on trivial operations that could be batched, e.g. adding the bloom onto the final image. This is the old approach Unity had, which it has since abandoned in favor of using an ubershader to blend the results of the various effect prepasses.
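The batching idea described above can be sketched in a few lines. Rather than running one fullscreen pass per trivial blend operation, the blend snippets of each effect are concatenated into a single "ubershader" that runs in one pass. Everything here is invented for illustration (the snippet format, uniform names, and `buildUbershader` are not three-effects' actual code); it only demonstrates the structure of the technique.

```javascript
// Per-effect blend snippets that would otherwise each cost a fullscreen pass.
// Names (tBloom, vignette) are hypothetical.
const blendSnippets = [
  'color.rgb += texture2D(tBloom, vUv).rgb;',         // additive bloom
  'color.rgb *= 1.0 - vignette * length(vUv - 0.5);', // vignette darkening
];

// Concatenate all blend snippets into one GLSL fragment shader,
// so N trivial blends execute in a single pass instead of N passes.
function buildUbershader(snippets) {
  return [
    'uniform sampler2D tDiffuse;',
    'uniform sampler2D tBloom;',
    'uniform float vignette;',
    'varying vec2 vUv;',
    'void main() {',
    '  vec4 color = texture2D(tDiffuse, vUv);',
    ...snippets.map(s => '  ' + s),
    '  gl_FragColor = color;',
    '}',
  ].join('\n');
}

const fragmentShader = buildUbershader(blendSnippets);
// One main() — i.e. one fullscreen pass — regardless of snippet count:
console.log(fragmentShader.split('void main').length - 1); // 1
```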
Ok, done. Sorry, I assumed the build files had to be up-to-date with the source.
If you think my PR is on the right track, yes.
We are also using it professionally, hence the motivation for my changes. :) Yes, Oculus Quest is the device we use the most. Some effects are simple enough to work well in VR, and it is often possible to adjust parameters or disable some passes based on the device.
Um, I've tried to test your link with Oculus Quest and Oculus Browser (8.2.1.x). However, when entering XR, most of the screen is black. I can only see the scene in a small part of the viewport. The official …
Hi, just checking: has this been merged, or will it be merged at any point?
I personally don't think so. A more promising/performant approach is summed up here: #18846 (comment)
What about wizgrav's comment:
I'm not sure the project should promote doing post-processing in VR with … So instead of making the existing …
Combining many passes into a few passes for performance sounds good to me, so we need a shader-combining engine. Do you think we can reuse the node-based material system for it? (Sorry, I'm still not familiar with our node-based material system yet.) Or do we need to create a new one?
Would it be worthwhile to look into a stop-gap solution that provides an efficient method for a few popular features like bloom, SSAO, and colour grading, and then look for a more general solution further down the road?
Is there any progress on that approach?
As a user coming to XR in three.js for the first time, my expectation has been that my code should work the same across XR and non-XR. The glitch effect (from the examples) performs well on my Android using EffectComposer, and from what I've seen, performance on mobile in and out of XR is comparable. IMO, if code is not performant, it should be up to the user to decide whether to use it, find an alternative, or choose which devices to support. I imagine others like myself who are familiarising themselves by following the three.js examples would expect EffectComposer to work.
It's worth noting that https://github.com/pmndrs/postprocessing/ has already done this. I've implemented a few custom post-processing effects with that library, and I really appreciate how @vanruesc has set things up; my effects merged fine alongside the others. I'm not sure how many effects are WebXR-compatible yet, but it can be done: https://twitter.com/Cody_J_Bennett/status/1482585611781480448 I'm worried we may be doing users a disservice by exclusively promoting the internal post-processing implementation while so much good development is happening in the community version. Maybe we could add 1-2 examples using the pmndrs/postprocessing library, with an "external" tag?
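For readers unfamiliar with the community library, a minimal usage sketch based on its documented API (EffectComposer, RenderPass, EffectPass, BloomEffect) looks roughly like the following. This assumes an existing `renderer`, `scene`, and `camera`, and it only runs in a browser with WebGL, so treat it as an orientation sketch rather than a drop-in snippet.

```js
import { BloomEffect, EffectComposer, EffectPass, RenderPass } from 'postprocessing';

// Wrap the existing WebGLRenderer; passes render into the composer's buffers.
const composer = new EffectComposer(renderer);
composer.addPass(new RenderPass(scene, camera));

// Multiple effects given to one EffectPass are merged into a single shader,
// which is the pass-batching idea discussed earlier in this thread.
composer.addPass(new EffectPass(camera, new BloomEffect()));

requestAnimationFrame(function render() {
  requestAnimationFrame(render);
  composer.render();
});
```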
I decided to complete the fix in #26160 to champion this issue. I had planned to do the same for pmndrs/postprocessing, but only now got around to making it work, years after the introduction of WebXR layers. The approach from my tweet that Don mentioned gives me full control over eye-dependent effects, which we can do more effectively since WebXRManager exposes XR internals. I'm personally very happy about that, since I don't need to eject from three.js to implement advanced behavior.
I need this integration for a lens-distortion effect for WebXR, on iPhone, because the WebXR polyfill is completely broken with three. I ran tests with StereoEffect and only got one side rendering. I checked out the fix and merged the updated dev changes, and XR post-processing is working. I'll try to integrate lens distortion. Using this project …
We merged #26160 instead, but had to revert it for performance reasons; see #26160 (comment).
I picked up takahirox's work to support post-processing in VR (issue #15840) and brought it up to date with the latest WebXR.