WebXR guide material tracking issue #7276
Comments
Great list. Here's some feedback from Oculus:
Feedback from developers:
More commentary:

- Positional audio isn't integrated with WebXR (it probably should be), so that may be a topic to address last/later.
- Playing video in a 3D environment is a good application of Layers, so it could be efficient to work on those together. The reason Layers are good for 3D video is that even if the page doesn't submit WebXR frames on time, the VR compositor can still update the video texture, so playback stays smooth. Using a Layer also makes the video look sharper, because it is sampled only once, by the VR compositor, instead of once by the application to produce the scene and again by the VR compositor. This is not only good for low latency and visual quality, it is also more power efficient. Incidentally, we wrote a Layers polyfill; it should provide some ideas for supporting UAs which don't implement Layers. (A minimal sketch of a media quad layer follows this list.)
- There is an "input" theme spanning input profiles, game controllers, targeting, and hit detection. We would like to see coverage of the Hands input profile.
- Performance would be a good topic to cover. MDN has some great WebGL performance content. On top of that there are some WebXR-specific aspects: foveated rendering (rendering more cheaply at lower resolution in the periphery of the view), refresh rates (there's some documentation here, but I don't think it covers requesting low and high refresh rates, the refresh rate API, and what to do in practice to monitor and manage the refresh rate), and multiview.
- Footnote: isFirstPersonObserver is an interesting case. The compat table says it is supported in Chrome 86, but if you look at the implementation it is unconditionally false. With my web developer hat on, it would be useful to know that so I can manage my implementation effort. (I don't know what Samsung Internet's level of support for isFirstPersonObserver is.) From the browser vendor's perspective, the most important thing is that authors render all of their views and don't hard-code support for, say, just two views, so that if first-person observer views light up later, sites won't break. (See the render-loop sketch after this list.)
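To make the Layers-for-video point concrete, here is a minimal sketch of handing a video element to the compositor as a quad layer via the WebXR Layers API. It assumes a session requested with the "layers" feature, plus an `xrSession`, `xrReferenceSpace`, and `projectionLayer` set up earlier; the video selector and placement transform are illustrative only.

```js
// Sketch only: attach a video as a quad layer so the VR compositor samples it
// directly, independent of whether the app submits its own frames on time.
const video = document.querySelector("video");        // assumed <video> element on the page
const mediaBinding = new XRMediaBinding(xrSession);   // xrSession: active immersive session (assumed)

const videoLayer = mediaBinding.createQuadLayer(video, {
  space: xrReferenceSpace,                    // reference space from earlier setup (assumed)
  layout: "mono",                             // "stereo-left-right" for side-by-side 3D video
  transform: new XRRigidTransform({ z: -2 }), // place the quad 2 m in front of the viewer
});

// Composite the video layer alongside the app's projection layer.
xrSession.updateRenderState({ layers: [projectionLayer, videoLayer] });
```

On UAs that don't implement Layers, the polyfill mentioned above (or drawing the video into a WebGL texture each frame) is the fallback.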
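And for the "render every view" point, a minimal render-loop sketch, assuming a plain `XRWebGLLayer` as the session's `baseLayer`, a WebGL context `gl`, an `xrReferenceSpace` from setup, and a hypothetical `drawScene()` helper:

```js
// Sketch only: iterate over pose.views instead of assuming exactly two views,
// so extra views (e.g. a first-person observer view) don't break rendering.
function onXRFrame(time, frame) {
  const session = frame.session;
  session.requestAnimationFrame(onXRFrame);

  const viewerPose = frame.getViewerPose(xrReferenceSpace);
  if (!viewerPose) return; // no tracking this frame

  const glLayer = session.renderState.baseLayer;
  gl.bindFramebuffer(gl.FRAMEBUFFER, glLayer.framebuffer);
  gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);

  for (const view of viewerPose.views) { // however many views the UA reports
    const viewport = glLayer.getViewport(view);
    gl.viewport(viewport.x, viewport.y, viewport.width, viewport.height);
    drawScene(view.projectionMatrix, view.transform.inverse.matrix); // hypothetical helper
  }
}
```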
Another article could be about XR accessibility. See https://w3c.github.io/apa/xaur/
WebXR Guides are particularly important. WebGL is very stateful and it is hard to inspect the state. Add WebXR on top of that, where you don't precisely control your inputs like pose, sensed depth, etc., and it can be very hard to understand what is going on if you're working from an API reference alone. Guides are important. There's been great momentum on WebXR docs in Q3. We should capitalize on that.
(updated given Dominic's comment and more of my own thoughts)
Filing this bug to collect what kind of guide/tutorial content is missing from the WebXR docs. Planning to share this with subject-matter experts and hopefully get some help with completing these pages.
Landing pages
Existing tutorials
New tutorials to write
Positional audio in a 3D environment (not integrated into WebXR, address this topic later, remove from landing page)
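Since WebXR itself has no audio integration, the positional audio tutorial would presumably lean on the Web Audio API, driving the listener from the viewer pose each frame. A minimal sketch of that idea (the AudioParam-based listener properties used here aren't available in every browser; older engines need the deprecated setPosition()/setOrientation() methods):

```js
// Sketch only: keep the Web Audio listener in sync with the WebXR viewer pose.
const audioCtx = new AudioContext();
const listener = audioCtx.listener;

function updateListenerFromPose(viewerPose) {
  const { position } = viewerPose.transform;
  const m = viewerPose.transform.matrix; // column-major 4x4 rigid transform

  listener.positionX.value = position.x;
  listener.positionY.value = position.y;
  listener.positionZ.value = position.z;

  // Column 2 of the matrix is the local +Z axis; the listener faces -Z.
  listener.forwardX.value = -m[8];
  listener.forwardY.value = -m[9];
  listener.forwardZ.value = -m[10];
  // Column 1 is the local +Y (up) axis.
  listener.upX.value = m[4];
  listener.upY.value = m[5];
  listener.upZ.value = m[6];
}
```

Sounds placed in the scene would then go through PannerNodes positioned in the same reference space.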