title | description | author | ms.author | ms.date | ms.topic | keywords |
---|---|---|---|---|---|---|
Native development overview | Learn how to build a DirectX-based mixed-reality engine using the Windows Mixed Reality APIs directly. | thetuvix | alexturn | 08/04/2020 | article | DirectX, holographic rendering, native, native app, WinRT, WinRT app, platform APIs, custom engine, middleware, mixed reality headset, windows mixed reality headset, virtual reality headset |
3D engines like Unity or Unreal aren't the only mixed-reality development paths open to you. You can also create mixed-reality apps using the Windows Mixed Reality APIs with DirectX 11 or DirectX 12. By targeting the platform APIs directly, you're essentially building your own middleware or framework.
Important
If you have an existing WinRT project that you'd like to maintain, head over to our main WinRT documentation.
Use the following checkpoints to bring your native games and applications into the world of mixed reality.
Windows Mixed Reality supports two kinds of apps:
- UWP or Win32 Mixed Reality applications that use the HolographicSpace API or OpenXR API to render an immersive view that fills the headset display
- 2D apps (UWP) that use DirectX, XAML, or another framework to render 2D views on slates in the Windows Mixed Reality home
The differences between DirectX development for 2D views and immersive views primarily concern holographic rendering and spatial input. Your UWP application's IFrameworkView or your Win32 application's HWND is still required and remains largely the same, as do the WinRT APIs available to your app. However, you must use a different subset of these APIs to take advantage of holographic features. For example, the system manages the swap chain and frame presentation for holographic applications to enable a pose-predicted frame loop.
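To make the pose-predicted frame loop concrete, here is a simplified C++/WinRT sketch using the HolographicSpace API. It is not a standalone program: the method name `Tick` and the member `m_holographicSpace` are hypothetical, and it assumes the holographic space was already created from your CoreWindow or HWND with a Direct3D device attached. Rendering itself is elided.

```cpp
using namespace winrt::Windows::Graphics::Holographic;

// Hypothetical per-frame update method; m_holographicSpace is assumed
// to be a HolographicSpace member set up during app initialization.
void AppMain::Tick()
{
    // The system hands you a frame carrying a pose prediction for the
    // moment this frame will actually appear on the display.
    HolographicFrame frame = m_holographicSpace.CreateNextFrame();
    HolographicFramePrediction prediction = frame.CurrentPrediction();

    // Update app state, then render from each predicted camera pose into
    // the system-managed back buffers (rendering not shown).
    for (HolographicCameraPose const& cameraPose : prediction.CameraPoses())
    {
        // Draw the scene using this camera pose's view/projection.
    }

    // Presenting against the prediction lets the system late-stage
    // reproject the image to the user's actual head pose.
    frame.PresentUsingCurrentPrediction();
}
```

The key difference from a desktop DirectX loop is that you never create or present a swap chain yourself; the system owns the back buffers and the present, which is what enables pose prediction and reprojection.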
Windows Mixed Reality applications use the following APIs to build mixed-reality experiences for HoloLens and other immersive headsets:
Feature | Capability |
---|---|
Gaze | Let users target holograms by looking at them |
Gesture | Add spatial actions to your apps |
Holographic rendering | Draw a hologram at a precise location in the world around your users |
Motion controller | Let your users take action in your Mixed Reality environments |
Spatial mapping | Map your physical space with a virtual mesh overlay to mark the boundaries of your environment |
Voice | Capture spoken keywords, phrases, and dictation from your users |
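As one concrete example from the table above, head-gaze targeting is read through the SpatialPointerPose API. The sketch below is illustrative, not complete: the function name `UpdateGaze` is hypothetical, the coordinate system and timestamp are assumed to come from the current frame's prediction, and hit-testing the ray against your holograms is app-specific.

```cpp
using namespace winrt::Windows::Perception;
using namespace winrt::Windows::Perception::Spatial;
using namespace winrt::Windows::UI::Input::Spatial;

// Hypothetical helper: derive a head-gaze ray for hologram targeting.
// coordinateSystem and timestamp would come from the current
// HolographicFramePrediction.
void UpdateGaze(SpatialCoordinateSystem const& coordinateSystem,
                PerceptionTimestamp const& timestamp)
{
    SpatialPointerPose pose =
        SpatialPointerPose::TryGetAtTimestamp(coordinateSystem, timestamp);
    if (pose)
    {
        auto headPosition  = pose.Head().Position();         // ray origin
        auto gazeDirection = pose.Head().ForwardDirection();  // ray direction
        // Ray-cast headPosition + t * gazeDirection against your scene
        // to find the targeted hologram (app-specific).
    }
}
```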
Note
You can find upcoming and in-development core features in the OpenXR roadmap documentation.
You can develop on a desktop using OpenXR on a HoloLens 2 or Windows Mixed Reality immersive headset. If you don't have access to a headset, you can use the HoloLens 2 Emulator or the Windows Mixed Reality Simulator instead.
A developer's job is never done, especially when learning a new tool or SDK. The following sections take you into areas beyond the beginner-level material you've already completed. These topics and resources aren't in any sequential order, so feel free to jump around and explore!
If you're looking to level up your OpenXR game, check out the links below: