Focus is a virtual and mixed reality Quest application that shows developers how to build a productivity app using Meta Spatial SDK.
Extending spatial screens is one of the primary use cases for Quest in mixed reality, but virtual objects and AI open up many more opportunities to make Quest a compelling tool for work.
To build these applications, it is critical to create spatial objects effectively, use persistence to store different layouts, spawn objects in sensible locations, and let users interact with panels intuitively.
The Focus app demonstrates panel and object management, enabling the creation, reuse, and destruction of interactive elements within varied environments.
Focus serves as a practical template for developers looking to create similar applications or integrate these features into their own XR projects.
Ensure you have Git LFS installed before cloning this repository. Once Git LFS is installed, run this command to initialize it:
git lfs install
Download Android Studio and open the cloned repository in it.
Check that you are using the latest version of the Meta Spatial SDK AARs. The AAR files are included in the libs folder, beneath app in the Project view: YourProject > app > libs.
You can also check the Spatial SDK version in the Gradle dependencies in build.gradle.kts.
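For reference, the dependency block in app/build.gradle.kts might look something like the sketch below. The local-AAR wildcard and the commented artifact coordinates are illustrative assumptions; the actual entries and version in the repository may differ.

```kotlin
// Sketch of what the Spatial SDK dependencies may look like in app/build.gradle.kts.
// The entries and version placeholder are illustrative, not the repository's actual values.
dependencies {
    // Local AARs shipped in app/libs
    implementation(fileTree(mapOf("dir" to "libs", "include" to listOf("*.aar"))))

    // ...or a published artifact, if the project uses one (coordinates are illustrative):
    // implementation("com.meta.spatial:meta-spatial-sdk:<version>")
}
```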
When you plug in your Meta Quest headset, Android Studio recognizes it as an Android device. Click the green Run button in Android Studio to build and deploy the app to your headset.
Put on your headset to see your app running!
From an end user's point of view, Focus includes the following elements:
Home Panel: where users can create a new project or open previous ones.
Settings Panel: configures the project (name, 3D environment selection, or passthrough mode).
Task Panel: allows users to create tasks with a title and body and to choose between priority states, e.g., To Do or High Priority.
AI Exchange Panel: allows the user to chat with a virtual assistant, an AI with which to exchange ideas. Users can also create sticky notes from the assistant's answers.
Clock: shows time and date. Pretty useful when you are in an immersive space.
Speaker: a spatial audio speaker that the user can turn on/off and move around.
Toolbar: the main controller of the experience, from which users can open or close the main panels and create tools to build their mind maps in MR/VR space.
Tools:
- Sticky Notes
- Labels
- Arrows
- Boards
- Shapes
- Stickers
- Timers
The main class of the project, and the one that controls the entire experience, is the ImmersiveActivity class.
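A heavily simplified sketch of what such an activity can look like is shown below. It assumes the Spatial SDK's AppSystemActivity base class and its onSceneReady() callback; the body is illustrative only and is not the actual ImmersiveActivity implementation.

```kotlin
// Illustrative skeleton only, not the real ImmersiveActivity.
// AppSystemActivity and onSceneReady() are assumed from the Spatial SDK toolkit.
import com.meta.spatial.toolkit.AppSystemActivity

class ImmersiveActivity : AppSystemActivity() {

    override fun onSceneReady() {
        super.onSceneReady()
        // Typical responsibilities of the main activity in an app like Focus:
        // - load the selected environment (3D scene or passthrough)
        // - restore previously saved objects from the local database
        // - spawn the toolbar and other panels in front of the user
    }
}
```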
The Focus project contains the following core classes:
Tool (2D and 3D): Contains most of the spatial tools users can create in the experience.
StickyNote: Creates a sticky note tool.
SpatialTask: Creates a spatial task with data already existing in the database.
WebView: Creates a browser panel.
Timer: Creates timers with different durations.
DatabaseManager: Creates the local database and provides methods to save and retrieve data.
Data: File containing general project data, referencing drawables, buttons, etc.
Utils: File containing useful general helper functions that save developers time.
AIUtils: methods to communicate with the AI backend server.
Layouts: Used to create all the panels of the experience.
Custom components:
- UniqueAssetComponent: allows us to identify entities that are unique (Clock, Speaker, AI Exchange Panel, Tasks Panel).
- ToolComponent: allows us to identify tool assets and save their properties, such as the tool type and the position at which to show the delete button.
- TimeComponent: allows us to identify and save properties of the Clock and Timer tools.
Custom systems:
- GeneralSystem: Controls the app introduction timing and controller inputs.
- DatabaseUpdateSystem: Updates the positions of moved spatial objects in the database.
- UpdateTimeSystem: Updates the UI of the clock and timers in the experience.
- BoardParentingSystem: Detects when an object is close to a board and sticks it to the board (a sketch of the general system pattern follows this list).
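A minimal sketch of the general shape of such a system is shown below. It assumes the Spatial SDK's SystemBase base class with a per-frame execute() callback; the class name and the commented body are illustrative, not the actual Focus implementation.

```kotlin
// Minimal sketch of a custom system, assuming the Spatial SDK's SystemBase
// base class with a per-frame execute() callback. The body is illustrative.
import com.meta.spatial.core.SystemBase

class UpdateTimeSystemSketch : SystemBase() {

    override fun execute() {
        // 1. Query the entities this system cares about
        //    (e.g., entities carrying a TimeComponent).
        // 2. Compute the new state (current time, remaining timer duration).
        // 3. Push the result to the attached panel UI.
    }
}
```

A system like this is then registered from the activity's system manager so that execute() runs every tick.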
Documentation of the code can be found here.
Creating spatial objects: object hierarchy
- Composed objects: multiple objects behaving as a single one.
- Integrated spatial audio: audio coming from different locations or objects.
- Custom helper functions for developers to get the children of spatial objects.
- Deletion of composed objects: recursive deletion of children to remove a composed object (see the sketch after this list).
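The recursive deletion mentioned above boils down to a small helper like the one sketched below. The child-lookup and destroy calls are passed in as functions because the project's actual helper names are not assumed here.

```kotlin
// Generic sketch of recursive deletion: delete every child subtree first,
// then the object itself. `children` and `destroy` stand in for the project's
// child-lookup helper and the SDK's entity-deletion call.
fun <T> deleteRecursively(node: T, children: (T) -> List<T>, destroy: (T) -> Unit) {
    for (child in children(node)) {
        deleteRecursively(child, children, destroy)
    }
    destroy(node)
}

// Hypothetical usage with a composed object such as the Speaker or a board:
// deleteRecursively(speakerEntity, ::getChildrenOf) { it.destroy() }
```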
Store multiple room configurations: Update data and persistence
- SQLite database: an introduction to how we store spatial data and object relationships (a minimal persistence sketch follows this list).
- Use components to save and retrieve object data and state.
- Helper system to update object positions and store their state in the database.
- Detect keyboard events and update content with user interactions.
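As a minimal sketch of the persistence idea, the class below uses Android's standard SQLiteOpenHelper to save an object's position. The class, table, and column names are hypothetical and do not claim to match the project's DatabaseManager.

```kotlin
import android.content.ContentValues
import android.content.Context
import android.database.sqlite.SQLiteDatabase
import android.database.sqlite.SQLiteOpenHelper

// Minimal sketch of persisting object poses with SQLiteOpenHelper.
// The schema below is hypothetical, not the project's actual one.
class ObjectStoreSketch(context: Context) :
    SQLiteOpenHelper(context, "focus_sketch.db", null, 1) {

    override fun onCreate(db: SQLiteDatabase) {
        db.execSQL(
            "CREATE TABLE objects (" +
                "uuid TEXT PRIMARY KEY, type TEXT, " +
                "pos_x REAL, pos_y REAL, pos_z REAL)"
        )
    }

    override fun onUpgrade(db: SQLiteDatabase, oldVersion: Int, newVersion: Int) {
        db.execSQL("DROP TABLE IF EXISTS objects")
        onCreate(db)
    }

    // Insert or update an object's position when it is moved.
    fun savePosition(uuid: String, type: String, x: Float, y: Float, z: Float) {
        val values = ContentValues().apply {
            put("uuid", uuid)
            put("type", type)
            put("pos_x", x)
            put("pos_y", y)
            put("pos_z", z)
        }
        writableDatabase.insertWithOnConflict(
            "objects", null, values, SQLiteDatabase.CONFLICT_REPLACE
        )
    }
}
```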
Spawn objects relative to the user's position
- Placing objects facing the user (see the sketch below).
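The placement logic amounts to stepping forward from the head position along the head's horizontal forward direction and reusing the head's yaw for the object's orientation. The sketch below is self-contained and deliberately avoids the SDK's pose types; the -Z-forward, Y-up convention and the default distance are assumptions.

```kotlin
import kotlin.math.cos
import kotlin.math.sin

// Self-contained sketch of spawning an object in front of the user.
// Head position and yaw would come from the headset pose.
data class Spawn(val x: Float, val y: Float, val z: Float, val yawRadians: Float)

fun spawnInFrontOfUser(
    headX: Float, headY: Float, headZ: Float,
    headYawRadians: Float,          // rotation around the vertical (Y) axis
    distance: Float = 0.8f          // how far in front of the user to place the object
): Spawn {
    // Horizontal forward vector for a -Z-forward, Y-up convention (an assumption here).
    val forwardX = -sin(headYawRadians)
    val forwardZ = -cos(headYawRadians)
    return Spawn(
        x = headX + forwardX * distance,
        y = headY,                  // keep the object at head height
        z = headZ + forwardZ * distance,
        // Reuse the head's yaw; depending on the asset's forward convention,
        // you may need to add 180 degrees so its front faces the user.
        yawRadians = headYawRadians
    )
}
```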
Panels and interaction with spatial objects
- Creating a panel depending on its type (a minimal sketch follows this list).
- Access panels and give them functionality.
- Panel transparency and spatial text.
- Load and switch between panoramas (skyboxes).
- Load and switch between 3D model scenes.
- Change the lighting environment accordingly.
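A common way to create a panel depending on the tool type is a simple when mapping from the type to the layout to inflate. The sketch below uses a hypothetical ToolType enum and placeholder R.layout names, not the project's actual identifiers.

```kotlin
// Sketch of choosing a panel layout per tool type. ToolType and the R.layout
// names are placeholders for illustration, not the project's actual identifiers.
enum class ToolType { STICKY_NOTE, LABEL, BOARD, TIMER, WEB_VIEW }

fun layoutForTool(type: ToolType): Int = when (type) {
    ToolType.STICKY_NOTE -> R.layout.sticky_note_panel
    ToolType.LABEL -> R.layout.label_panel
    ToolType.BOARD -> R.layout.board_panel
    ToolType.TIMER -> R.layout.timer_panel
    ToolType.WEB_VIEW -> R.layout.web_view_panel
}
```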
This project makes use of the following plugins and software:
Meta Spatial SDK