Being able to use the real-life physical environment as a canvas is one of the most exciting parts of Mixed Reality for us designers, developers, and creators. With Meta Quest’s various sophisticated spatial awareness capabilities such as Scene Understanding, Scene Mesh, and Depth API, you can design and build experiences where your virtual content and objects interact with the physical environment.
However, building Mixed Reality experiences from scratch can be challenging for designers and developers who are new to the medium. With Meta’s Mixed Reality Utility Kit (MRUK), you can easily retrieve surfaces from the real-world environment and place your content dynamically on specific surface types. MRUK’s utilities and example scenes let you jump-start building Mixed Reality experiences.
Official Documentation
Official documentation for MRUK can be found here:
https://developer.oculus.com/documentation/unity/unity-mr-utility-kit-overview
How does it work?
The overall flow looks like this:
- If there is no existing room data for the user, they need to scan their room using Quest’s Room Setup experience in Settings.
- With the room data, MRUK can retrieve Scene information about the room, such as semantic surface types and the Scene Mesh. You enable this simply by placing MRUK.prefab in your scene.
- MRUK provides tools such as FindSpawnPositions, which lets you easily and dynamically place content objects in space with various options: surface types (floor, table, couch, etc.) and placement locations (on top of surfaces, floating, etc.)
Install Mixed Reality Utility Kit
You can find and install MRUK from the Unity Asset Store.
Link: https://assetstore.unity.com/packages/tools/integration/meta-mr-utility-kit-272450
In Window > Package Manager, make sure to import the sample scenes under the Samples tab.
Enable Scene support in your project
Under OVRManager, set Scene Support to Supported or Required, and check Scene under Permission Requests on Startup. For the camera rig, OVRCameraRigInteraction, provided through the Meta XR Interaction SDK, is recommended. It supports all available input modalities, such as direct hand, hand ray, and controller ray. Check out this post for more details on spatial input interactions with the Interaction SDK.
Room Setup through Quest’s Settings menu
Before connecting the USB-C cable for Quest Link, run room setup through Quest’s Settings menu. This makes the room data available to Unity and MRUK.
You can see identified surface types with semantic labels. This information will be accessible through MRUK in Unity.
If your room information is not available in the system, MRUK provides sample environments as a fallback for development and testing. You can still run the experiences using these dummy environments.
Run the Example scenes and Feel the Magic
Now that your physical room data is ready, you can run MRUK’s example scenes through Quest Link. Being able to see the visualized surfaces through passthrough Mixed Reality feels magical. MRUK provides practical and powerful examples such as dynamic object spawning and a nav mesh in the physical environment.
Adding MRUK.prefab into your project
When you open the example scenes provided by MRUK, you can see that the MRUK prefab is the main component providing the Scene Understanding capabilities. It exposes useful events such as Scene Loaded Event, where you can initialize your app experience (e.g. spawn your content objects on the identified surfaces).
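The event hookup can be sketched in a script like the one below. This is a minimal sketch, assuming MRUK’s `MRUK.Instance.SceneLoadedEvent` and `GetCurrentRoom()` API; exact member names can vary between MRUK versions, and `OnSceneLoaded` is a hypothetical handler name.

```csharp
using UnityEngine;
using Meta.XR.MRUtilityKit; // MRUK package namespace

public class SceneReadyHandler : MonoBehaviour
{
    void Start()
    {
        // Wait until MRUK has finished loading the room data,
        // then initialize the experience.
        MRUK.Instance.SceneLoadedEvent.AddListener(OnSceneLoaded);
    }

    void OnSceneLoaded()
    {
        // The current room exposes the scene anchors (walls, floor, etc.).
        MRUKRoom room = MRUK.Instance.GetCurrentRoom();
        Debug.Log($"Scene loaded with {room.Anchors.Count} anchors");
        // Spawn your content here using the identified surfaces.
    }
}
```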
Simply drag and drop MRUK.prefab into your project.
If you just run the project with Unity’s Play button, you won’t see anything. You need to add another object, the EffectMesh prefab, to the scene to visualize the surfaces.
When you run the scene, you might see your environment filled with blue, making it difficult to see the outlines of the surfaces.
You can uncheck GLOBAL_MESH under the Labels dropdown to hide the global mesh visual.
Now you can properly see the edges of the identified surfaces.
Dynamically Spawning and Placing Your Content
This is one of the most difficult parts of building apps for Mixed Reality: how can we place content when we don’t know the user’s room environment? Some users won’t have a table or couch in their rooms, or the room may be small. How can we place content only on a specific area or surface type?
MRUK provides powerful tools to help you dynamically place content in varying environments. The FindSpawnPositions script is one of these tools; it lets you easily spawn and place your content on specific surface types (or floating in the room). You can specify which surface types to use and how to place the content on them. For example, you can place your content only on tables, or have a virtual picture frame appear only on walls.
Below is an example of placing multiple Mars Rover 3D model objects on surfaces labeled FLOOR. Assign the MarsRover prefab to the Spawn Object field and set Spawn Amount to 8 to place eight Mars Rover objects in the room.
Below is the result:
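The same inspector setup can also be configured from code. The sketch below mirrors the fields described above (Spawn Object, Spawn Amount, Labels); `RoverSpawner` and its `MarsRoverPrefab` field are hypothetical names, and the exact public field names on FindSpawnPositions may differ between MRUK versions.

```csharp
using UnityEngine;
using Meta.XR.MRUtilityKit; // MRUK package namespace

// Hypothetical helper mirroring the inspector setup described above.
public class RoverSpawner : MonoBehaviour
{
    public GameObject MarsRoverPrefab; // assign in the inspector

    void Awake()
    {
        // Configure FindSpawnPositions the same way as in the inspector:
        // spawn eight copies of the prefab on FLOOR surfaces. Spawning
        // itself is triggered by MRUK once the scene data has loaded.
        var spawner = gameObject.AddComponent<FindSpawnPositions>();
        spawner.SpawnObject = MarsRoverPrefab;
        spawner.SpawnAmount = 8;
        spawner.Labels = MRUKAnchor.SceneLabels.FLOOR;
    }
}
```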
SceneDecorator example scene
One of the example scenes provided by MRUK demonstrates dynamic content placement based on surface types.
It also shows how to request Space Setup, which lets users adjust their room environment, via OVRScene.RequestSpaceSetup().
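Requesting Space Setup from a script can be sketched as follows; this is a minimal example assuming the call is awaitable (it launches a system flow and returns when the user comes back to the app), and `OnRescanPressed` is a hypothetical handler you might wire to a UI button.

```csharp
using UnityEngine;

public class SpaceSetupButton : MonoBehaviour
{
    // Called e.g. from a UI button when no usable room data exists.
    public async void OnRescanPressed()
    {
        // Launches the system Space Setup flow so the user can (re)scan
        // the room; the await completes when the flow finishes.
        await OVRScene.RequestSpaceSetup();
    }
}
```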
Check out documentation for more features
MRUK provides many helpful tools that can accelerate your design & development iterations for Mixed Reality experiences. Check out the features page in MRUK documentation to learn more:
https://developer.oculus.com/documentation/unity/unity-mr-utility-kit-features