{"id":5183,"date":"2024-07-22T07:01:42","date_gmt":"2024-07-22T07:01:42","guid":{"rendered":"https:\/\/mixedrealitynow.com\/?p=5183"},"modified":"2024-10-20T06:39:38","modified_gmt":"2024-10-20T06:39:38","slug":"building-mr-apps-using-physical-surfaces-has-never-been-easier-how-to-use-meta-mixed-reality-utility-kit-mruk","status":"publish","type":"post","link":"https:\/\/mixedrealitynow.com\/ko\/building-mr-apps-using-physical-surfaces-has-never-been-easier-how-to-use-meta-mixed-reality-utility-kit-mruk","title":{"rendered":"Building MR apps using physical surfaces with Meta MR Utility Kit"},"content":{"rendered":"<p><\/p>\n\n\n\n<figure class=\"wp-block-video\"><video height=\"480\" style=\"aspect-ratio: 854 \/ 480;\" width=\"854\" autoplay controls loop muted src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/MRUK_Cover_480p.mp4\"><\/video><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<p>Being able to use the real-life physical environment as a canvas is one of the most exciting parts of Mixed Reality for us designers, developers, and creators. With Meta Quest&#8217;s various sophisticated spatial awareness capabilities such as <a href=\"https:\/\/developer.oculus.com\/resources\/mr-design-scene\/\" target=\"_blank\" rel=\"noopener\" title=\"\">Scene Understanding<\/a>, <a href=\"https:\/\/developer.oculus.com\/documentation\/unity\/unity-scene-build-mixed-reality\/#scene-mesh\" target=\"_blank\" rel=\"noopener\" title=\"\">Scene Mesh<\/a>, and <a href=\"https:\/\/developer.oculus.com\/documentation\/unity\/unity-depthapi\/\" target=\"_blank\" rel=\"noopener\" title=\"\">Depth API<\/a>, you can design and build experiences where your virtual content and objects interact with the physical environment.<\/p>\n\n\n\n<p>However, it could also be challenging for designers and developers who are new to Mixed Reality to build experiences from scratch. 
With Meta&#8217;s Mixed Reality Utility Kit (MRUK), you can easily retrieve surfaces from the real-world environment and place your content dynamically on specific surface types. MRUK&#8217;s various utilities and example scenes allow you to jump-start building Mixed Reality experiences.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Official Documentation<\/h2>\n\n\n\n<p>Official documentation for MRUK can be found here:<br><a href=\"https:\/\/developer.oculus.com\/documentation\/unity\/unity-mr-utility-kit-overview\">https:\/\/developer.oculus.com\/documentation\/unity\/unity-mr-utility-kit-overview<\/a><\/p>\n\n\n\n<h2 class=\"wp-block-heading\">How does it work?<\/h2>\n\n\n\n<p>The overall flow looks like this:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>If there is no existing room data for the user, they need to scan the room using Quest&#8217;s Room Setup experience in Settings.<\/li>\n\n\n\n<li>With the room data, MRUK can retrieve Scene information about the room, such as semantic surface types and the Mesh. 
You can simply do this by placing <strong>MRUK.prefab<\/strong> in your scene.<\/li>\n\n\n\n<li>MRUK provides tools such as <strong>FindSpawnPositions<\/strong>, which lets you dynamically place content objects in space with various options &#8211; surface types (floor, table, couch, etc.) and locations (on top of surfaces, floating, etc.).<\/li>\n<\/ul>\n\n\n\n<p><\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Install Mixed Reality Utility Kit<\/h2>\n\n\n\n<p>You can find and install MRUK from the Unity Asset Store.<\/p>\n\n\n\n<p><strong>Link<\/strong>: <a href=\"https:\/\/assetstore.unity.com\/packages\/tools\/integration\/meta-mr-utility-kit-272450\">https:\/\/assetstore.unity.com\/packages\/tools\/integration\/meta-mr-utility-kit-272450<\/a><\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"2574\" height=\"1827\" src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-22_08_16-Meta-MR-Utility-Kit-_-Integration-_-Unity-Asset-Store.png\" alt=\"\" class=\"wp-image-5225\" srcset=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-22_08_16-Meta-MR-Utility-Kit-_-Integration-_-Unity-Asset-Store.png 2574w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-22_08_16-Meta-MR-Utility-Kit-_-Integration-_-Unity-Asset-Store-300x213.png 300w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-22_08_16-Meta-MR-Utility-Kit-_-Integration-_-Unity-Asset-Store-1024x727.png 1024w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-22_08_16-Meta-MR-Utility-Kit-_-Integration-_-Unity-Asset-Store-768x545.png 768w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-22_08_16-Meta-MR-Utility-Kit-_-Integration-_-Unity-Asset-Store-1536x1090.png 1536w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-22_08_16-Meta-MR-Utility-Kit-_-Integration-_-Unity-Asset-Store-2048x1454.png 
2048w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-22_08_16-Meta-MR-Utility-Kit-_-Integration-_-Unity-Asset-Store-18x12.png 18w\" sizes=\"auto, (max-width: 2574px) 100vw, 2574px\" \/><\/figure>\n\n\n\n<p>In <strong>Window &gt; Package Manager<\/strong>, make sure to import the sample scenes under the <strong>Samples<\/strong> tab.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"2308\" height=\"1646\" src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-22_20_49-Package-Manager.png\" alt=\"\" class=\"wp-image-5226\" srcset=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-22_20_49-Package-Manager.png 2308w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-22_20_49-Package-Manager-300x214.png 300w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-22_20_49-Package-Manager-1024x730.png 1024w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-22_20_49-Package-Manager-768x548.png 768w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-22_20_49-Package-Manager-1536x1095.png 1536w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-22_20_49-Package-Manager-2048x1461.png 2048w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-22_20_49-Package-Manager-18x12.png 18w\" sizes=\"auto, (max-width: 2308px) 100vw, 2308px\" \/><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Enable Scene support in your project<\/h2>\n\n\n\n<p>Under <strong>OVRManager<\/strong>, update <strong>Scene Support<\/strong> to Supported or Required. Check <strong>Scene<\/strong> under <strong>Permission Requests on Startup<\/strong>. For the camera rig, <strong>OVRCameraRigInteraction<\/strong>, provided through the <strong>Meta XR Interaction SDK<\/strong>, is recommended. 
It supports all available input modalities such as direct hand, hand ray, and controller ray. <a href=\"https:\/\/mixedrealitynow.com\/ko\/getting-started-with-meta-xr-interaction-sdk-quest-3-how-to-crucial-interactions\/\" target=\"_blank\" rel=\"noopener\" title=\"\">Check out this post for more details on spatial input interactions with Interaction SDK.<\/a><\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"1620\" height=\"2116\" src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-22_51_19-CapCut.png\" alt=\"\" class=\"wp-image-5227\" srcset=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-22_51_19-CapCut.png 1620w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-22_51_19-CapCut-230x300.png 230w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-22_51_19-CapCut-784x1024.png 784w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-22_51_19-CapCut-768x1003.png 768w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-22_51_19-CapCut-1176x1536.png 1176w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-22_51_19-CapCut-1568x2048.png 1568w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-22_51_19-CapCut-9x12.png 9w\" sizes=\"auto, (max-width: 1620px) 100vw, 1620px\" \/><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Room Setup through Quest&#8217;s Settings menu<\/h2>\n\n\n\n<p>Before connecting the USB-C cable for Quest Link, run the room setup through Quest&#8217;s Settings menu. 
This will make the room data ready for Unity and MRUK.<\/p>\n\n\n\n<figure class=\"wp-block-video\"><video height=\"480\" style=\"aspect-ratio: 854 \/ 480;\" width=\"854\" controls src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/MRUK1_480p.mp4\"><\/video><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<p>You can see the identified surface types with semantic labels. This information will be accessible through MRUK in Unity.<\/p>\n\n\n\n<figure class=\"wp-block-video\"><video height=\"480\" style=\"aspect-ratio: 854 \/ 480;\" width=\"854\" controls src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/MRUK2_480p.mp4\"><\/video><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<p>If your room information is not available in the system, MRUK provides sample environments as a fallback for development and testing. You can still run the experiences using these dummy environments.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Run the Example scenes and Feel the Magic<\/h2>\n\n\n\n<p>Now that your physical room data is ready, you can try running MRUK&#8217;s example scenes through Quest Link. Being able to see visualized surfaces through passthrough Mixed Reality feels magical. MRUK provides practical and powerful examples such as dynamic object spawning and nav mesh in the physical environment.<\/p>\n\n\n\n<figure class=\"wp-block-video\"><video height=\"480\" style=\"aspect-ratio: 854 \/ 480;\" width=\"854\" controls src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/MRUK-Sample-Link.mp4\"><\/video><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Adding MRUK.prefab into your project<\/h2>\n\n\n\n<p>When you open the example scenes provided by MRUK, you can see that the <strong>MRUK<\/strong> prefab is the main component that provides the Scene Understanding capabilities. It exposes useful events such as <strong>Scene Loaded Event ()<\/strong>, where you can initialize your app experience (e.g. 
spawn your content objects using the identified surfaces).<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"1686\" height=\"1600\" src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-23_01_10-CapCut.png\" alt=\"\" class=\"wp-image-5228\" srcset=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-23_01_10-CapCut.png 1686w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-23_01_10-CapCut-300x285.png 300w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-23_01_10-CapCut-1024x972.png 1024w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-23_01_10-CapCut-768x729.png 768w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-23_01_10-CapCut-1536x1458.png 1536w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-23_01_10-CapCut-13x12.png 13w\" sizes=\"auto, (max-width: 1686px) 100vw, 1686px\" \/><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<p>Simply drag and drop <strong>MRUK.prefab<\/strong> into your project.<\/p>\n\n\n\n<p>If you just run the project with Unity&#8217;s Play button, you won&#8217;t see anything. You need to add the <strong>EffectMesh<\/strong> prefab to the scene to see the visualization of the surfaces.<\/p>\n\n\n\n<p>When you run the scene, you might see your environment filled with a blue color that makes it difficult to see the outlines of the surfaces. 
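<\/p>\n\n\n\n<p>This blue fill comes from the global mesh generated by EffectMesh, and the label filtering can also be done from code. The snippet below is a minimal C# sketch, not official sample code; it assumes EffectMesh exposes its label mask as a public <strong>Labels<\/strong> field of type <strong>MRUKAnchor.SceneLabels<\/strong> (names taken from recent MRUK versions &#8211; verify them against your installed package).<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>using Meta.XR.MRUtilityKit;\nusing UnityEngine;\n\n\/\/ Removes the global mesh label from an EffectMesh label mask so the\n\/\/ visualization only covers planar surfaces (walls, floor, tables, ...).\npublic class HideGlobalMesh : MonoBehaviour\n{\n    public EffectMesh effectMesh; \/\/ assign the EffectMesh in the Inspector\n\n    void Awake()\n    {\n        \/\/ Labels is assumed to be a flags enum; clear the GLOBAL_MESH bit.\n        effectMesh.Labels &amp;= ~MRUKAnchor.SceneLabels.GLOBAL_MESH;\n    }\n}<\/code><\/pre>\n\n\n\n<p>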
<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"3829\" height=\"2304\" src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-17_05_07-News-Space-NewsSpace-Android-Unity-2022.3.23f1-_DX11_.png\" alt=\"\" class=\"wp-image-5232\" srcset=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-17_05_07-News-Space-NewsSpace-Android-Unity-2022.3.23f1-_DX11_.png 3829w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-17_05_07-News-Space-NewsSpace-Android-Unity-2022.3.23f1-_DX11_-300x181.png 300w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-17_05_07-News-Space-NewsSpace-Android-Unity-2022.3.23f1-_DX11_-1024x616.png 1024w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-17_05_07-News-Space-NewsSpace-Android-Unity-2022.3.23f1-_DX11_-768x462.png 768w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-17_05_07-News-Space-NewsSpace-Android-Unity-2022.3.23f1-_DX11_-1536x924.png 1536w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-17_05_07-News-Space-NewsSpace-Android-Unity-2022.3.23f1-_DX11_-2048x1232.png 2048w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-17_05_07-News-Space-NewsSpace-Android-Unity-2022.3.23f1-_DX11_-18x12.png 18w\" sizes=\"auto, (max-width: 3829px) 100vw, 3829px\" \/><\/figure>\n\n\n\n<p>You can uncheck <strong>GLOBAL_MESH<\/strong> under the <strong>Labels<\/strong> dropdown to hide the global mesh visualization.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"615\" height=\"1024\" src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-20_46_27-Meta-Quest-Developer-Hub-615x1024.png\" alt=\"\" class=\"wp-image-5234\" style=\"width:459px;height:auto\" 
srcset=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-20_46_27-Meta-Quest-Developer-Hub-615x1024.png 615w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-20_46_27-Meta-Quest-Developer-Hub-180x300.png 180w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-20_46_27-Meta-Quest-Developer-Hub-768x1280.png 768w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-20_46_27-Meta-Quest-Developer-Hub-922x1536.png 922w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-20_46_27-Meta-Quest-Developer-Hub-7x12.png 7w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-20_46_27-Meta-Quest-Developer-Hub.png 1085w\" sizes=\"auto, (max-width: 615px) 100vw, 615px\" \/><\/figure>\n\n\n\n<p>Now you can properly see the edges of the identified surfaces.<\/p>\n\n\n\n<figure class=\"wp-block-video\"><video height=\"480\" style=\"aspect-ratio: 854 \/ 480;\" width=\"854\" controls src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/MRUK-SU1.mp4\"><\/video><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Dynamically Spawning and Placing Your Content<\/h2>\n\n\n\n<p>This is one of the most difficult parts of building apps for Mixed Reality &#8211; how can we place content when we don&#8217;t know the user&#8217;s room environment? Some users won&#8217;t have a table or couch in their rooms, or their room may be small. How can we place our content only on a specific area or surface type?<\/p>\n\n\n\n<p>MRUK provides powerful tools to help you dynamically place content in varying environments. The <strong>FindSpawnPositions<\/strong> script is one of these tools; it allows you to easily spawn and place your content on specific surface types (or floating in the room). You can specify surface types and how to place the content on those surfaces. 
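<\/p>\n\n\n\n<p>The same Inspector setup can be mirrored in code. The snippet below is a minimal, hypothetical C# sketch; it assumes <strong>FindSpawnPositions<\/strong> exposes public <strong>SpawnObject<\/strong>, <strong>SpawnAmount<\/strong>, and <strong>Labels<\/strong> fields matching its Inspector fields (names taken from recent MRUK versions &#8211; verify them against your installed package).<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>using Meta.XR.MRUtilityKit;\nusing UnityEngine;\n\n\/\/ Configures a FindSpawnPositions component from code instead of the Inspector.\npublic class FloorSpawner : MonoBehaviour\n{\n    public FindSpawnPositions spawnPositions; \/\/ component on this GameObject\n    public GameObject contentPrefab;          \/\/ e.g. a 3D model prefab\n\n    void Awake()\n    {\n        spawnPositions.SpawnObject = contentPrefab;\n        spawnPositions.SpawnAmount = 8;\n        \/\/ Restrict placement to surfaces labeled FLOOR.\n        spawnPositions.Labels = MRUKAnchor.SceneLabels.FLOOR;\n        \/\/ FindSpawnPositions performs the actual spawn once MRUK reports\n        \/\/ that the Scene data for the room has loaded.\n    }\n}<\/code><\/pre>\n\n\n\n<p>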
For example, you can place your content only on tables, or place your virtual picture frame only on walls.<\/p>\n\n\n\n<p>Below is an example of placing multiple Mars Rover 3D model objects on surfaces labeled <strong>FLOOR<\/strong>. Assign the MarsRover prefab to the <strong>Spawn Object<\/strong> field and set <strong>Spawn Amount<\/strong> to 8 to place eight Mars Rover objects in the room.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"743\" height=\"1024\" src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-21_03_21-UnityEditor.PopupWindow-743x1024.png\" alt=\"\" class=\"wp-image-5238\" style=\"width:494px;height:auto\" srcset=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-21_03_21-UnityEditor.PopupWindow-743x1024.png 743w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-21_03_21-UnityEditor.PopupWindow-218x300.png 218w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-21_03_21-UnityEditor.PopupWindow-768x1058.png 768w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-21_03_21-UnityEditor.PopupWindow-9x12.png 9w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-21_03_21-UnityEditor.PopupWindow.png 925w\" sizes=\"auto, (max-width: 743px) 100vw, 743px\" \/><\/figure>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"490\" src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-21_02_25-News-Space-NewsSpace-Android-Unity-2022.3.23f1_-_DX11_-1024x490.png\" alt=\"\" class=\"wp-image-5237\" srcset=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-21_02_25-News-Space-NewsSpace-Android-Unity-2022.3.23f1_-_DX11_-1024x490.png 1024w, 
https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-21_02_25-News-Space-NewsSpace-Android-Unity-2022.3.23f1_-_DX11_-300x144.png 300w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-21_02_25-News-Space-NewsSpace-Android-Unity-2022.3.23f1_-_DX11_-768x367.png 768w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-21_02_25-News-Space-NewsSpace-Android-Unity-2022.3.23f1_-_DX11_-1536x735.png 1536w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-21_02_25-News-Space-NewsSpace-Android-Unity-2022.3.23f1_-_DX11_-18x9.png 18w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/2024-07-21-21_02_25-News-Space-NewsSpace-Android-Unity-2022.3.23f1_-_DX11_.png 2034w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<p>Below is the result:<\/p>\n\n\n\n<figure class=\"wp-block-video\"><video height=\"480\" style=\"aspect-ratio: 854 \/ 480;\" width=\"854\" controls src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/MRUK_Spawn_Rover480p.mp4\"><\/video><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<h2 class=\"wp-block-heading\">SceneDecorator example scene<\/h2>\n\n\n\n<p>One of the example scenes provided by MRUK demonstrates dynamic content placement based on surface types. 
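<\/p>\n\n\n\n<p>The core idea behind such a decorator can be sketched as iterating the room&#8217;s anchors and filtering by semantic label. The snippet below is a simplified, hypothetical version, not the SceneDecorator source; it assumes <strong>MRUK.Instance.RegisterSceneLoadedCallback<\/strong>, <strong>MRUK.Instance.GetCurrentRoom()<\/strong>, an <strong>Anchors<\/strong> list on MRUKRoom, and a <strong>Label<\/strong> flags property on MRUKAnchor (names taken from recent MRUK versions &#8211; verify them against your installed package).<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>using Meta.XR.MRUtilityKit;\nusing UnityEngine;\n\n\/\/ Places a decoration prefab on every anchor labeled WALL_FACE.\npublic class WallDecorator : MonoBehaviour\n{\n    public GameObject pictureFramePrefab;\n\n    void Start()\n    {\n        \/\/ Run once the Scene data for the current room is available.\n        MRUK.Instance.RegisterSceneLoadedCallback(Decorate);\n    }\n\n    void Decorate()\n    {\n        MRUKRoom room = MRUK.Instance.GetCurrentRoom();\n        foreach (MRUKAnchor anchor in room.Anchors)\n        {\n            \/\/ Label is assumed to be a flags enum of semantic surface types.\n            if (anchor.Label.HasFlag(MRUKAnchor.SceneLabels.WALL_FACE))\n            {\n                Instantiate(pictureFramePrefab, anchor.transform.position, anchor.transform.rotation);\n            }\n        }\n    }\n}<\/code><\/pre>\n\n\n\n<p>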
<\/p>\n\n\n\n<figure class=\"wp-block-video\"><video height=\"480\" style=\"aspect-ratio: 854 \/ 480;\" width=\"854\" controls src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/0723-MRUK-RoomDeco-Device480p.mp4\"><\/video><\/figure>\n\n\n\n<p>It also shows how to request Space Setup, which allows users to adjust their room environment, using <strong>OVRScene.RequestSpaceSetup<\/strong>().<\/p>\n\n\n\n<figure class=\"wp-block-video\"><video height=\"480\" style=\"aspect-ratio: 854 \/ 480;\" width=\"854\" controls src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/0723-MRUK-RequestSetup.mp4\"><\/video><\/figure>\n\n\n\n<h2 class=\"wp-block-heading\">Check out documentation for more features<\/h2>\n\n\n\n<p>MRUK provides many helpful tools that can accelerate your design &amp; development iterations for Mixed Reality experiences. Check out the features page in the MRUK documentation to learn more:<\/p>\n\n\n\n<p><a href=\"https:\/\/developer.oculus.com\/documentation\/unity\/unity-mr-utility-kit-features\">https:\/\/developer.oculus.com\/documentation\/unity\/unity-mr-utility-kit-features<\/a><\/p>\n\n\n\n<p><\/p>\n\n\n\n<p><\/p>\n\n\n\n<p><\/p>","protected":false},"excerpt":{"rendered":"<p>Being able to use the real-life physical environment as a canvas is one of the 
[&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":5249,"comment_status":"closed","ping_status":"open","sticky":true,"template":"","format":"standard","meta":{"om_disable_all_campaigns":false,"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[31,6],"tags":[],"class_list":["post-5183","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-articles","category-featured"],"aioseo_notices":[],"jetpack_featured_media_url":"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/07\/vlcsnap-2024-07-22-09h01m06s612a.png","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/mixedrealitynow.com\/ko\/wp-json\/wp\/v2\/posts\/5183","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/mixedrealitynow.com\/ko\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/mixedrealitynow.com\/ko\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/mixedrealitynow.com\/ko\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/mixedrealitynow.com\/ko\/wp-json\/wp\/v2\/comments?post=5183"}],"version-history":[{"count":16,"href":"https:\/\/mixedrealitynow.com\/ko\/wp-json\/wp\/v2\/posts\/5183\/revisions"}],"predecessor-version":[{"id":5452,"href":"https:\/\/mixedrealitynow.com\/ko\/wp-json\/wp\/v2\/posts\/5183\/revisions\/5452"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/mixedrealitynow.com\/ko\/wp-json\/wp\/v2\/media\/5249"}],"wp:attachment":[{"href":"https:\/\/mixedrealitynow.com\/ko\/wp-json\/wp\/v2\/media?parent=5183"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/mixedrealitynow.com\/ko\/wp-json\/wp\/v2\/categories?post=5183"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/mixedrealitynow.com\/ko\/wp-json\/wp\/v2\/tags?post
=5183"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}