{"id":4469,"date":"2022-12-04T22:24:48","date_gmt":"2022-12-04T22:24:48","guid":{"rendered":"https:\/\/dongyoonpark.com\/?p=4469"},"modified":"2025-12-30T01:21:04","modified_gmt":"2025-12-30T01:21:04","slug":"designing-type-in-space-for-hololens-2","status":"publish","type":"post","link":"https:\/\/mixedrealitynow.com\/ko\/designing-type-in-space-for-hololens-2","title":{"rendered":"\ud640\ub85c\ub80c\uc9882\uc6a9 \ud0c0\uc785 \uc778 \uc2a4\ud398\uc774\uc2a4 (Type In Space) \uc571 \ub514\uc790\uc778 \uc2a4\ud1a0\ub9ac"},"content":{"rendered":"<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"1920\" height=\"1080\" src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/TypeInSpace2Hero_1920x1080.jpg\" alt=\"\" class=\"wp-image-4656\" srcset=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/TypeInSpace2Hero_1920x1080.jpg 1920w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/TypeInSpace2Hero_1920x1080-300x169.jpg 300w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/TypeInSpace2Hero_1920x1080-1024x576.jpg 1024w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/TypeInSpace2Hero_1920x1080-768x432.jpg 768w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/TypeInSpace2Hero_1920x1080-1536x864.jpg 1536w\" sizes=\"auto, (max-width: 1920px) 100vw, 1920px\" \/><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"2896\">\ud504\ub85c\uc81d\ud2b8 \ubc30\uacbd<\/h2>\n\n\n\n<p>As a designer passionate about typography and spatial computing, I\u2019ve long been captivated by the idea of placing beautiful type into real-world space. 
With <a href=\"https:\/\/www.microsoft.com\/en-us\/hololens\" target=\"_blank\" rel=\"noopener\" title=\"\">Microsoft HoloLens<\/a>, you can anchor holographic objects in your physical environment \u2014 on tables, walls, or in mid-air \u2014 and walk around them just like real objects.<\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"Introducing Microsoft HoloLens 2\" width=\"640\" height=\"360\" src=\"https:\/\/www.youtube.com\/embed\/eqFqtAJMtYE?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<p>My journey began with <em><a href=\"https:\/\/mixedrealitynow.com\/ko\/designing-typography-insight-for-hololens\/\" target=\"_blank\" rel=\"noreferrer noopener\"><strong>Typography Insight for HoloLens (2016)<\/strong><\/a><\/em>, an early experiment that let users explore and manipulate holographic type in real space. That work evolved into <a href=\"https:\/\/mixedrealitynow.com\/ko\/type-in-space-explore-spatial-typography-in-mixed-reality-with-hololens\/\" target=\"_blank\" rel=\"noreferrer noopener\"><strong>Type In Space (2018)<\/strong><\/a>, a full app focused on spatial typography for HoloLens. 
<a href=\"https:\/\/mixedrealitynow.com\/ko\/designing-type-in-space-for-hololens-2\/?utm_source=chatgpt.com\" target=\"_blank\" rel=\"noreferrer noopener\"><\/a><\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"Type In Space - Lay out and Experience Type In Mixed Reality with HoloLens\" width=\"640\" height=\"360\" src=\"https:\/\/www.youtube.com\/embed\/qJyr4C6Weck?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<p id=\"a449\">Being able to see and interact with a holographic type in a real-world environment is one of the most magical experiences. With HoloLens 2, I had the opportunity to re-imagine this experience using the device\u2019s advanced input capabilities. Instead of indirect controls, users can now <strong>touch, grab, and manipulate type directly with their hands<\/strong> \u2014 a leap toward truly instinctual mixed reality interaction. 
<a href=\"https:\/\/mixedrealitynow.com\/ko\/designing-type-in-space-for-hololens-2\/?utm_source=chatgpt.com\" target=\"_blank\" rel=\"noreferrer noopener\"><\/a><\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"Type In Space for HoloLens 2 - Spatial Typography in Mixed Reality (2019)\" width=\"640\" height=\"360\" src=\"https:\/\/www.youtube.com\/embed\/tRoDAdLhO3I?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"8a53\">\ud640\ub85c\ub80c\uc988 2\uc758 \uc9c1\uad00\uc801\uc778 \uc778\ud130\ub809\uc158<\/h2>\n\n\n\n<p>HoloLens 2 introduced <strong>fully articulated hand-tracking and eye-tracking input<\/strong>, enabling direct, physical-like interaction with holograms that previously felt abstract. This shift changed the interaction model from controller or gesture-based input to something far more natural \u2014 users could reach out and engage with type the same way they do with physical objects. 
<\/p>\n\n\n\n<p id=\"436a\">This instinctual interaction was the foundational principle of the <em>Type In Space<\/em> redesign: make complex spatial typography feel <strong>intuitive, tactile, and expressive<\/strong>.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"fcfe\">MRTK: Components for Spatial Interactions and UI<\/h2>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"1050\" height=\"274\" src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_ZnCo9rEAd1S5qZUG2g8mBQ.png\" alt=\"\" class=\"wp-image-4780\" srcset=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_ZnCo9rEAd1S5qZUG2g8mBQ.png 1050w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_ZnCo9rEAd1S5qZUG2g8mBQ-300x78.png 300w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_ZnCo9rEAd1S5qZUG2g8mBQ-1024x267.png 1024w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_ZnCo9rEAd1S5qZUG2g8mBQ-768x200.png 768w\" sizes=\"auto, (max-width: 1050px) 100vw, 1050px\" \/><\/figure>\n\n\n\n<p id=\"2cc1\"><a href=\"https:\/\/github.com\/Microsoft\/MixedRealityToolkit-Unity\" rel=\"noreferrer noopener\" target=\"_blank\"><strong>MRTK (Mixed Reality Toolkit)<\/strong><\/a> is Microsoft\u2019s open-source project. MRTK-Unity provides the essential components and features for designing and developing mixed reality apps in Unity. The latest release of MRTK v2 supports HoloLens\/HoloLens 2, Windows Mixed Reality, and OpenVR platforms.<\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"MRTK(Mixed Reality Toolkit) v2.1.0\" width=\"640\" height=\"360\" src=\"https:\/\/www.youtube.com\/embed\/p_FI0u5o8cw?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<p>Because MRTK v2 was completely redesigned from the ground up, its input system is not compatible with MRTK v1 (HoloToolkit). The original version of Type In Space used HoloToolkit, so this time I started a new project on MRTK v2. Since most of the core interactions could be achieved with MRTK v2\u2019s components, the implementation went faster than I expected.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"2e91\">Text Object as Core Component<\/h2>\n\n\n\n<p id=\"012d\">At the heart of the experience is the holographic text object. I chose <strong>TextMesh Pro<\/strong> for its high-quality rendering and sharpness across distances \u2014 essential given the HoloLens\u2019s high pixel density. 
<a href=\"https:\/\/mixedrealitynow.com\/ko\/designing-type-in-space-for-hololens-2\/\" target=\"_blank\" rel=\"noreferrer noopener\"><\/a><\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"1920\" height=\"1080\" src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/06\/TypeInSpace122319C_NoText_Trim_Moment.jpg\" alt=\"\" class=\"wp-image-5203\" srcset=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/06\/TypeInSpace122319C_NoText_Trim_Moment.jpg 1920w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/06\/TypeInSpace122319C_NoText_Trim_Moment-300x169.jpg 300w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/06\/TypeInSpace122319C_NoText_Trim_Moment-1024x576.jpg 1024w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/06\/TypeInSpace122319C_NoText_Trim_Moment-768x432.jpg 768w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/06\/TypeInSpace122319C_NoText_Trim_Moment-1536x864.jpg 1536w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/06\/TypeInSpace122319C_NoText_Trim_Moment-18x10.jpg 18w\" sizes=\"auto, (max-width: 1920px) 100vw, 1920px\" \/><\/figure>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"8f71\"><strong>Text Mesh Pro (\ud14d\uc2a4\ud2b8 \ub9e4\uc2dc \ud504\ub85c)<\/strong><\/h3>\n\n\n\n<p id=\"738b\">HoloLens\ub294 47 PPD(Pixels Per Degree)\uc758 \uace0\ud574\uc0c1\ub3c4 \ub514\uc2a4\ud50c\ub808\uc774\ub97c \uac16\ucd94\uace0 \uc788\uc5b4 \uc120\uba85\ud558\uace0 \uc544\ub984\ub2e4\uc6b4 \ud14d\uc2a4\ud2b8\ub97c \ud45c\uc2dc\ud560 \uc218 \uc788\uc2b5\ub2c8\ub2e4. \uc774 \uace0\ud574\uc0c1\ub3c4\ub97c \uc801\uc808\ud558\uac8c \ud65c\uc6a9\ud558\ub824\uba74 \uc801\uc808\ud558\uac8c \ucd5c\uc801\ud654\ub41c \ud14d\uc2a4\ud2b8 \uad6c\uc131 \uc694\uc18c\ub97c \uc0ac\uc6a9\ud558\ub294 \uac83\uc774 \uc911\uc694\ud569\ub2c8\ub2e4. 
Unity\u2019s TextMesh Pro uses a technique called <a href=\"https:\/\/steamcdn-a.akamaihd.net\/apps\/valve\/2007\/SIGGRAPH2007_AlphaTestedMagnification.pdf\" rel=\"noreferrer noopener\" target=\"_blank\">Signed Distance Field<\/a> (SDF) rendering to display sharp, legible text regardless of viewing distance. For more detail, see the&nbsp;<a href=\"https:\/\/docs.microsoft.com\/en-us\/windows\/mixed-reality\/typography\" rel=\"noreferrer noopener\" target=\"_blank\"><strong>Typography guideline<\/strong><\/a>&nbsp;and&nbsp;<a href=\"https:\/\/docs.microsoft.com\/en-us\/windows\/mixed-reality\/text-in-unity\" rel=\"noreferrer noopener\" target=\"_blank\"><strong>Text in Unity<\/strong><\/a>&nbsp;on the Mixed Reality Dev Center.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"7701\">Direct Manipulation and Hand Interaction &#8211; <strong>Near &amp; Far Field Interactions<\/strong><\/h3>\n\n\n\n<p id=\"fe3b\">Using MRTK\u2019s manipulation handlers, users can grab and transform text with one or two hands. This enables expressive positioning, scaling, and rotation directly in 3D space \u2014 no indirect controls or menus needed.<\/p>\n\n\n\n<figure class=\"wp-block-video\"><video height=\"480\" style=\"aspect-ratio: 854 \/ 480;\" width=\"854\" controls src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/06\/NearFarManipulation.mp4\"><\/video><\/figure>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"931f\"><strong>Bounding Box<\/strong><\/h3>\n\n\n\n<p id=\"5bd7\">The bounding box is the standard interface for precise scaling and rotation of an object on HoloLens. 
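The near/far manipulation and bounding-box setup described in this section can be sketched roughly as below. This is a hypothetical MRTK v2 sketch, not the app's actual source; component and property names follow MRTK v2 but may differ slightly between releases.

```csharp
// Hypothetical setup sketch (MRTK v2): makes a text object grabbable
// near (fingertips) and far (hand rays), with a bounding box whose
// corner handles indicate selection and support scale/rotation.
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

public class GrabbableText : MonoBehaviour
{
    void Awake()
    {
        // Near interaction: enables direct articulated-hand grabs.
        gameObject.AddComponent<NearInteractionGrabbable>();

        // One- and two-handed move/rotate/scale, near and far field.
        var manipulation = gameObject.AddComponent<ManipulationHandler>();
        manipulation.ManipulationType =
            ManipulationHandler.HandMovementType.OneAndTwoHanded;

        // Bounding box with configurable handle visuals and behaviors.
        var bounds = gameObject.AddComponent<BoundingBox>();
        bounds.BoundingBoxActivation =
            BoundingBox.BoundingBoxActivationType.ActivateOnStart;
    }
}
```

In practice these components are usually configured in the Unity Inspector rather than from code; the script form is shown here only to make the pieces explicit.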
For the Type In Space app, I used it to indicate the currently selected text objects by displaying the corner handles.&nbsp;<a href=\"https:\/\/microsoft.github.io\/MixedRealityToolkit-Unity\/Documentation\/README_BoundingBox.html\" rel=\"noreferrer noopener\" target=\"_blank\"><strong>MRTK\u2019s Bounding Box<\/strong><\/a>&nbsp;provides various configurable options for the visual representation of the handles as well as the behaviors.<\/p>\n\n\n\n<p><\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"6892\">Menu UI for Text Properties<\/h2>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"1050\" height=\"530\" src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_80ZXTLzADug9gR01NbKOJw.png\" alt=\"\" class=\"wp-image-4773\" srcset=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_80ZXTLzADug9gR01NbKOJw.png 1050w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_80ZXTLzADug9gR01NbKOJw-300x151.png 300w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_80ZXTLzADug9gR01NbKOJw-1024x517.png 1024w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_80ZXTLzADug9gR01NbKOJw-768x388.png 768w\" sizes=\"auto, (max-width: 1050px) 100vw, 1050px\" \/><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"bcdf\">Button<\/h2>\n\n\n\n<p id=\"dd6d\">In HoloLens 2, button input occurs through direct hand-tracking without any physical surface. As a result, users receive no tactile confirmation at the moment of activation. 
To maintain clarity and reduce ambiguity, the button interaction model relies heavily on strong visual affordances, touch-proximity cues, and spatial audio feedback to simulate a sense of \u201cpress.\u201d<\/p>\n\n\n\n<figure class=\"wp-block-video\"><video height=\"480\" style=\"aspect-ratio: 854 \/ 480;\" width=\"854\" controls src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/06\/HL2Button.mp4\"><\/video><\/figure>\n\n\n\n<p id=\"dd2b\"><a href=\"https:\/\/microsoft.github.io\/MixedRealityToolkit-Unity\/Documentation\/README_Button.html\" target=\"_blank\" rel=\"noreferrer noopener\"><strong>MRTK\u2019s HoloLens 2 style button<\/strong><\/a>&nbsp;provides rich visual\/audio cues and handles complex logic for the speed\/trajectory\/direction of the finger movements. Visual feedback includes proximity-based lighting, highlight box, compressing front cage, hover light on the surface, pulse effect on press event trigger, and the fingertip cursor.<\/p>\n\n\n\n<p><\/p>\n\n\n\n<p><\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"6716\">Hand Menu<\/h2>\n\n\n\n<p>In the original version, the text tools were accessed through a floating tag-along menu that followed the user to remain always available. With HoloLens 2, a new interaction pattern \u2014 the <em>hand menu<\/em> \u2014 emerged. Hand menus use hand-tracking to surface contextual controls near the user\u2019s hand only when needed. This eliminates UI clutter, shortens interaction travel distance, and lets users stay focused on the content.<br>I adopted this pattern for text property controls and tested its effectiveness. 
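A palm-up hand menu like this is typically wired up with MRTK's hand-constraint solver. The sketch below is a hypothetical minimal version (MRTK v2 names; exact properties and events may differ by release), not the app's actual implementation.

```csharp
// Hypothetical hand-menu sketch (MRTK v2 solvers): shows the menu
// panel near the user's hand only while the palm faces the user.
using Microsoft.MixedReality.Toolkit.Utilities;
using Microsoft.MixedReality.Toolkit.Utilities.Solvers;
using UnityEngine;

public class HandMenuSetup : MonoBehaviour
{
    [SerializeField] private GameObject menuVisuals; // buttons/panel root

    void Awake()
    {
        // Track hand joints so the solver can follow the palm.
        var handler = gameObject.AddComponent<SolverHandler>();
        handler.TrackedTargetType = TrackedObjectType.HandJoint;

        // Activate the menu only on the palm-up gesture; hide it
        // when the hand drops, keeping the scene free of UI clutter.
        var constraint = gameObject.AddComponent<HandConstraintPalmUp>();
        constraint.OnHandActivate.AddListener(() => menuVisuals.SetActive(true));
        constraint.OnHandDeactivate.AddListener(() => menuVisuals.SetActive(false));
    }
}
```

The world-lock option mentioned below can then be as simple as disabling the SolverHandler while the menu is pinned, so it stops following the hand.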
Below are Mixed Reality Capture clips recorded on the HoloLens 2 device.<\/p>\n\n\n\n<figure class=\"wp-block-video\"><video height=\"480\" style=\"aspect-ratio: 854 \/ 480;\" width=\"854\" controls src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/06\/HandMenu01.mp4\"><\/video><\/figure>\n\n\n\n<p id=\"914f\">As shown in the capture, the interaction works well \u2014 users can simply raise their palm to reveal the menu, adjust text properties, and continue working. However, repeatedly holding the palm up can introduce arm and shoulder fatigue, especially when making multiple adjustments. To address this, I added an option to \u201cworld-lock\u201d the menu, allowing users to pin it in place or pull it out into the environment for longer editing sessions.<\/p>\n\n\n\n<figure class=\"wp-block-video\"><video height=\"480\" style=\"aspect-ratio: 854 \/ 480;\" width=\"854\" controls src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/06\/HandMenu02.mp4\"><\/video><\/figure>\n\n\n\n<figure class=\"wp-block-video\"><video height=\"480\" style=\"aspect-ratio: 854 \/ 480;\" width=\"854\" controls src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/06\/HandMenu04-Detach.mp4\"><\/video><\/figure>\n\n\n\n<p id=\"6ed9\">While the hand menu felt natural when working with text objects close to the user, usability issues emerged in far-field scenarios. To edit text placed several meters away, I needed to maintain visual attention on the distant object while simultaneously checking the menu near my hand. This forced <strong>continuous eye-focus switching between depth planes<\/strong>, creating noticeable discomfort and eye fatigue within a short period. 
It highlighted a core ergonomic challenge unique to spatial UX \u2014 <em>where UI lives in depth matters just as much as how it behaves<\/em>.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"1050\" height=\"553\" src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_jhXWWFShqkNoLB8H96p2jQ.png\" alt=\"\" class=\"wp-image-4776\" srcset=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_jhXWWFShqkNoLB8H96p2jQ.png 1050w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_jhXWWFShqkNoLB8H96p2jQ-300x158.png 300w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_jhXWWFShqkNoLB8H96p2jQ-1024x539.png 1024w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_jhXWWFShqkNoLB8H96p2jQ-768x404.png 768w\" sizes=\"auto, (max-width: 1050px) 100vw, 1050px\" \/><figcaption class=\"wp-element-caption\">Focal depth switching between the target object and the menu causes eye strain<\/figcaption><\/figure>\n\n\n\n<p id=\"969a\">One potential solution was to attach the text-related menu directly to each text object. However, for content positioned far away, the menu would need to scale up significantly to remain usable with hand rays or gaze-based pointers. This would visually dominate the scene and compete with the text itself, which should remain the hero. Additionally, I wanted to preserve direct, fingertip interaction for near-field editing \u2014 something that a purely object-attached menu wouldn\u2019t fully support.<\/p>\n\n\n\n<p><\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"594e\">In-Between Menu with Angular Scaling<\/h2>\n\n\n\n<p id=\"7723\">My solution was to place the text property menu <strong><em>between<\/em> the target object and the user\u2019s eyes<\/strong> \u2014 positioned closer to the text to minimize depth-switching. 
After iterating on distance values, the optimal placement was at roughly <strong>30% of the distance from the object (70% of the distance from the headset)<\/strong>. This allowed comfortable direct hand interaction in the near-field while maintaining readability for far-field objects. The same positioning and scale remain usable within the smaller field of view of the first-generation HoloLens.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"1050\" height=\"553\" src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_6Ub2gkQPj-tDUkvBcjzy_w.png\" alt=\"\" class=\"wp-image-4772\" srcset=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_6Ub2gkQPj-tDUkvBcjzy_w.png 1050w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_6Ub2gkQPj-tDUkvBcjzy_w-300x158.png 300w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_6Ub2gkQPj-tDUkvBcjzy_w-1024x539.png 1024w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_6Ub2gkQPj-tDUkvBcjzy_w-768x404.png 768w\" sizes=\"auto, (max-width: 1050px) 100vw, 1050px\" \/><figcaption class=\"wp-element-caption\">Minimized focal depth switching between the target object and the menu. The menu automatically scales up\/down to maintain the target size based on the distance.<\/figcaption><\/figure>\n\n\n\n<figure class=\"wp-block-video\"><video height=\"480\" style=\"aspect-ratio: 854 \/ 480;\" width=\"854\" controls src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/06\/Menu_AngularScaling.mp4\"><\/video><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<p>MRTK already includes a set of spatial Solvers for world-anchored UI, and fortunately one of them \u2014 the <strong>InBetween Solver<\/strong> \u2014 provided exactly the positioning pattern I needed. It enables UI to sit between two objects, with a tunable percentage that determines how close it is to either side. 
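The in-between placement with angular scaling can be sketched as below. This is a hypothetical minimal version using MRTK v2's InBetween and ConstantViewSize solvers; the exact offset semantics and property names vary by MRTK release, so treat the values as illustrative.

```csharp
// Hypothetical sketch (MRTK v2 solvers): positions the property menu
// between the user's head and the selected text object, and keeps its
// apparent (angular) size constant at any depth.
using Microsoft.MixedReality.Toolkit.Utilities;
using Microsoft.MixedReality.Toolkit.Utilities.Solvers;
using UnityEngine;

public class InBetweenMenu : MonoBehaviour
{
    [SerializeField] private Transform targetTextObject; // selected text

    void Awake()
    {
        // First tracked target: the user's head (the main camera).
        var handler = gameObject.AddComponent<SolverHandler>();
        handler.TrackedTargetType = TrackedObjectType.Head;

        // Sit part-way along the line between the two endpoints;
        // ~30% from the object / 70% from the headset (tune to taste).
        var inBetween = gameObject.AddComponent<InBetween>();
        inBetween.SecondTransformOverride = targetTextObject;
        inBetween.PartwayOffset = 0.3f;

        // Scale the menu with distance so its on-screen size is stable,
        // enabling direct touch up close and hand rays at a distance.
        gameObject.AddComponent<ConstantViewSize>();
    }
}
```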
To preserve ergonomic accessibility, I paired it with the <strong>ConstantViewSize Solver<\/strong>, which keeps the menu\u2019s visual footprint consistent at any depth. The result is a menu that expands when positioned farther away and compresses when closer to the user, enabling smooth transition between interaction modalities: direct touch in near-field and hand-ray or air-pinch when farther away.<\/p>\n\n\n\n<p><\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"5fff\">UX Elements: Main Menu for the global features<\/h2>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"1050\" height=\"437\" src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_n7I98WdsLHgPUqWqOJRnRA.png\" alt=\"\" class=\"wp-image-4777\" srcset=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_n7I98WdsLHgPUqWqOJRnRA.png 1050w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_n7I98WdsLHgPUqWqOJRnRA-300x125.png 300w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_n7I98WdsLHgPUqWqOJRnRA-1024x426.png 1024w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_n7I98WdsLHgPUqWqOJRnRA-768x320.png 768w\" sizes=\"auto, (max-width: 1050px) 100vw, 1050px\" \/><figcaption class=\"wp-element-caption\">Main Menu<\/figcaption><\/figure>\n\n\n\n<p id=\"a0df\">For global, non\u2013object-specific actions, I kept those buttons in the hand menu. 
As described earlier, the menu can be grabbed and pulled out to become world-locked when longer interactions are required.<\/p>\n\n\n\n<figure class=\"wp-block-video\"><video height=\"480\" style=\"aspect-ratio: 854 \/ 480;\" width=\"854\" controls src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/06\/GlobalMainMenu.mp4\"><\/video><\/figure>\n\n\n\n<p id=\"3cf5\">The main menu includes:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>New Text | Clear Scene | Save &amp; Load Scene<\/li>\n\n\n\n<li>Spatial Mapping | Mesh Visualization | Snap to Surface<\/li>\n\n\n\n<li>Physics: Rigid Body | Kinematic | Slider UI for gravity(force)<\/li>\n\n\n\n<li>Grab &amp; Duplicate Toggle<\/li>\n\n\n\n<li>Random Composition<\/li>\n\n\n\n<li>About<\/li>\n<\/ul>\n\n\n\n<p id=\"e7f2\">Below is the latest design iteration \u2014 a more compact version that automatically world-locks when the user drops their hand. Once world-locked, the menu becomes easier to use for multi-step adjustments and interaction with multiple UI controls.<\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-rich is-provider-twitter wp-block-embed-twitter\"><div class=\"wp-block-embed__wrapper\">\n<div class=\"embed-twitter\"><blockquote class=\"twitter-tweet\" data-width=\"550\" data-dnt=\"true\"><p lang=\"en\" dir=\"ltr\">Hand menu iteration &#8211; switching. 
<a href=\"https:\/\/twitter.com\/hashtag\/TypeInSpace?src=hash&amp;ref_src=twsrc%5Etfw\">#TypeInSpace<\/a> for <a href=\"https:\/\/twitter.com\/hashtag\/HoloLens2?src=hash&amp;ref_src=twsrc%5Etfw\">#HoloLens2<\/a><br>Design Story on Medium: <a href=\"https:\/\/t.co\/Dr0A2VS7Ug\">https:\/\/t.co\/Dr0A2VS7Ug<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/MadeWithMRTK?src=hash&amp;ref_src=twsrc%5Etfw\">#MadeWithMRTK<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/MRTK?src=hash&amp;ref_src=twsrc%5Etfw\">#MRTK<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/MixedReality?src=hash&amp;ref_src=twsrc%5Etfw\">#MixedReality<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/AugmentedReality?src=hash&amp;ref_src=twsrc%5Etfw\">#AugmentedReality<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/VirtualReality?src=hash&amp;ref_src=twsrc%5Etfw\">#VirtualReality<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/AR?src=hash&amp;ref_src=twsrc%5Etfw\">#AR<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/MR?src=hash&amp;ref_src=twsrc%5Etfw\">#MR<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/VR?src=hash&amp;ref_src=twsrc%5Etfw\">#VR<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/XR?src=hash&amp;ref_src=twsrc%5Etfw\">#XR<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/Typography?src=hash&amp;ref_src=twsrc%5Etfw\">#Typography<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/UX?src=hash&amp;ref_src=twsrc%5Etfw\">#UX<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/Design?src=hash&amp;ref_src=twsrc%5Etfw\">#Design<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/Unity?src=hash&amp;ref_src=twsrc%5Etfw\">#Unity<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/Unity3d?src=hash&amp;ref_src=twsrc%5Etfw\">#Unity3d<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/MadeWithUnity?src=hash&amp;ref_src=twsrc%5Etfw\">#MadeWithUnity<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/HoloLens?src=hash&amp;ref_src=twsrc%5Etfw\">#HoloLens<\/a> <a href=\"https:\/\/t.co\/ar2Iz271hT\">pic.twitter.com\/ar2Iz271hT<\/a><\/p>&mdash; Yoon Park 
(\ubc15\ub3d9\uc724) (@cre8ivepark) <a href=\"https:\/\/twitter.com\/cre8ivepark\/status\/1230702599759753216?ref_src=twsrc%5Etfw\">February 21, 2020<\/a><\/blockquote><script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/div>\n<\/div><\/figure>\n\n\n\n<figure class=\"wp-block-embed is-type-rich is-provider-twitter wp-block-embed-twitter\"><div class=\"wp-block-embed__wrapper\">\n<div class=\"embed-twitter\"><blockquote class=\"twitter-tweet\" data-width=\"550\" data-dnt=\"true\"><p lang=\"en\" dir=\"ltr\">Weekend experiments: elastic menu using <a href=\"https:\/\/twitter.com\/hashtag\/MRTK?src=hash&amp;ref_src=twsrc%5Etfw\">#MRTK<\/a>&#39;s elastic systems(<a href=\"https:\/\/t.co\/qE8yuN9pqV\">https:\/\/t.co\/qE8yuN9pqV<\/a>) contributed by Finn Sinclair (<a href=\"https:\/\/t.co\/kYg8rdOwZo\">https:\/\/t.co\/kYg8rdOwZo<\/a>) <a href=\"https:\/\/t.co\/3wfI4euUB6\">https:\/\/t.co\/3wfI4euUB6<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/HoloLens2?src=hash&amp;ref_src=twsrc%5Etfw\">#HoloLens2<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/MixedReality?src=hash&amp;ref_src=twsrc%5Etfw\">#MixedReality<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/AugmentedReality?src=hash&amp;ref_src=twsrc%5Etfw\">#AugmentedReality<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/VirtualReality?src=hash&amp;ref_src=twsrc%5Etfw\">#VirtualReality<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/Unity?src=hash&amp;ref_src=twsrc%5Etfw\">#Unity<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/SpatialInteraction?src=hash&amp;ref_src=twsrc%5Etfw\">#SpatialInteraction<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/AR?src=hash&amp;ref_src=twsrc%5Etfw\">#AR<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/VR?src=hash&amp;ref_src=twsrc%5Etfw\">#VR<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/XR?src=hash&amp;ref_src=twsrc%5Etfw\">#XR<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/uxdesign?src=hash&amp;ref_src=twsrc%5Etfw\">#uxdesign<\/a> <a 
href=\"https:\/\/twitter.com\/hashtag\/UX?src=hash&amp;ref_src=twsrc%5Etfw\">#UX<\/a> <a href=\"https:\/\/t.co\/z8vAmg630L\">pic.twitter.com\/z8vAmg630L<\/a><\/p>&mdash; Yoon Park (\ubc15\ub3d9\uc724) (@cre8ivepark) <a href=\"https:\/\/twitter.com\/cre8ivepark\/status\/1290058124800299010?ref_src=twsrc%5Etfw\">August 2, 2020<\/a><\/blockquote><script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/div>\n<\/div><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"7c99\">UX Elements: Annotation<\/h2>\n\n\n\n<p id=\"4bb1\">One powerful aspect of spatial text is its ability to annotate physical objects \u2014 labels, notes, or explanations that live in the space where they are relevant. To reinforce this connection, I introduced an optional visual linking system consisting of a sphere and a connector line. The user can reposition both the text anchor and the sphere to create clear, meaningful annotation layouts. The implementation leverages MRTK\u2019s Tooltip component, which provides a robust base for line-based UI relationships.<\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"Type In Space for HoloLens 2 - Annotation\" width=\"640\" height=\"360\" src=\"https:\/\/www.youtube.com\/embed\/e8nchhuqFrw?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"c5e8\">UX Elements: Spatial Mapping &amp; Surface Magnetism<\/h2>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe 
loading=\"lazy\" title=\"Type In Space for HoloLens 2 - Snap to Surface\" width=\"640\" height=\"360\" src=\"https:\/\/www.youtube.com\/embed\/NRzO1yfxppk?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<p>Spatial Mapping is one of HoloLens\u2019 most compelling capabilities \u2014 it enables holograms to understand and interact with real-world surfaces. In the original version of <em>Type In Space<\/em>, text objects were placed on physical surfaces using a Gaze cursor. With HoloLens 2, I updated this to use the hand-ray endpoint, allowing users to attach and move text directly along surfaces with far-interaction input. Placement and movement are toggled via air-tap.<\/p>\n\n\n\n<p>In MRTK, Spatial Mapping is exposed through the <strong>Spatial Awareness<\/strong> system, which provides the environment mesh. To support snapping behavior, I used the <strong>Surface Magnetism<\/strong> solver, allowing text objects to automatically adhere to detected surfaces.<\/p>\n\n\n\n<p><\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"5c8e\">UX Elements: Physics<\/h2>\n\n\n\n<p id=\"8f3c\">With Spatial Mapping providing real physical surfaces, I explored adding physics-based behavior to the text. 
In addition to the basic gravity toggle from the original version, the new iteration introduces a slider that lets users control both the magnitude and direction of the force\u2014allowing text to fall, float, or collide dynamically with real-world objects.<\/p>\n\n\n\n<figure class=\"wp-block-video\"><video height=\"480\" style=\"aspect-ratio: 854 \/ 480;\" width=\"854\" controls src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/06\/Physics.mp4\"><\/video><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"5e29\">UX Elements: Text Input with keyboard and speech<\/h2>\n\n\n\n<p id=\"45f0\">HoloLens 2\u2019s direct hand interaction dramatically improves text entry as well. Instead of relying on indirect gestures or head gaze selection, users can now type with their fingers on a holographic keyboard \u2014 much like a physical keyboard, but in mid-air.<\/p>\n\n\n\n<figure class=\"wp-block-video\"><video height=\"480\" style=\"aspect-ratio: 854 \/ 480;\" width=\"854\" controls src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/06\/TypeInSpace_HLKeyboard.mp4\"><\/video><\/figure>\n\n\n\n<p id=\"45f0\">Of course, speech remains available for dictation. 
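<\/p>\n\n\n\n<p>On Windows devices, dictation can be wired up with Unity\u2019s built-in DictationRecognizer. A rough sketch of turning each recognized phrase into a new text object might look like this (the CreateTextObject hook is hypothetical, standing in for the app\u2019s own spawning logic):<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>using UnityEngine;\nusing UnityEngine.Windows.Speech;\n\n\/\/ Rough sketch: start Windows dictation and hand each recognized\n\/\/ phrase to a callback that spawns a text object in the scene.\npublic class DictationInput : MonoBehaviour\n{\n    \/\/ Hypothetical spawn hook supplied by the app.\n    public System.Action&lt;string&gt; CreateTextObject;\n\n    private DictationRecognizer recognizer;\n\n    void OnEnable()\n    {\n        recognizer = new DictationRecognizer();\n        recognizer.DictationResult += (text, confidence) =&gt;\n        {\n            CreateTextObject?.Invoke(text);\n        };\n        recognizer.Start();\n    }\n\n    void OnDisable()\n    {\n        recognizer.Stop();\n        recognizer.Dispose();\n    }\n}<\/code><\/pre>\n\n\n\n<p>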
The system keyboard includes a built-in speech input button, and MRTK also provides examples for using both the system keyboard and speech input in applications.<\/p>\n\n\n\n<figure class=\"wp-block-video\"><video height=\"480\" style=\"aspect-ratio: 854 \/ 480;\" width=\"854\" controls src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2024\/06\/TypeInSpace_SpeechInput.mp4\"><\/video><\/figure>\n\n\n\n<p id=\"91ac\">Below is an example of dictation input, powered by the Windows speech service.<\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-rich is-provider-twitter wp-block-embed-twitter\"><div class=\"wp-block-embed__wrapper\">\n<div class=\"embed-twitter\"><blockquote class=\"twitter-tweet\" data-width=\"550\" data-dnt=\"true\"><p lang=\"en\" dir=\"ltr\">Testing faster new text creation with dictation Input, using <a href=\"https:\/\/twitter.com\/hashtag\/MRTK?src=hash&amp;ref_src=twsrc%5Etfw\">#MRTK<\/a>&#39;s Windows Dictation. Customized with a visual state indicator. <a href=\"https:\/\/twitter.com\/hashtag\/TypeInSpace?src=hash&amp;ref_src=twsrc%5Etfw\">#TypeInSpace<\/a> for <a href=\"https:\/\/twitter.com\/hashtag\/HoloLens2?src=hash&amp;ref_src=twsrc%5Etfw\">#HoloLens2<\/a><br>Design Story on Medium: <a href=\"https:\/\/t.co\/Dr0A2VS7Ug\">https:\/\/t.co\/Dr0A2VS7Ug<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/Typography?src=hash&amp;ref_src=twsrc%5Etfw\">#Typography<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/MixedReality?src=hash&amp;ref_src=twsrc%5Etfw\">#MixedReality<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/AugmentedReality?src=hash&amp;ref_src=twsrc%5Etfw\">#AugmentedReality<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/VirtualReality?src=hash&amp;ref_src=twsrc%5Etfw\">#VirtualReality<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/Unity?src=hash&amp;ref_src=twsrc%5Etfw\">#Unity<\/a> <a href=\"https:\/\/twitter.com\/hashtag\/AR?src=hash&amp;ref_src=twsrc%5Etfw\">#AR<\/a> <a 
href=\"https:\/\/t.co\/e840DEa9eB\">pic.twitter.com\/e840DEa9eB<\/a><\/p>&mdash; Yoon Park (\ubc15\ub3d9\uc724) (@cre8ivepark) <a href=\"https:\/\/twitter.com\/cre8ivepark\/status\/1236883735196258304?ref_src=twsrc%5Etfw\">March 9, 2020<\/a><\/blockquote><script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/div>\n<\/div><\/figure>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"51a1\">UX Elements: Grab &amp; Duplicate<\/h2>\n\n\n\n<p id=\"eed1\">In the original version, duplication was handled through a simple \u201cduplicate\u201d button, which created a new text object with identical properties and placed it at a small offset. This resulted in a visually interesting array-style effect.<\/p>\n\n\n\n<p id=\"0268\">In the latest iteration, I redesigned duplication to be far more natural: users can simply grab a text object and pull to create a copy. This direct manipulation approach feels intuitive and fluid \u2014 and when repeated, it produces beautiful, world-locked trails of holographic text.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"5303\">Supporting HoloLens 1st gen and Windows Mixed Reality VR devices<\/h2>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"1050\" height=\"590\" src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_G1_wJDcLOcr4oP0EAeEmtQ.jpeg\" alt=\"\" class=\"wp-image-4774\" srcset=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_G1_wJDcLOcr4oP0EAeEmtQ.jpeg 1050w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_G1_wJDcLOcr4oP0EAeEmtQ-300x169.jpeg 300w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_G1_wJDcLOcr4oP0EAeEmtQ-1024x575.jpeg 1024w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_G1_wJDcLOcr4oP0EAeEmtQ-768x432.jpeg 768w\" sizes=\"auto, (max-width: 1050px) 100vw, 1050px\" \/><figcaption class=\"wp-element-caption\">Layout example<\/figcaption><\/figure>\n\n\n\n<p 
id=\"90a1\">A key benefit of using MRTK is its cross-platform support. Its interaction building blocks and UI components work across input systems, from HoloLens 1\u2019s GGV (Gaze, Gesture, Voice) to Windows Mixed Reality headsets with motion controllers.<br><\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"1050\" height=\"227\" src=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_Pc9ENCYk0PiioRB9Oz0tZQ.jpeg\" alt=\"\" class=\"wp-image-4778\" srcset=\"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_Pc9ENCYk0PiioRB9Oz0tZQ.jpeg 1050w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_Pc9ENCYk0PiioRB9Oz0tZQ-300x65.jpeg 300w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_Pc9ENCYk0PiioRB9Oz0tZQ-1024x221.jpeg 1024w, https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/1_Pc9ENCYk0PiioRB9Oz0tZQ-768x166.jpeg 768w\" sizes=\"auto, (max-width: 1050px) 100vw, 1050px\" \/><\/figure>\n\n\n\n<p id=\"4877\">The text-property menu (the In-Between menu) required no changes to function on HoloLens 1 \u2014 its fixed placement relative to the selected text keeps it visible even within the device\u2019s smaller FOV, and motion-controller pointer input in VR also works seamlessly.<\/p>\n\n\n\n<p id=\"a3f3\">Because hand tracking is not available on HoloLens 1, I converted the Main Menu into a floating tag-along menu with a pin\/unpin toggle. 
Aside from that minor adjustment, the app shipped to both HoloLens 1 and HoloLens 2 from a single Unity project.<\/p>\n\n\n\n<p><\/p>","protected":false},"excerpt":{"rendered":"<p>Background As a designer passionate about typography and spatial computing, I\u2019ve long been captivated by [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":4656,"comment_status":"closed","ping_status":"open","sticky":true,"template":"","format":"standard","meta":{"om_disable_all_campaigns":false,"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[31,21],"tags":[],"class_list":["post-4469","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-articles","category-projects"],"aioseo_notices":[],"jetpack_featured_media_url":"https:\/\/mixedrealitynow.com\/wp-content\/uploads\/2022\/12\/TypeInSpace2Hero_1920x1080.jpg","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/mixedrealitynow.com\/ko\/wp-json\/wp\/v2\/posts\/4469","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/mixedrealitynow.com\/ko\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/mixedrealitynow.com\/ko\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/mixedrealitynow.com\/ko\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/mixedrealitynow.com\/ko\/wp-json\/wp\/v2\/comments?post=4469"}],"version-history":[{"count":16,"href":"https:\/\/mixedrealitynow.com\/ko\/wp-json\/wp\/v2\/posts\/4469\/revisions"}],"predecessor-version":[{"id":5509,"href":"https:\/\/mixedrealitynow.com\/ko\/wp-json\/wp\/v2\/posts\/4469\/revisions\/5509"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/mixedrealitynow.com\/ko\/wp-json\/wp\/v2\/media\/4656"}],"wp:attachment":[{"href":"https:\/\/mixedrealitynow.com\/ko
\/wp-json\/wp\/v2\/media?parent=4469"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/mixedrealitynow.com\/ko\/wp-json\/wp\/v2\/categories?post=4469"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/mixedrealitynow.com\/ko\/wp-json\/wp\/v2\/tags?post=4469"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}