This is my personal experimental project for HoloLens 2. It explores how volumetric data can be represented in physical space with direct hand-tracking input in mixed reality on HoloLens 2. It is based on MRTK (Microsoft's Mixed Reality Toolkit).
The project also works with the first-generation HoloLens (use the static menu instead of the hand menu).
I have published the Unity project on GitHub:
ESRI’s Coronavirus COVID-19 Cases feature layer
The DataVisualizer.cs script contains the code for retrieving and parsing the JSON data and visualizing it with graphs. Graph elements are added to GraphContainerConfirmed, GraphContainerRecovered, GraphContainerFatal, and LabelContainer. CreateMeshes() generates the graphs for the three data values and the text labels. The main menu's radio buttons simply show/hide the GraphContainer objects.
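The fetch-and-parse flow can be sketched roughly as below. This is a minimal illustration, not the project's actual code: the query URL placeholder and the attribute field names (`Lat`, `Long_`, `Confirmed`) are assumptions based on a typical ArcGIS feature-layer query response.

```csharp
// Hypothetical sketch of the retrieve-and-parse step, assuming an
// ArcGIS-style feature layer response; field names are illustrative.
using System.Collections;
using SimpleJSON;
using UnityEngine;
using UnityEngine.Networking;

public class DataFetchSketch : MonoBehaviour
{
    [SerializeField] private string queryUrl = "<feature layer query URL>";

    private IEnumerator Start()
    {
        using (UnityWebRequest request = UnityWebRequest.Get(queryUrl))
        {
            yield return request.SendWebRequest();
            if (request.result != UnityWebRequest.Result.Success)
            {
                Debug.LogError(request.error);
                yield break;
            }

            JSONNode root = JSON.Parse(request.downloadHandler.text);
            foreach (JSONNode feature in root["features"])
            {
                JSONNode attributes = feature["attributes"];
                float lat = attributes["Lat"].AsFloat;   // assumed field name
                float lon = attributes["Long_"].AsFloat; // assumed field name
                int confirmed = attributes["Confirmed"].AsInt;
                // CreateMeshes() would turn these values into bar meshes
                // parented under the GraphContainer objects.
            }
        }
    }
}
```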
The menu follows the user with tag-along behavior, provided by MRTK's RadialView solver. The pin button toggles the tag-along behavior. The menu's backplate can be grabbed and moved; grabbing and moving the menu automatically disables tag-along and world-locks the menu.
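The pin toggle can be as simple as enabling/disabling the solver. A minimal sketch, assuming the menu object carries MRTK's SolverHandler and RadialView components (the method name is illustrative):

```csharp
// Minimal sketch of the pin button's tag-along toggle.
using Microsoft.MixedReality.Toolkit.Utilities.Solvers;
using UnityEngine;

public class MenuPinSketch : MonoBehaviour
{
    [SerializeField] private RadialView radialView;

    // Wired to the pin button's click event in the Inspector.
    public void ToggleTagAlong()
    {
        // Disabling the solver leaves the menu world-locked at its
        // current pose; re-enabling restores the follow behavior.
        radialView.enabled = !radialView.enabled;
    }
}
```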
The menu's toggle buttons show/hide GraphContainer and LabelContainer. Sliders configure the earth rendering options:
- Earth’s color saturation level
- Cloud opacity
- Sea color saturation level
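A slider handler for these options might look like the sketch below, assuming MRTK's PinchSlider UI. The shader property name `_Saturation` is an assumption for illustration, not taken from the project's actual shaders.

```csharp
// Hypothetical handler for one of the earth rendering sliders.
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

public class EarthSliderSketch : MonoBehaviour
{
    [SerializeField] private Renderer earthRenderer;

    // Wired to PinchSlider.OnValueUpdated in the Inspector.
    public void OnSaturationChanged(SliderEventData eventData)
    {
        // SliderEventData.NewValue is normalized to the 0..1 range.
        earthRenderer.material.SetFloat("_Saturation", eventData.NewValue);
    }
}
```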
I was able to create this project using several amazing open-source projects:
- MRTK (Mixed Reality Toolkit) (http://aka.ms/MRTK)
- Unity3DGlobe (https://github.com/Dandarawy/Unity3D-Globe)
- Moon and Earth (https://github.com/keijiro/MoonAndEarth)
- SimpleJSON (https://github.com/Bunny83/SimpleJSON)
- Near interactions with direct grab/move/rotate (one or two-handed)
- Far interactions using hand ray (one or two-handed)
- Main menu to switch the data, change earth rendering options
- Toggle graph, text label
- Toggle auto-rotate
- Data normalization & polish needed
- Sometimes two-handed manipulation shrinks the globe so small that it cannot be scaled back up properly. Use the hand ray to make it bigger again.
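One possible mitigation for the shrinking-globe issue would be to clamp the manipulation scale with MRTK's MinMaxScaleConstraint. This is a sketch of the idea, not a confirmed fix; the min/max values are illustrative.

```csharp
// Sketch: clamp the globe's scale range so two-handed manipulation
// cannot shrink it below a grabbable size.
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

public class GlobeScaleClampSketch : MonoBehaviour
{
    private void Start()
    {
        var constraint = gameObject.AddComponent<MinMaxScaleConstraint>();
        constraint.ScaleMinimum = 0.2f; // illustrative lower bound
        constraint.ScaleMaximum = 3.0f; // illustrative upper bound
    }
}
```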
Mixed Reality blends computer-generated digital content with the real-world physical environment. With advancements in technology, […]
Augmented reality (AR) is a technology that allows users to see and interact with digital […]
Virtual reality (VR) is a computer-generated simulation of a three-dimensional environment that can be interacted […]
With the support of OpenXR, it is now easy to use MRTK with Meta Quest […]
Hand tracking input is a technology that allows a computer or device to track the […]
6DOF stands for six degrees of freedom, and it refers to the ability of an […]
Simultaneous Localization and Mapping (SLAM) is a technology that allows a device, such as a […]