Building Apps with a Spatial Stylus Input Device for Quest – Logitech MX Ink

As a designer, being able to put a brush stroke on a three-dimensional canvas was one of the most exciting experiences I had when I first tried Virtual Reality. I became a huge fan of spatial painting apps like TiltBrush, spending hours sketching large-scale pieces.

However, one of the biggest challenges was using bulky controllers for extended periods. It was difficult to create smaller, precise sketches with the large motion controllers. I’ve often thought how incredible it would be if a stylus input device could be integrated into VR.

Some of my paintings created with bulky controllers on HTC Vive and Windows MR devices

Star Wars VR painting with Tilt Brush
Received Steam Community Award : )

Dream has Come True: Spatial Stylus Input Device for Meta Quest

Finally, the dream has come true. Logitech has introduced a new stylus device, MX Ink, that’s spatially tracked just like 6DoF controllers, allowing you to perform spatial drawing in 3D space with a much more familiar and comfortable pen-like form factor instead of bulky controllers.

In addition to its precision and spatial tracking, the stylus offers exciting features: a pressure-sensitive tip and buttons. With the pressure-sensitive middle button, you can control the thickness of brush strokes in real time.

Start Building with MX Ink Sample App

Logitech’s GitHub page provides comprehensive resources and tools for building apps with MX Ink.

The MX Ink Sample App provides a sample scene that demonstrates how to create a simple spatial drawing experience with stroke thickness controls. The project includes Meta XR Core SDK v68 through the package manager, which supports MX Ink.

Inside the SampleScene.unity scene, you can find the MX_Ink prefab and a basic camera rig with Passthrough capability configured through Building Blocks.

As described in the MX Ink Unity Integration documentation, you need to assign the MxInkActions.asset file under Edit > Project Settings > Meta XR > Input Actions to see the stylus device model. Otherwise, Quest controllers will be shown.

The MxInkActions.asset file defines the actions that the MX Ink device supports. To learn more about Input Actions, check out this documentation.

Brush Stroke Thickness with Pressure Sensitive Button

When you run SampleScene.unity, you can see line thickness vary based on the analog value of the middle button. It works just like the analog trigger on the controllers.

Mixed Reality + Pressure-Sensitive Tip

What does it mean to have a pressure-sensitive tip? By combining it with Quest’s Passthrough capability, you can draw virtual strokes on physical surfaces. In addition, since Quest’s Meta Horizon OS provides sophisticated Scene Understanding and environmental awareness, you can leverage physical surfaces in fully immersive experiences as well. Check out my other post ‘Building MR apps using physical surfaces has never been easier! How to use Meta Mixed Reality Utility Kit (MRUK)‘ to learn more about Scene Understanding.

The Drawing prefab includes the LineDrawing.cs script, which demonstrates how to create lines with Unity’s Line Renderer based on the input values.


float analogInput = Mathf.Max(
    _stylusHandler.CurrentState.tip_value,
    _stylusHandler.CurrentState.cluster_middle_value);

if (analogInput > 0 && _stylusHandler.CanDraw())
{
    if (!_isDrawing)
    {
        StartNewLine();
        _isDrawing = true;
    }
    AddPoint(_stylusHandler.CurrentState.inkingPose.position, _lineWidthIsFixed ? 1.0f : analogInput);
}

Inside the VrStylusHandler.cs script under the MX_Ink prefab, you can find how to retrieve the values from the stylus input, which is documented on this page.

OVRPlugin.GetActionStateFloat("tip", out stylus_tip_value);
OVRPlugin.GetActionStateBoolean("front", out bool stylus_front_button);
OVRPlugin.GetActionStateFloat("middle", out stylus_middle_value);
OVRPlugin.GetActionStateBoolean("back", out bool stylus_back_button);
OVRPlugin.GetActionStateBoolean("dock", out _stylus_docked);

In Update(), the sample polls these actions every frame and logs an error if retrieval fails:

void Update()
{
    OVRInput.Update();
    UpdatePose();

    if (!OVRPlugin.GetActionStateFloat(MX_Ink_TipForce, out _stylus.tip_value))
    {
        Debug.LogError($"MX_Ink: Error getting action name: {MX_Ink_TipForce}");
    }

    if (!OVRPlugin.GetActionStateFloat(MX_Ink_MiddleForce, out _stylus.cluster_middle_value))
    {
        Debug.LogError($"MX_Ink: Error getting action name: {MX_Ink_MiddleForce}");
    }

    if (!OVRPlugin.GetActionStateBoolean(MX_Ink_ClusterFront, out _stylus.cluster_front_value))
    {
        Debug.LogError($"MX_Ink: Error getting action name: {MX_Ink_ClusterFront}");
    }
    
    ...

Visualizing brush stroke

Unity’s LineRenderer provides a decent drawing experience, with stroke thickness varying based on the analog input values. However, due to its tape-like shape, it is not great for representing letter strokes when writing on 2D surfaces.

Setting the Alignment property to View makes it a little better, but it is still not great for surface writing.
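The same LineRenderer settings can also be tweaked in code. Here is a minimal sketch; the vertex counts are illustrative values, not taken from the sample:

```csharp
// Sketch: configuring a LineRenderer for surface writing (values are illustrative).
LineRenderer lineRenderer = gameObject.GetComponent<LineRenderer>();
lineRenderer.alignment = LineAlignment.View;   // billboard the line toward the camera
lineRenderer.numCapVertices = 8;               // round off stroke ends
lineRenderer.numCornerVertices = 8;            // smooth sharp corners
```

Increasing the cap and corner vertices softens the tape-like appearance somewhat, though a mesh-based stroke would ultimately give better results for handwriting.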

Button Visual Feedback

The sample app shows an example of button highlighting, which can improve interaction confidence. You can find the code in the VrStylusHandler.cs script. Based on the button values, it changes each button’s material color through its MeshRenderer.

_tip.GetComponent<MeshRenderer>().material.color = _stylus.tip_value > 0 ? active_color : default_color;
_cluster_front.GetComponent<MeshRenderer>().material.color = _stylus.cluster_front_value ? active_color : default_color;
_cluster_middle.GetComponent<MeshRenderer>().material.color = _stylus.cluster_middle_value > 0 ? active_color : default_color;

if (_stylus.cluster_back_value)
{
    _cluster_back.GetComponent<MeshRenderer>().material.color = active_color;
}
else
{
    _cluster_back.GetComponent<MeshRenderer>().material.color = _stylus.cluster_back_double_tap_value ? double_tap_active_color : default_color;
}

Haptic Feedback

The MX Ink Sample App also includes an example of haptic feedback. In VrStylusHandler.cs, you can find it in the PlayHapticClick() function, which uses OVRPlugin.TriggerVibrationAction().

private void PlayHapticClick(float analogValue, ref bool hasVibrated, OVRPlugin.Hand hand)
{
    if (analogValue >= _hapticClickMinThreshold)
    {
        if (!hasVibrated)
        {
            OVRPlugin.TriggerVibrationAction(MX_Ink_Haptic_Pulse, hand,
                _hapticClickDuration, _hapticClickAmplitude);
            hasVibrated = true;
        }
    }
    else
    {
        hasVibrated = false;
    }
}
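A possible call site would invoke this every frame with each analog value and a per-button "already vibrated" flag, so the click fires once per press. Note that the flag names below are my own assumptions, not necessarily the sample’s exact code:

```csharp
// Illustrative usage (flag names are assumptions): one flag per analog input
// so each button clicks once when it crosses the threshold.
PlayHapticClick(_stylus.tip_value, ref _tipClickVibrated, OVRPlugin.Hand.HandRight);
PlayHapticClick(_stylus.cluster_middle_value, ref _middleClickVibrated, OVRPlugin.Hand.HandRight);
```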

Adding Interactions with Meta XR Interaction SDK

In addition to drawing, your app will probably need various spatial interactions, such as grabbing and moving objects or interacting with UI. For example, you might want to allow the user to interact with UI for a color palette or brush types, or grab and manipulate 3D brush strokes. With its high precision, the stylus can also be used for selecting small objects in space.

Using MX Ink with Interaction SDK’s Comprehensive Rig

Meta XR Interaction SDK provides a comprehensive set of spatial interactions with support for various input modalities. Interaction SDK’s OVRCameraRigInteraction.prefab has all input modalities, including the camera, configured in a single prefab. Check out Interaction SDK’s rich interaction support in this article – How To Create Spatial Interactions with Meta XR Interaction SDK

To use MX Ink with OVRCameraRigInteraction.prefab, you can assign OVRCameraRig > OVRInteractionComprehensive > OVRControllers > Left/RightController to the VrStylusHandler script.

If you don’t assign these controller fields, you will see both stylus and controller models rendered together.

Adding Poke, Ray, Grab Interactors to MX Ink Stylus

To enable poke, ray, and grab interactions in your experiences, you need to add corresponding Interactors (on the input source) and Interactables (on the target object). We can do this easily by copying the existing ones already configured for controllers in OVRCameraRigInteraction.prefab.

Duplicate the ControllerInteractors object under OVRInteractionComprehensive > OVRControllers > RightController and move it into the MX_Ink object.

As you can see, the ControllerInteractors object contains all the crucial interactors, such as Poke, Ray, and Grab. Since I don’t intend to use locomotion and distance grab, I removed those interactors.

One thing we need to modify is the Selector for the Ray and Poke interactors. Since we would like to perform the select action with the stylus’s buttons, we need to provide a new selector script for the stylus.

For this, I simply duplicated ControllerSelector.cs, renamed it to StylusSelector.cs, and modified the script to retrieve stylus button states and use them as the selector.

In this example, I used _stylusHandler.CurrentState.cluster_front_value so the front button’s click acts as the selector.

        // Added this line to get VrStylusHandler.cs script
        [SerializeField]
        private StylusHandler _stylusHandler;

        ...
        
        protected virtual void Update()
        {
            // Modified line for stylus front button
            bool selected = _stylusHandler.CurrentState.cluster_front_value;

            if (selected)
            {
                if (_selected) return;
                _selected = true;
                WhenSelected();
            }
            else
            {
                if (!_selected) return;
                _selected = false;
                WhenUnselected();
            }
        }

Added StylusSelector.cs and disabled existing ControllerSelector.cs

Drag and drop the selector object into the Ray/Poke Interactor’s Selector field again. Make sure to select the StylusSelector script.

Now, with the Ray Interactor, you can see a ray with a cursor coming from the stylus tip. On a front-button click, the ray performs the select action and shows blue cursor feedback. Using this, you can interact with UI or manipulate objects. The Poke Interactor allows you to directly poke UI elements with the stylus tip, and the Grab Interactor allows you to directly grab objects.

Manipulating brush stroke objects with Grab Interactor/Interactables

To grab and move brush strokes created with LineRenderer, we need to set the LineRenderer’s useWorldSpace property to false. I modified the LineDrawing.cs script in Logitech’s MX Ink Sample app accordingly. Then I created an empty container object and added Interaction SDK’s Grab and Ray Grab interaction components to it. Newly created lines are simply added to this container so that they can be grabbed and manipulated.

    private void StartNewLine()
    {
        var gameObject = new GameObject("line");
        // Add to a container object with Interaction SDK's Grab Interactors
        gameObject.transform.SetParent(lineContainer.transform);
        
        ...

        LineRenderer lineRenderer = gameObject.AddComponent<LineRenderer>();
        
        ...
        // Updated line
        _currentLine.useWorldSpace = false;

Now new brush strokes are added to the container object, which is grabbable and movable. With GrabFreeTransformer, you can do two-handed scaling and rotation as well.

Interaction SDK’s GrabFreeTransformer setup for two-handed manipulation. You need to uncheck ‘Transfer On Second Selection’ to allow the object to be grabbed with both controllers/stylus.
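In the Inspector this is just a matter of adding the components and wiring the fields. For reference, here is a rough runtime sketch of the same setup, assuming the Oculus.Interaction APIs behave as in the SDK version I used (component and injection method names may vary between versions):

```csharp
using Oculus.Interaction;
using UnityEngine;

// Rough sketch: making the line container grabbable at runtime. In practice this
// is usually configured in the Inspector on the container object.
public class MakeLinesGrabbable : MonoBehaviour
{
    void Start()
    {
        var grabbable = gameObject.AddComponent<Grabbable>();
        var transformer = gameObject.AddComponent<GrabFreeTransformer>();

        // Use the same free transformer for one- and two-handed grabs so that
        // two-handed scale/rotate works; 'Transfer On Second Selection' stays off.
        grabbable.InjectOptionalOneGrabTransformer(transformer);
        grabbable.InjectOptionalTwoGrabTransformer(transformer);
    }
}
```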

Concurrent Tracking – Multimodal Input with Controller and Stylus

Most painting or productivity applications require extensive menus for various options and functionality. Typically, these are attached to one of the controllers, and the user interacts with them using the controller in the other hand.

The application can support mixed controller-and-stylus input, allowing the user to comfortably interact with controller-attached UI while drawing.

For this example, I modified one of the example menu UI patterns provided in Interaction SDK’s UI Set and attached it to the left controller.

Typical menu UI example that is attached to the left controller for quick and easy access. It works well with Ray and Poke Interactors that have been added to the stylus.

Stylus attached UI

Just like controller-attached UI, we can think about UI elements attached to the stylus device that provide crucial information or frequently accessed functionality. In this example, I attached a color palette UI that shows the currently selected color, and modified the rear button to switch colors when pressed.
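A minimal sketch of that back-button color switching follows; the palette array, swatch renderer, and edge-detection flag are my own illustrative names, not from the sample:

```csharp
// Sketch: cycle the palette color on a back-button press (names are illustrative).
private Color[] _palette = { Color.red, Color.green, Color.blue };
private int _paletteIndex;
private bool _backWasPressed;

void Update()
{
    bool backPressed = _stylusHandler.CurrentState.cluster_back_value;
    if (backPressed && !_backWasPressed)   // rising edge: fire once per press
    {
        _paletteIndex = (_paletteIndex + 1) % _palette.Length;
        _swatchRenderer.material.color = _palette[_paletteIndex];
    }
    _backWasPressed = backPressed;
}
```

Tracking the previous frame’s state is what turns the held button into a single discrete "press" event.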

Customizing the pressure curve for tip and middle button

Users can customize the sensitivity of the stylus tip and middle button through Quest’s Settings > Devices > Stylus page. By default, when you put the stylus onto a physical surface, it starts creating brush strokes just like a real pen. However, if you want to make it firmer – that is, requiring more pressure to start drawing (e.g., to prevent accidental, unwanted strokes) – you can adjust the Initial Activation Force curve.
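If you also want an app-side safeguard independent of the OS-level setting, a simple activation threshold can be applied to the raw tip value. This is only a sketch; the threshold constant is an illustrative value, not something from the sample app:

```csharp
// Sketch: ignore tip pressure below a threshold to avoid accidental strokes.
const float ActivationThreshold = 0.1f; // illustrative value
float tip = _stylusHandler.CurrentState.tip_value;
bool shouldDraw = tip > ActivationThreshold;

// Remap the remaining range back to 0..1 so full pressure still gives full width.
float width = shouldDraw ? (tip - ActivationThreshold) / (1f - ActivationThreshold) : 0f;
```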

Testing out different pressure sensitivity settings:
