Unity VR Basics 2023 – XR Hands

One of the most immersive experiences we can have in VR is with hand tracking. With the Quest 2 improving its hand tracking abilities, it’s time we talk about XR Hands!

Installing XR Hands

In order to use XR Hands with hand tracking, we’re going to need to install the XR Hands package and also add it to OpenXR in Project Settings.

  • Window -> Package Manager
    • Select ‘Unity Registry’ from the top left
    • Search ‘XR Hands’
    • Install ‘XR Hands’
    • Select XR Hands
      • Select the Samples tab
        • Import ‘HandVisualizer’
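
If you prefer editing files directly, the package half of the install can also be expressed in your project’s Packages/manifest.json (only the added entry is shown here — the version is the one used in this post, so match whatever the Package Manager offers you). Note the HandVisualizer sample still has to be imported through the Package Manager UI:

```json
{
  "dependencies": {
    "com.unity.xr.hands": "1.1.0"
  }
}
```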

With the XR Hands package installed, let’s make sure our OpenXR settings include hand tracking.

  • Edit -> Project Settings -> XR Plug-in Management -> OpenXR
    • Under the PC tab
      • Hand Tracking Subsystem = True
      • Meta Hand Tracking Aim = True
    • Under the Android tab
      • Hand Tracking Subsystem = True
      • Meta Hand Tracking Aim = True
      • Meta Quest Support = True

With that, we’re now ready to implement some hand tracking!
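
If you want to double-check at runtime that those settings actually took effect, a small sanity-check script can ask Unity for the hand subsystem. This is a hypothetical helper (the class name is my own), but `SubsystemManager.GetSubsystems` and `XRHandSubsystem` come straight from the XR Hands package:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Hypothetical helper: logs whether an XRHandSubsystem is up and running,
// which confirms the Hand Tracking Subsystem feature was enabled in OpenXR.
public class HandTrackingCheck : MonoBehaviour
{
    void Start()
    {
        var subsystems = new List<XRHandSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);

        if (subsystems.Count == 0)
            Debug.LogWarning("No XRHandSubsystem found - check your OpenXR features.");
        else
            Debug.Log($"Hand tracking subsystem running: {subsystems[0].running}");
    }
}
```

Drop it on any GameObject in the scene and watch the console on startup.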

The Hand Visualizer

If our hands are going to appear in the scene, we’re going to need some representation of them.

  • Expand XR Origin
    • Under Camera Offset
      • Add an Empty Game Object, call it ‘Hand Visualizer’, and select it
        • Add Component ‘Hand Visualizer’
          • XR Origin = XR Origin (From our scene)
          • Left Hand Mesh = LeftHand (Samples -> XR Hands -> 1.1.0 -> HandVisualizer -> Models)
          • Right Hand Mesh = RightHand (Samples -> XR Hands -> 1.1.0 -> HandVisualizer -> Models)
            • DO NOT USE A DIFFERENT MODEL. IT WILL BREAK.
          • Hand Mesh Material = Choose whatever you like
          • Draw Mesh = True
          • Debug Draw Prefab = Joint (Samples -> XR Hands -> 1.1.0 -> HandVisualizer -> Prefabs)
          • Debug Draw Joints = True
          • Velocity Prefab = VelocityPrefab (Samples -> XR Hands -> 1.1.0 -> HandVisualizer -> Prefabs)
          • Velocity Type = None
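
Under the hood, the Hand Visualizer reads per-joint data from the `XRHandSubsystem`. If you ever want to read that data yourself — say, to track the tip of the index finger — a sketch along these lines works (the class name and polling approach are my own; `XRHand`, `XRHandJoint`, and `TryGetPose` are from the XR Hands package, and poses are reported in XR Origin space):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Hypothetical example: read the right hand's index-tip pose every frame.
public class IndexTipTracker : MonoBehaviour
{
    XRHandSubsystem m_Subsystem;

    void Update()
    {
        // Lazily find the subsystem once it has started.
        if (m_Subsystem == null)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count > 0)
                m_Subsystem = subsystems[0];
            return;
        }

        XRHand rightHand = m_Subsystem.rightHand;
        if (!rightHand.isTracked)
            return;

        XRHandJoint indexTip = rightHand.GetJoint(XRHandJointID.IndexTip);
        if (indexTip.TryGetPose(out Pose pose))
            Debug.Log($"Right index tip at {pose.position}");
    }
}
```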

With this script, we should be able to boot up the scene and have functioning hand tracking! Just make sure to turn on hand tracking on your device before testing and spending an hour trying to debug something. Not that I did that or anything…

One problem you might notice is that our controllers and XR hands are appearing together. It would be a much better user experience if our controllers/controller hands disappeared when hand tracking is active. Let’s fix that!

XR Input Modality Manager

First we need to update our XR Hands.

  • Select the Camera Offset
    • Add Empty Game Object called ‘XR Hands’ and select it
      • Add Empty Game Object called ‘XR Left Hand’
      • Add Empty Game Object called ‘XR Right Hand’
      • Drag the Hand Visualizer as a child of the XR Hands

Luckily for us, the XR Interaction Toolkit provides a script called the XR Input Modality Manager. This script switches between controllers and hand tracking based on what the VR device is currently tracking.

  • Select the XR Origin
    • Add Component ‘XR Input Modality Manager’
      • Left Hand = XR Left Hand
      • Right Hand = XR Right Hand
      • Left Controller = LeftHand Controller
      • Right Controller = RightHand Controller

With that complete, we should be able to boot up the scene and switch between our hands and controllers!
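
For reference, the same inspector wiring can also be done from a script. This is only a sketch — the serialized field names are my own, and it assumes the `leftHand`/`rightHand`/`leftController`/`rightController` GameObject properties exposed by `XRInputModalityManager` in the XR Interaction Toolkit:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit.Inputs;

// Hypothetical: wire up the XR Input Modality Manager in code instead of
// dragging references in the inspector. Attach next to the manager on XR Origin.
public class ModalitySetup : MonoBehaviour
{
    [SerializeField] GameObject m_XRLeftHand;
    [SerializeField] GameObject m_XRRightHand;
    [SerializeField] GameObject m_LeftController;
    [SerializeField] GameObject m_RightController;

    void Awake()
    {
        var modality = GetComponent<XRInputModalityManager>();
        modality.leftHand = m_XRLeftHand;
        modality.rightHand = m_XRRightHand;
        modality.leftController = m_LeftController;
        modality.rightController = m_RightController;
    }
}
```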

XR Grabbing

In order to grab objects with our XR Hands, we’ll need to add an XR Direct Interactor like before, but with a few specific steps.

  • Select XR Left Hand
    • Create an empty object called ‘Direct Interactor’ and select it
      • Add Component ‘XR Direct Interactor’
      • Add Component ‘Sphere Collider’
        • Is Trigger = True
        • Radius = 0.05
      • Add Component ‘XR Controller’
        • In the top right, select the middle icon and choose ‘XRI Default Left Controller’
        • For Position Action, choose the reference to be ‘XRI LeftHand/Pinch Position’

We chose Pinch Position as the position for our XR Direct Interactor because it feels the most natural. When we start to use our XR Hands, it will place the XR Direct Interactor right at the tip of our pointer finger. When we go to pinch an object, it will interact with it!

One bonus that makes our interactions with Grab Interactables feel right is to set Use Dynamic Attach to True on the Grab Interactable. This will allow us to grab objects exactly where we pinch them.
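
If you have lots of Grab Interactables in a scene, a quick helper can flip that setting on for all of them at once. A hypothetical sketch (the class name is my own; `XRGrabInteractable.useDynamicAttach` is the XR Interaction Toolkit property behind the inspector checkbox):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical: enable Dynamic Attach on every Grab Interactable in the scene
// so grabbed objects stay exactly where they were pinched.
public class EnableDynamicAttach : MonoBehaviour
{
    void Start()
    {
        foreach (var grab in FindObjectsOfType<XRGrabInteractable>())
            grab.useDynamicAttach = true;
    }
}
```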

Words of Caution and Conclusion

The XR Hands package is still very new and very very buggy. My personal experience has been pretty frustrating and confusing throughout researching this post. If you’re planning on using this in your project, you may want to consider waiting until it is further developed. 

It’s still amazing and fun to work with in prototypes, but I think it’s still a ways off from being ready for any commercial product.