How To: Unity VR Basics 2023 – Input System and Hands (XR Interaction Toolkit)

In this tutorial, we’ll learn how to use Unity’s Action Input System to connect our VR hands. I want to stress the importance of really understanding the Action Input System: with it, you’ll be able to connect hand animations to player input, but more importantly, you’ll be able to read input from your player for future actions as well.

The hands that I’ll be using for this tutorial are provided by Oculus and can be found here.

Importing Oculus Hands

Once we’ve downloaded the Unity Package, we just need to import it into our project. This is as easy as locating the package and dragging it into our project.

Going over the contents, let’s see what it comes with. 

  • Animations – This folder contains our Animation Controllers for our left and right hand as well as a few animations that we’ll be using later.
  • Material – A single hand material that is used for both hands.
  • Models – The original models for the hands.
  • Prefabs – The prefabs for our left and right hands. Each one already has an Animator attached to it.
  • Practice – This folder contains duplicate hand prefabs so we can recreate a fully working animated VR hand.

With that, let’s get our VR hands working!

Spawning Our Hands

Keeping things simple, we’re going to start by spawning our hands.

  • Select the LeftHand Controller found on the XR Origin
    • Find the XR Controller Component
      • At the bottom of the component, add the LeftHand Prefab to the empty Model Prefab slot


  • Select the RightHand Controller found on the XR Origin
    • Find the XR Controller Component
      • At the bottom of the component, add the RightHand Prefab to the empty Model Prefab slot

Now when we boot up the Scene, our hands will spawn and follow our controllers!

Animating Our Hands

In order to get our hands animating, we’re going to need FOUR things.

  • Animation Controller – The Animation Controller is typically used in conjunction with the Animator component, which is responsible for applying animations to a GameObject. The Animator component references an Animation Controller asset, which defines the various animation states and transitions.
  • Animator – The Animator is a component that controls the playback and blending of animations for a character or object. It is a key component of Unity’s animation system and works in conjunction with the Animation Controller.
  • Animation Clips – An Animation Clip is a container for defining the keyframes and properties that make up an animation. It contains information about how a specific property should change over time. For example, you can create an Animation Clip to define how a character’s arm moves during an attack or how an object rotates. 
  • Script – We will need at least one script that reads input from the controllers and translates it for the Animator.

Animator, Animation Controller and Animation Clips

These three are where we’re going to start. As you can see, I’ve already provided completed versions of these that we’ll be using. For practice, I advise trying to replicate all of them so you’ll be able to do it in the future.

The only portion that I will not cover in this tutorial is Animation Clips. I’ve covered them before, but it seemed a bit too much to fit into a single tutorial like this one.

If we select one of the Hand prefabs, we can see an Animator attached to it. This animator already has the corresponding Animation Controller needed. It also uses an Avatar that is constructed using the model given to us by Oculus.

Going to Unity VR Basics 2023 Hands -> Oculus Hands -> Animations, we can double-click the LeftHand Controller to open the Animator window.

Inside there is already a Blend Tree that I’ve created. Below I will explain how everything would work if we were to create a new Animation Controller from scratch.

If we wanted to create one from scratch with a new Animation Controller, we would right-click in the window and go to Create State -> From New Blend Tree.

Double-click the Blend Tree that was created to open it up.


In the top left, click on Parameters and click the plus sign to add a new float naming it Grip. Add another float and name it Trigger. These two values will be used to store the values of our grip and trigger buttons from our controllers.

With the Blend Tree still selected, look for the drop-down menu in the Inspector that says 1D and choose 2D Freeform Cartesian instead.

Still in the Inspector, for the first Parameter choose Grip and for the second choose Trigger.

Under Motion, click the plus sign and choose Add Motion Field. Do this until you have four of them. Give them the following positions, in this order: (0, 0), (0, 1), (1, 0), (1, 1). Your Inspector should now show four motion fields.

These four entries represent the four states our buttons can be in: neither pressed, only the trigger pressed, only the grip pressed, or both pressed.

Next we need to attach animations to the empty Motion slots. (0, 0) means nothing is pressed, so we’ll put Default there. (0, 1) is the trigger pulled, so we’ll use the Pinch animation. (1, 0) is the grip button, so the Fist will work. Last is (1, 1), which we’ll also use Fist for.
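As an aside, the same Blend Tree setup can also be built from an editor script instead of clicking through the Animator window. Here’s a rough sketch; the menu path, asset paths, and clip file names are all illustrative rather than the package’s actual layout:

```csharp
using UnityEditor;
using UnityEditor.Animations;
using UnityEngine;

// Editor-only sketch: builds the same 2D Freeform Cartesian blend tree in code.
public static class HandControllerBuilder
{
    [MenuItem("Tools/Build Hand Animator Controller")] // illustrative menu path
    private static void Build()
    {
        var controller = AnimatorController.CreateAnimatorControllerAtPath(
            "Assets/LeftHandController.controller"); // illustrative asset path

        controller.AddParameter("Grip", AnimatorControllerParameterType.Float);
        controller.AddParameter("Trigger", AnimatorControllerParameterType.Float);

        BlendTree tree;
        controller.CreateBlendTreeInController("Hand Blend Tree", out tree);
        tree.blendType = BlendTreeType.FreeformCartesian2D;
        tree.blendParameter = "Grip";
        tree.blendParameterY = "Trigger";

        // Illustrative clip paths: use the Default, Pinch, and Fist clips from the package.
        var defaultClip = AssetDatabase.LoadAssetAtPath<AnimationClip>("Assets/Default.anim");
        var pinchClip = AssetDatabase.LoadAssetAtPath<AnimationClip>("Assets/Pinch.anim");
        var fistClip = AssetDatabase.LoadAssetAtPath<AnimationClip>("Assets/Fist.anim");

        tree.AddChild(defaultClip, new Vector2(0f, 0f)); // nothing pressed
        tree.AddChild(pinchClip, new Vector2(0f, 1f));   // trigger only
        tree.AddChild(fistClip, new Vector2(1f, 0f));    // grip only
        tree.AddChild(fistClip, new Vector2(1f, 1f));    // both pressed
    }
}
```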


With that, the three main components are now ready to be fed information from our controllers. The last step is to use a script to tie it all together.

Before I get into the script, I want to give you this bonus section explaining Unity’s Action Input System. If you’re not familiar with it, I suggest reading the next section as well as experimenting with it on your own so you can understand it better.

Unity’s Action Input System

In 2018, Unity introduced the Action Input System, which has since become the standard input system for VR development. Unlike the legacy Input Manager, which has to be polled every frame, it operates on events, meaning it triggers actions only when the corresponding input actually arrives. It is also action-based, allowing multiple input devices or keys to be mapped to a single action. This eliminates the need to write code for each specific controller available on the VR market.
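To make “event-based” concrete, here is a minimal sketch of subscribing to an action instead of polling; the JumpExample class and jumpAction field are hypothetical, and you’d wire the reference up in the Inspector:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Sketch: react to an action's events instead of polling a button each frame.
public class JumpExample : MonoBehaviour
{
    // Hypothetical field: drag any Button-type action in here.
    [SerializeField] private InputActionReference jumpAction;

    private void OnEnable()
    {
        jumpAction.action.Enable();
        jumpAction.action.performed += OnJump; // fires only when the input happens
    }

    private void OnDisable()
    {
        jumpAction.action.performed -= OnJump;
        jumpAction.action.Disable();
    }

    private void OnJump(InputAction.CallbackContext context)
    {
        Debug.Log("Jump pressed!");
    }
}
```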

To provide a clearer understanding of the Input System, let’s walk through a simple example. Imagine we want to implement movement for a 2D game using the Action Input System.

To begin, while in the Assets folder, follow these steps: Right Click -> Create -> Input Actions.

Name it PracticeControls and double-click it. This will bring up the Input Actions window, where we’ll build an Action Map.


Tip – in the top right, click on Auto-Save. This isn’t 100% necessary, but neglecting it has cost me a significant amount of work in the past.

First, let’s create an Action Map. Click on the plus sign and give it the name ‘Default’. Action maps serve as a way to group actions that handle player input.

Next, add an action by clicking on the plus sign in the center. Name this action ‘Move’. When you click on the action, you’ll notice that it has Action Properties which can be customized based on the desired functionality of the action.

For this example, we’re going to focus on Button and Value. Button will simply determine if a button is pressed or not. Value will return a value depending on what control type we give it. For now, select Value and for Control Type, select Vector 2.

Now under Move, delete the <No Binding>. Click the plus arrow on Move and choose “Add Up/Down/Left/Right Composite”.

Click “Up” and then click “Path” on the right-hand side. Type “W” in the search bar and select W [Keyboard]. Repeat this for Down, Left, and Right using the S, A, and D keys.

Click the plus on Move again and select “Add Up/Down/Left/Right Composite” once more. Repeat the steps from before, using the arrow keys as inputs instead.

Just like that, we have multiple mappings for the same Action for our game. Whenever one of those buttons is pressed, it will send back values that we can use in our code for movement.
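Here’s a rough sketch of how we could read those values in a script; the MoveExample class and moveAction field are hypothetical, and you’d drag the Default/Move action into the slot:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Sketch: read the Move composite as a single Vector2, regardless of
// whether WASD or the arrow keys produced it.
public class MoveExample : MonoBehaviour
{
    // Hypothetical field: points at the Default/Move action.
    [SerializeField] private InputActionReference moveAction;

    private void OnEnable() => moveAction.action.Enable();
    private void OnDisable() => moveAction.action.Disable();

    private void Update()
    {
        Vector2 move = moveAction.action.ReadValue<Vector2>();
        transform.Translate(new Vector3(move.x, move.y, 0f) * Time.deltaTime);
    }
}
```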

We could continue to dive deeper into this system, but I think we have a good understanding of how it works.

Now how do we tie this into VR?

XRI Default Input Actions

Mapping out every input button from our VR devices would be pretty tedious. Luckily, we already have the XRI Default Input Actions. To find them, navigate to Samples -> XR Interaction Toolkit -> 2.3.XX -> Default Input Actions.


This is just a more complex version of the simple movement system created in the previous section. I’ve brought this up because this is where we will be getting our controller information from. We will need to reference inputs from this Action Map in our scripts.

Selecting XRI LeftHand Interaction, we can see a few Actions available to us. 


There are both Select and Select Value. The big difference between the two is that Select Value uses a Value action type that gives us a range between 0 and 1 for how much the button is pressed. Select uses a Button action type that only gives us a 0 or 1.

Since we’re transitioning between different phases of the animation and don’t want it to just snap into place, we need to use Select Value. 
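In code, the difference between the two would look something like this sketch, where selectAction and selectValueAction are hypothetical references pointing at Select and Select Value:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class SelectComparison : MonoBehaviour
{
    // Hypothetical references: point these at Select and Select Value.
    [SerializeField] private InputActionReference selectAction;
    [SerializeField] private InputActionReference selectValueAction;

    private void Update()
    {
        // Select (Button action type): a simple pressed / not pressed check.
        bool isSelecting = selectAction.action.IsPressed();

        // Select Value (Value action type): a float from 0 to 1 for how far
        // the grip is squeezed, which is what smooth blending needs.
        float gripAmount = selectValueAction.action.ReadValue<float>();

        Debug.Log($"Select: {isSelecting}, Select Value: {gripAmount:F2}");
    }
}
```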

We’re also going to use Activate Value to grab our trigger value. Since the trigger is pulled with the index finger, it’s a natural fit for driving our pinch animation.

Let’s connect this with some code.

Scripting Glue

Let’s get to scripting! This tutorial’s gone on long enough!

  • Select the LeftHand prefab
    • Add Component -> New Script and call it ‘AnimateHandController’


This is the script I came up with to use our input data.
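A minimal sketch of it, assuming the two serialized fields are named gripValue and triggerValue to match the slots we’ll wire up shortly, looks like this:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Force the hand prefab to have an Animator component.
[RequireComponent(typeof(Animator))]
public class AnimateHandController : MonoBehaviour
{
    // References to the grip and trigger input actions, wired up in the editor.
    [SerializeField] private InputActionReference gripValue;
    [SerializeField] private InputActionReference triggerValue;

    private Animator _handAnimator;

    private void Start()
    {
        // Grab the Animator attached to this hand prefab.
        _handAnimator = GetComponent<Animator>();
    }

    private void Update()
    {
        // Update the Grip and Trigger parameters every frame.
        AnimateGrip();
        AnimateTrigger();
    }

    private void AnimateGrip()
    {
        // Pull the current value (0 to 1) out of the grip action and
        // feed it to the Blend Tree's Grip parameter.
        _handAnimator.SetFloat("Grip", gripValue.action.ReadValue<float>());
    }

    private void AnimateTrigger()
    {
        // Same idea for the trigger and the Trigger parameter.
        _handAnimator.SetFloat("Trigger", triggerValue.action.ReadValue<float>());
    }
}
```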

  • RequireComponent – This simply forces the prefab to have an Animator component.
  • InputActionReference – These are references to our grip and trigger input actions.
  • Start() – The Start function simply retrieves the Animator component.
  • Update() – This is called every frame and updates the values for the Grip and Trigger.
  • AnimateGrip()/AnimateTrigger() – These functions go into the action of each InputActionReference and pull out its current float value. They then set that float value on the Animator.

Last, we just need to wire up the InputActionReferences in the editor.

In those empty slots, we just need to put XRI LeftHand Interaction/Select Value in the GripValue slot and XRI LeftHand Interaction/Activate Value in the TriggerValue slot.

Once that’s done, we just need to add the script to the RightHand prefab and attach the corresponding Input Action References, and we should be set!

Conclusion

Booting up the scene, we should now have hands that animate when we press our grip and trigger buttons!