Do you feel that? If you don’t, then your VR game is lacking something vital! We’re talking about haptics and how to implement them!
VR is meant to be a completely immersive experience, and if you’re not notifying the player of things happening in the game through haptics, you’re leaving out one of the easiest ways to enhance how immersive your game can be.
A free VR template can be found here!
Types of Haptic Feedback
For simplicity’s sake, I’ve divided haptic feedback into three sections.
- Direct Interactor Haptic Feedback – This is done through our Direct Interactor component and is perfect for when our users are touching or able to interact with objects in our game.
- Ray Interactor Haptic Feedback – These are triggered through our Ray Interactor component and are usually associated with UI interactions.
- Scriptable Haptic Feedback – Not all haptic events will come through our Ray or Direct Interactors; scripting will be required to trigger this type of event.
In this tutorial, I’ll go over how we can use each of these types of Haptic Feedback in order to fully immerse our players into the game.
Direct Interactor and Ray Haptic Feedback
This is probably the most straightforward application of haptic feedback that we can give our users. When the player is going to interact with an object, the Direct Interactor can send a haptic signal with a certain intensity and duration to the controller.
The Ray Interactor has the exact same haptic events available. If you interact with objects like Grab Interactables from afar, these will trigger the haptic events.
Let’s look at the different Haptic Events for the XR Direct Interactor.
Select Entered and Select Exited
These events are pivotal in XR applications for interacting with virtual objects. “Select Entered” is triggered when a user initiates an interaction with an object, for example, by pressing a button on a handheld controller while pointing at the object. This event often starts the process of grabbing or using the item. Conversely, “Select Exited” occurs when the interaction is ended, such as releasing the button, signifying the object is no longer being directly interacted with.
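If you’d rather wire these up from code than through the Inspector, here’s a minimal sketch of the idea, assuming XR Interaction Toolkit 2.x (the SelectHaptics class name and the intensity/duration values are just examples):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: vibrate the controller when this Direct Interactor grabs or releases an object.
public class SelectHaptics : MonoBehaviour
{
    [SerializeField] XRDirectInteractor interactor;

    void OnEnable()
    {
        interactor.selectEntered.AddListener(OnSelectEntered);
        interactor.selectExited.AddListener(OnSelectExited);
    }

    void OnDisable()
    {
        interactor.selectEntered.RemoveListener(OnSelectEntered);
        interactor.selectExited.RemoveListener(OnSelectExited);
    }

    void OnSelectEntered(SelectEnterEventArgs args)
    {
        // Stronger pulse when grabbing something.
        interactor.xrController.SendHapticImpulse(0.7f, 0.2f);
    }

    void OnSelectExited(SelectExitEventArgs args)
    {
        // Softer pulse when letting go.
        interactor.xrController.SendHapticImpulse(0.3f, 0.1f);
    }
}
```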
Hover Entered and Hover Exited
“Hover Entered” and “Hover Exited” are similar to the select events but are triggered by the mere presence of an interaction pointer (like a hand or a controller) near an interactable object, without requiring a press or grab action. “Hover Entered” happens when an object first becomes the focus of the user’s interaction pointer, useful for highlighting or indicating that an object can be interacted with. “Hover Exited” occurs when the interaction pointer moves away, and the object is no longer the immediate focus of interaction, often used to remove highlights or indicators.
Hover Canceled
This event might be less commonly referenced, but it implies an interruption in the hover state that isn’t a natural transition out of it. For example, the system might suddenly lose tracking of the controller or hand, or another system event might take precedence, requiring the hover interaction to be prematurely terminated.
I should also mention some settings available to us.
- Intensity – This will control how aggressive the vibration is for our controllers. Typically, you’ll want a lower intensity when hovering and a higher intensity when selecting something.
- Duration – This is the length of the haptic in seconds.
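Those same Inspector settings can also be set from a script. Here’s a minimal sketch, assuming XR Interaction Toolkit 2.x, where XRBaseControllerInteractor exposes its haptic event properties (the HapticDefaults class name and the values are just examples):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: a "soft hover, strong select" haptic profile applied to an interactor at startup.
public class HapticDefaults : MonoBehaviour
{
    [SerializeField] XRBaseControllerInteractor interactor;

    void Start()
    {
        // Low intensity while hovering over an interactable.
        interactor.playHapticsOnHoverEntered = true;
        interactor.hapticHoverEnterIntensity = 0.2f;
        interactor.hapticHoverEnterDuration = 0.05f;

        // Higher intensity when actually selecting it.
        interactor.playHapticsOnSelectEntered = true;
        interactor.hapticSelectEnterIntensity = 0.7f;
        interactor.hapticSelectEnterDuration = 0.2f;
    }
}
```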
Scriptable Haptic Feedback
These can be used for Grab Interactables, Menu Interactions, or whatever interaction you would like! For this tutorial, let’s set up haptic events for Menu Interactions, since we’ve already done Grab Interactables.
Let’s start by changing our Ray Interactors to only interact with UI components. We just need to change the Interaction Layer Mask to UI.
Just when you thought it would be easy, the Unity XR Interaction Toolkit doesn’t provide any Interactables for our UI components that can trigger haptic events. In order to do that, we’re going to have to dive into some code.
Before we do that, I’m going to have you make a simple UI canvas in world space and add a button to it. There are plenty of tutorials outside of this one that can walk you through this process, but I’ve posted an image of mine as an example.
With your menu and button set, we’re going to want to add a new script to the button. This script can be used for any UI element that we want to produce haptic events.
Call the script XRUIHapticFeedback and add it to your button now!
HapticSettings Class
- Defined as [System.Serializable], allowing it to be visible and configurable in the Unity Inspector. This class holds settings related to haptic feedback, including:
- active – A boolean indicating if haptic feedback should be given.
- intensity – A float representing the strength of the haptic feedback, constrained between 0 and 1.
- duration – A float indicating how long the haptic feedback lasts.
XRUIHapticFeedback Class
- Implements several interfaces from UnityEngine.EventSystems to react to pointer events:
- IPointerEnterHandler, IPointerExitHandler – These handle the cursor (or the ray in VR) entering or leaving a UI element, respectively.
- IPointerDownHandler, IPointerUpHandler – These handle pressing and releasing a click or selection action on a UI element.
Public HapticSettings Instances
- Defines specific HapticSettings for the different interaction events: OnHoverEnter, OnHoverExit, OnSelectEnter, and OnSelectExit. Each of these can be individually configured in the Unity Inspector to control the haptic feedback for hover and select actions.
InputModule Property
- A property that dynamically casts the current input module to XRUIInputModule, enabling access to XR-specific input handling. This casting is necessary to work with the XR Interaction Toolkit’s input system, allowing the script to retrieve the interactor (e.g., XR controller) associated with a given UI event.
Event Handling Methods
- OnPointerEnter, OnPointerExit, OnPointerDown, and OnPointerUp are implemented to respond to their respective UI events. Each method checks if the corresponding HapticSettings is active and, if so, calls TriggerHaptic with the event data and settings.
TriggerHaptic Method
- A private method that takes PointerEventData and HapticSettings as arguments. It attempts to find the XRRayInteractor associated with the event and, if successful, uses it to send a haptic impulse based on the provided settings.
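Putting all of those pieces together, here’s a sketch of what the full script might look like. I’m assuming XR Interaction Toolkit 2.x here, where XRUIInputModule exposes GetInteractor and the Ray Interactor’s controller is reachable through xrController; the default values and null checks are my own additions, so tweak them to taste.

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.XR.Interaction.Toolkit;
using UnityEngine.XR.Interaction.Toolkit.UI;

[System.Serializable]
public class HapticSettings
{
    public bool active = true;      // Should this event trigger haptics at all?
    [Range(0f, 1f)]
    public float intensity = 0.5f;  // Vibration strength, 0 to 1.
    public float duration = 0.1f;   // Length of the vibration in seconds.
}

public class XRUIHapticFeedback : MonoBehaviour,
    IPointerEnterHandler, IPointerExitHandler,
    IPointerDownHandler, IPointerUpHandler
{
    // One configurable block per UI event.
    public HapticSettings OnHoverEnter = new HapticSettings();
    public HapticSettings OnHoverExit = new HapticSettings();
    public HapticSettings OnSelectEnter = new HapticSettings();
    public HapticSettings OnSelectExit = new HapticSettings();

    // The XR Interaction Toolkit's UI input module, if that's what is currently driving the EventSystem.
    XRUIInputModule InputModule =>
        EventSystem.current != null
            ? EventSystem.current.currentInputModule as XRUIInputModule
            : null;

    public void OnPointerEnter(PointerEventData eventData)
    {
        if (OnHoverEnter.active) TriggerHaptic(eventData, OnHoverEnter);
    }

    public void OnPointerExit(PointerEventData eventData)
    {
        if (OnHoverExit.active) TriggerHaptic(eventData, OnHoverExit);
    }

    public void OnPointerDown(PointerEventData eventData)
    {
        if (OnSelectEnter.active) TriggerHaptic(eventData, OnSelectEnter);
    }

    public void OnPointerUp(PointerEventData eventData)
    {
        if (OnSelectExit.active) TriggerHaptic(eventData, OnSelectExit);
    }

    void TriggerHaptic(PointerEventData eventData, HapticSettings settings)
    {
        if (InputModule == null)
            return;

        // Ask the XRUIInputModule which interactor generated this UI event,
        // then send the impulse through that interactor's controller.
        var rayInteractor = InputModule.GetInteractor(eventData.pointerId) as XRRayInteractor;
        if (rayInteractor != null && rayInteractor.xrController != null)
            rayInteractor.xrController.SendHapticImpulse(settings.intensity, settings.duration);
    }
}
```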
What we’ve essentially done with this script is take the normal UI events, like PointerEnter and PointerExit, and map them to our own versions, OnHoverEnter and OnHoverExit. Whenever one of these events happens, we ask the EventSystem (and more specifically the XRUIInputModule) which controller performed the action, and send a vibration impulse to the correct controller! Simple!
Conclusion
We could do more scripting and more interactions, but I think you have a solid idea now of haptic events and how to use them.
I hope it helped, and I’ll see you in the next one!