Unity: Touch Controls

When you’re making a mobile game, this functionality is essential for having your player interact with your game. I don’t want to waste your time, so I’m going to run through some of the examples found in the Unity Manual and explain what they are and how they work.

Side Note: For my game, I actually ended up using functions like OnMouseDown() and OnMouseMove() for touch controls since I only intend the player to move one Object at a time. You can find that article here.

If you need multiple touch controls, this article should help.

Touch Began

Let's start by looking at the code provided by Unity in the manual.

The manual page is linked here, if you want to read more.

using UnityEngine;

public class TouchInput : MonoBehaviour
{
    GameObject particle;

    void Update()
    {
        foreach(Touch touch in Input.touches)
        {
            if (touch.phase == TouchPhase.Began)
            {
                // Construct a ray from the current touch coordinates
                Ray ray = Camera.main.ScreenPointToRay(touch.position);
                if (Physics.Raycast(ray))
                {
                    // Create a particle if hit
                    Instantiate(particle, transform.position, 
                                transform.rotation);
                }
            }
        }
    }
}

Notice that it all happens in the Update() function. In order to detect touches and act on them, the check needs to run every frame.

In the above example, the foreach loop iterates through each touch Unity detected that frame. It then checks whether the current touch is in the TouchPhase.Began state. This is the first phase of a Touch in Unity, triggered when the player first puts their finger on the screen.

From there it uses Camera.main.ScreenPointToRay(touch.position). It does this because the Touch position is given in screen pixels while the game world operates in world units. Storing the result in a Ray captures that pixel-to-world conversion in a form we can hand straight to Physics.Raycast().

It finishes off by producing a fun little particle effect. One thing worth noting: Instantiate() here uses transform.position and transform.rotation, so the particle spawns at the script's own Transform, not at the point we touched.
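If you'd rather have the particle appear at the exact point the ray hits, here's a hedged sketch using the Physics.Raycast overload that returns hit details through an out RaycastHit. The class name and the idea of facing the particle along the surface normal are my own choices, not part of the manual example:

```csharp
using UnityEngine;

public class TouchSpawnAtHit : MonoBehaviour
{
    // Assign a particle prefab in the Inspector
    public GameObject particle;

    void Update()
    {
        foreach (Touch touch in Input.touches)
        {
            if (touch.phase == TouchPhase.Began)
            {
                Ray ray = Camera.main.ScreenPointToRay(touch.position);

                // The out parameter fills in details about what the ray hit
                if (Physics.Raycast(ray, out RaycastHit hit))
                {
                    // Spawn the particle at the point of contact,
                    // oriented along the surface normal
                    Instantiate(particle, hit.point,
                                Quaternion.LookRotation(hit.normal));
                }
            }
        }
    }
}
```

hit.point is the world-space position where the ray made contact, which is usually what you want for touch feedback effects.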

Touch Moved & Touch Ended

But wait! There is more in the Unity manual!

using UnityEngine;

public class TouchDirection : MonoBehaviour
{
    public Vector2 startPos;
    public Vector2 direction;
    public bool directionChosen;
    void Update()
    {
        // Track a single touch as a direction control.
        if (Input.touchCount > 0)
        {
            Touch touch = Input.GetTouch(0);

            // Handle finger movements based on touch phase.
            switch (touch.phase)
            {
                // Record initial touch position.
                case TouchPhase.Began:
                    startPos = touch.position;
                    directionChosen = false;
                    break;

                // Determine direction by comparing the current touch 
                // position with the initial one.
                case TouchPhase.Moved:
                    direction = touch.position - startPos;
                    break;

                // Report that a direction has been chosen when the finger 
                // is lifted.
                case TouchPhase.Ended:
                    directionChosen = true;
                    break;
            }
        }
        if (directionChosen)
        {
            // Something that uses the chosen direction...
        }
    }
}

So in this example, Unity grabs the first Touch detected with Input.GetTouch(0), then uses a switch statement on touch.phase to handle each phase differently.

Note: You can grab other Touches by passing different indices. Input.GetTouch(1) or Input.GetTouch(2) would get the second and third touches, for example.
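To see two indices in action, here's a hedged sketch of a classic pinch-to-zoom gesture. The zoomSpeed value, the FOV clamp range, and driving Camera.main.fieldOfView are my own assumptions, not something from the manual example:

```csharp
using UnityEngine;

public class PinchZoom : MonoBehaviour
{
    // Tune in the Inspector; assumed value
    public float zoomSpeed = 0.05f;

    void Update()
    {
        if (Input.touchCount == 2)
        {
            Touch touchZero = Input.GetTouch(0);
            Touch touchOne = Input.GetTouch(1);

            // Where each finger was one frame ago
            Vector2 touchZeroPrev = touchZero.position - touchZero.deltaPosition;
            Vector2 touchOnePrev = touchOne.position - touchOne.deltaPosition;

            // How much the gap between the fingers changed this frame
            float prevMagnitude = (touchZeroPrev - touchOnePrev).magnitude;
            float currentMagnitude = (touchZero.position - touchOne.position).magnitude;
            float difference = currentMagnitude - prevMagnitude;

            // Spreading fingers apart narrows the FOV (zooms in)
            Camera.main.fieldOfView = Mathf.Clamp(
                Camera.main.fieldOfView - difference * zoomSpeed, 20f, 80f);
        }
    }
}
```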

From here, the function determines what phase the first Touch is in.

TouchPhase.Moved : In this phase, Unity has already detected a Touch and is now recording its movement as the finger drags across the screen. In the example above, it's used to build a direction for later use by subtracting the start position from the current position.

TouchPhase.Ended : This phase happens once a finger has been lifted, signaling the end of that finger's interaction with the screen. In the example, it sets a bool to true, allowing the function to enter its final ‘if’ statement.
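As one hedged example of "something that uses the chosen direction", the body of that final 'if' could classify the drag as a horizontal or vertical swipe. The log messages and the reset at the end are my own additions:

```csharp
if (directionChosen)
{
    // Treat the larger axis of the drag as the swipe direction
    if (Mathf.Abs(direction.x) > Mathf.Abs(direction.y))
    {
        Debug.Log(direction.x > 0 ? "Swiped right" : "Swiped left");
    }
    else
    {
        Debug.Log(direction.y > 0 ? "Swiped up" : "Swiped down");
    }

    // Reset so we only report each swipe once
    directionChosen = false;
}
```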

Detecting Objects Touched

You may want to try interacting with Objects in your game.

    private void Update()
    {
        if (Input.touchCount > 0)
        {
            Debug.Log("entered");

            Touch touch = Input.GetTouch(0);

            // Convert the touch's screen position into a ray in world space
            Ray raycast = Camera.main.ScreenPointToRay(touch.position);

            // Physics.Raycast returns a bool; the hit details
            // come back through the out parameter
            RaycastHit hitInfo;
            if (Physics.Raycast(raycast, out hitInfo))
            {
                GameObject touchedObject = hitInfo.transform.gameObject;
                Debug.Log("I touched this " + touchedObject.transform.name);
            }
        }
    }

In the above example, I’m grabbing the first touch input if there is one and converting its screen position into a ray in world space. Once I have that, I set up a RaycastHit to store the hit details if our finger touch does make contact with an Object in game. Keep in mind that Physics.Raycast() itself returns a bool telling you whether anything was hit; the details arrive through the out parameter.

You can then use hitInfo to grab the Object that was touched and manipulate it how you see fit. In the example, I just print the name, but the world is your oyster!
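For instance, here's a hedged sketch of one way to manipulate the touched Object: tinting it red. It assumes the Object has a Renderer whose material exposes a color (true for the Standard shader, but not every shader):

```csharp
// Inside the if (Physics.Raycast(...)) block from above
GameObject touchedObject = hitInfo.transform.gameObject;

// Tint the object red if it has a Renderer; assumes the material
// has a color property (e.g. the Standard shader)
Renderer rend = touchedObject.GetComponent<Renderer>();
if (rend != null)
{
    rend.material.color = Color.red;
}
```

Note that accessing rend.material creates a per-object material instance, which is handy here since you usually only want to tint the one Object you touched.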

Wrapping Up

There is still a ton you can do with touch controls, and this is just a surface-level overview. Make sure to jump in and play with it yourself. It was tricky for me at first, but after working through a few fun bugs, it became easy to figure out.

Cheers! Happy Coding!