As VR developers we almost all need a UI system where the user can look at something, such as a button, and activate it.

Unity’s new UGUI (in 4.6 and 5) finally makes it easy to build great world-space UI for mouse input. But it offers nothing out of the box for VR: there is no look-based input method (look at an object and press a key/button to select it), nor a gaze-based input method (auto-select after gazing for a period of time). It is left up to VR developers to keep re-inventing the wheel.

There was a great thread, New Unity UI + OVR Look-Based Input HOWTO, on the Oculus forums that provides an approach to look tracking with UGUI. It has evolved over the last six months to support more of the built-in UI elements, letting you look at an element and select it with a key or button. Unity has also published the full source code of its UGUI input system, making it possible to understand how the input modules work.

What I wanted was a small, simple system that supports both look and gaze input and is quick and easy to add to a project. So I created the GazeInputModule script below. The module can work as either gaze or look based. It leverages the built-in Pointer Input Module, which already has the code for tracking enter/exit events on UI elements.

The way UGUI input works is that you create an Event System and attach one or more input modules to it. Unity adds a game object with an Event System automatically when you add any UI to your scene in the editor, and by default it carries a Standalone Input Module and a Touch Input Module. My new input module should *replace* the existing ones: we can’t have both the mouse and the head tracking the same elements (which one should win when the mouse and the head point at different elements?).
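If you build your scene at runtime, or just want to guard against the default modules sneaking back in, the replacement can also be enforced from code. This is a minimal sketch (the component class names are the standard Unity 4.6/5 ones):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Attach to the EventSystem game object to make sure only
// GazeInputModule is active when the scene starts.
public class DisableOtherInputModules : MonoBehaviour
{
    void Awake()
    {
        // StandaloneInputModule and TouchInputModule would otherwise
        // fight with GazeInputModule over the current selection.
        var standalone = GetComponent<StandaloneInputModule>();
        if (standalone != null) standalone.enabled = false;

        var touch = GetComponent<TouchInputModule>();
        if (touch != null) touch.enabled = false;
    }
}
```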

Our script uses raycasting, so make sure your Canvas is in world space and has a Graphic Raycaster attached (a canvas created in the editor gets one automatically). If you have multiple cameras, you also need to drag your VR camera (or Center Eye Camera) onto your canvas so it knows which camera to raycast from:

[Image: canvas settings]

Drag the GazeInputModule script onto your Event System game object and disable the existing input modules:

[Image: Event System inspector]

Finally, choose the Click or Gaze method.

That’s it! Hit Run.

For Click, look at a button and press the Xbox Controller “A” button or the spacebar on your keyboard.

For Gaze, look at a button for 2 seconds.

To summarize:

1. Drag the GazeInputModule script onto your EventSystem object.
2. Disable any other Input Modules (eg: StandaloneInputModule & TouchInputModule) as they will fight over selections.
3. Make sure your Canvas is in world space and has a GraphicRaycaster (A canvas created in the Editor gets one automatically).
4. If you have multiple cameras then make sure to drag your center (VR) camera onto the canvas.
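To confirm clicks are arriving, you can wire a world-space Button to a handler in code. A minimal sketch (the field is assigned by dragging your Button into the inspector slot):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Attach to any game object and drag a world-space Button into the
// TestButton slot; a gaze or click selection logs to the console.
public class GazeClickTest : MonoBehaviour
{
    public Button TestButton;

    void Start()
    {
        TestButton.onClick.AddListener(() => Debug.Log("Gaze click received!"));
    }
}
```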

 

Unity 5 First Person Controller Special Warning: 

If you are using the FirstPersonController, be aware that it totally breaks Canvas UI! This is because its mouse code resets the camera orientation every frame during Update(), for both horizontal and vertical mouse rotation, and the head-bobbing effect does the same.

To fix it: first turn off the head bob property in the inspector. Then either disable the mouse code completely in FirstPersonController (comment out the RotateView() call in Update()), or, to keep the mouse turning your body, edit MouseLook.cs and comment out the two places LookRotation() sets camera.localRotation. (A super quick alternative is to rename Update() to LateUpdate() so the Canvas UI raycasts run before the orientation is overwritten, but that may disturb other timings in your app, so editing MouseLook is safest.)
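As a sketch of the MouseLook edit, here is a simplified version of the Unity 5 standard-assets LookRotation() method (your copy may differ slightly and also contains clamping and smoothing code), with the camera line commented out so body yaw via the mouse still works:

```csharp
// In MouseLook.cs (Unity 5 standard assets), simplified sketch.
// Body yaw from the mouse is kept; camera pitch is disabled so the
// VR head tracking owns the camera orientation.
public void LookRotation(Transform character, Transform camera)
{
    float yRot = Input.GetAxis("Mouse X") * XSensitivity;
    float xRot = Input.GetAxis("Mouse Y") * YSensitivity;

    m_CharacterTargetRot *= Quaternion.Euler(0f, yRot, 0f);
    m_CameraTargetRot *= Quaternion.Euler(-xRot, 0f, 0f);

    character.localRotation = m_CharacterTargetRot;
    // camera.localRotation = m_CameraTargetRot;  // commented out:
    // would overwrite the VR head tracking every frame
}
```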

3D Scene Objects

What if you want to select not only UI, but also 3D scene objects?

To do that, add a PhysicsRaycaster to your camera (not the canvas, but directly onto your VR camera). The results of our raycasts will then include anything with a Collider.

Then, for any object you want to interact with, add an Event Trigger component. Event Trigger handles the pointer events (including pointer click), which is exactly what our input module looks for. Now simply press the “Add New Event Type” button and add any event you want, such as PointerClick (for the click event), or perhaps PointerEnter and PointerExit to handle the game object becoming highlighted.

Here’s mine on a cube:

[Image: Event Trigger events on a cube]
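If you prefer code over the Event Trigger component, the same pointer interfaces can be implemented directly on a script attached to the object. A sketch (the highlight colours are arbitrary):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Attach to any object with a Collider. Requires a PhysicsRaycaster
// on the VR camera, as described above.
public class GazeTarget : MonoBehaviour,
    IPointerClickHandler, IPointerEnterHandler, IPointerExitHandler
{
    public void OnPointerEnter(PointerEventData eventData)
    {
        GetComponent<Renderer>().material.color = Color.yellow; // highlight
    }

    public void OnPointerExit(PointerEventData eventData)
    {
        GetComponent<Renderer>().material.color = Color.white;  // restore
    }

    public void OnPointerClick(PointerEventData eventData)
    {
        Debug.Log(name + " was clicked by gaze");
    }
}
```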

 

Crosshair

[Image: crosshair]

Let’s add a crosshair to the above. There is an awesome YouTube video, Oculus Rift DK2 – Unity Tutorial: Reticle, that goes into full detail about how to create a VR crosshair.

I’ve based mine on a couple of lines of code posted by cybereality on the Oculus forums. The key is to maintain correct stereoscopic depth by moving the crosshair to the same distance as the object being looked at, while keeping the crosshair the same apparent size to the player. This is handled by these lines:

    void SetCrossHairAtDistance(float dist)
    {
        // Move the crosshair forward to the new distance
        Vector3 position = Crosshair.localPosition;
        Crosshair.localPosition = new Vector3 (position.x, position.y, dist);
        // But keep the crosshair the same perceived size no matter the distance
        Crosshair.localScale = CrosshairOriginalScale * dist;
    }

Adding the crosshair image can be achieved using standard Unity techniques: Simply create a game object with a 3d Quad as a child of the VR camera and position it out in front of the user. Attach a material with the crosshair image (using a transparent unlit shader).
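The same setup can also be done from code. A sketch, assuming a transparent unlit material called “CrosshairMaterial” exists in a Resources folder (that name and the sizes are placeholders):

```csharp
using UnityEngine;

// Creates a crosshair quad as a child of the main (VR) camera at startup.
public class CrosshairSetup : MonoBehaviour
{
    public float StartDistance = 2f;

    void Start()
    {
        GameObject quad = GameObject.CreatePrimitive(PrimitiveType.Quad);
        // Remove the quad's collider so the PhysicsRaycaster
        // never hits the crosshair itself.
        Destroy(quad.GetComponent<Collider>());
        quad.transform.SetParent(Camera.main.transform, false);
        quad.transform.localPosition = new Vector3(0f, 0f, StartDistance);
        quad.transform.localScale = Vector3.one * 0.1f;
        quad.GetComponent<Renderer>().material =
            Resources.Load<Material>("CrosshairMaterial");
    }
}
```

Destroying the collider is the important design detail: otherwise the crosshair would intercept your own gaze raycasts.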

I’ve created a GazeInputModuleCrosshair script that simply asks our GazeInputModule script whether we’re looking at something, and if we are, uses that distance to move our crosshair.

I also added a static DisplayCrosshair property so any script can easily show or hide the crosshair as needed.
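The gist of that script is small enough to sketch here (simplified, not the full download; note that in some Unity versions the UI raycast distance can come back as zero, in which case you would compute it from the hit position instead):

```csharp
using UnityEngine;

// Simplified sketch: reads the last raycast from GazeInputModule and
// keeps the crosshair at the distance of whatever is being looked at.
public class GazeInputModuleCrosshair : MonoBehaviour
{
    // Any script can show/hide the crosshair through this.
    public static bool DisplayCrosshair = true;

    public GazeInputModule GazeModule;  // drag the Event System's module here
    public Transform Crosshair;         // the crosshair quad under the camera
    public float DefaultDistance = 10f; // used when looking at nothing

    Vector3 crosshairOriginalScale;

    void Start()
    {
        crosshairOriginalScale = Crosshair.localScale;
    }

    void LateUpdate()
    {
        Crosshair.gameObject.SetActive(DisplayCrosshair);
        if (!DisplayCrosshair)
            return;

        // Match the distance of the object under the gaze, if any.
        float dist = GazeModule.CurrentRaycast.gameObject != null
            ? GazeModule.CurrentRaycast.distance
            : DefaultDistance;
        SetCrossHairAtDistance(dist);
    }

    void SetCrossHairAtDistance(float dist)
    {
        Vector3 position = Crosshair.localPosition;
        Crosshair.localPosition = new Vector3(position.x, position.y, dist);
        // Keep the crosshair the same perceived size at any distance
        Crosshair.localScale = crosshairOriginalScale * dist;
    }
}
```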

In the video above, for the first half I don’t show the crosshair; then I turn it on by pressing the “C” key.

 

GazeInputModule.cs:

// Gaze Input Module by Peter Koch <peterept@gmail.com>
using UnityEngine;
using UnityEngine.EventSystems;
using System.Collections.Generic;

// To use:
// 1. Drag onto your EventSystem game object.
// 2. Disable any other Input Modules (eg: StandaloneInputModule & TouchInputModule) as they will fight over selections.
// 3. Make sure your Canvas is in world space and has a GraphicRaycaster (should by default).
// 4. If you have multiple cameras then make sure to drag your VR (center eye) camera into the canvas.
public class GazeInputModule : PointerInputModule 
{
    public enum Mode { Click = 0, Gaze };
    public Mode mode;

    [Header("Click Settings")]
    public string ClickInputName = "Submit";
    [Header("Gaze Settings")]
    public float GazeTimeInSeconds = 2f;

    public RaycastResult CurrentRaycast;

    private PointerEventData pointerEventData;
    private GameObject currentLookAtHandler;
    private float currentLookAtHandlerClickTime;

    public override void Process()
    { 
        HandleLook();
        HandleSelection();
    }

    void HandleLook()
    {
        if (pointerEventData == null)
        {
            pointerEventData = new PointerEventData(eventSystem);
        }
        // fake a pointer always being at the center of the screen
        pointerEventData.position = new Vector2(Screen.width/2, Screen.height/2);
        pointerEventData.delta = Vector2.zero;
        List<RaycastResult> raycastResults = new List<RaycastResult>();
        eventSystem.RaycastAll(pointerEventData, raycastResults);
        CurrentRaycast = pointerEventData.pointerCurrentRaycast = FindFirstRaycast(raycastResults);
        ProcessMove(pointerEventData);
    }
    
    void HandleSelection()
    {
        if (pointerEventData.pointerEnter != null)
        {
            // if the ui receiver has changed, reset the gaze delay timer
            GameObject handler = ExecuteEvents.GetEventHandler<IPointerClickHandler>(pointerEventData.pointerEnter);
            if (currentLookAtHandler != handler)
            {
                currentLookAtHandler = handler;
                currentLookAtHandlerClickTime = Time.realtimeSinceStartup + GazeTimeInSeconds;
            }
            
            // if we have a handler and it's time to click, do it now
            // Note the parentheses: the null check must guard both the
            // gaze and click branches.
            if (currentLookAtHandler != null &&
                ((mode == Mode.Gaze && Time.realtimeSinceStartup > currentLookAtHandlerClickTime) ||
                 (mode == Mode.Click && Input.GetButtonDown(ClickInputName))))
            {
                ExecuteEvents.ExecuteHierarchy(currentLookAtHandler, pointerEventData, ExecuteEvents.pointerClickHandler);
                currentLookAtHandlerClickTime = float.MaxValue;
            }
        }
        else
        {
            currentLookAtHandler = null;
        }
    }


}
Downloads:

- GazeInputModule.cs
- GazeInputModuleCrosshair.cs
- Demo: GazeInput.unitypackage