As VR developers, almost all of us need a UI system where the user can look at something and activate it, such as a button.
Unity’s new UGUI (in 4.6 and 5) finally makes it easy to build great world-space UI for mouse input. But it offers nothing out of the box for VR, because it supports neither a look-based input method (look at an object and press a key/button to select it) nor a gaze-based input method (auto-select after gazing for a period of time). It is left to VR developers to keep reinventing the wheel.
There was a great thread, New Unity UI + OVR Look-Based Input HOWTO, on the Oculus forums that provides an approach to look tracking with UGUI. It has evolved over the last six months to work with more of the built-in UI elements, and it lets you look at an element and select it with a key or button. Unity has also published the full source code to its UGUI input system, making it possible to understand how the input modules work.
What I wanted was a small, simple system that would do both look and gaze input and be very quick and easy to add to a project. So I created the GazeInputModule script below. The module can work as either gaze or look based. It builds on the built-in Pointer Input Module, which already has the code for tracking enter/exit on UI elements.
The way UGUI input works is that you create an Event System and attach one or more input modules to it. Unity creates a game object with an Event System for you automatically if you add any UI to your scene in the editor, and by default attaches a Standalone Input Module and a Touch Input Module. My new input module should *replace* the existing ones, as we can’t have both the mouse and the head driving the same elements (which should be active if the mouse and the head are pointing at different elements?).
Our script uses raycasting, so you need to make sure your Canvas is in world space and has a Graphic Raycaster attached (a canvas created in the editor gets one automatically). If you have multiple cameras, you also need to drag your VR Camera (or Center Eye Camera) onto your canvas so it knows which camera to raycast from:
Drag the GazeInputModule script onto your Event System game object and disable the existing input modules:
Finally, choose the Click or Gaze method.
That’s it! Hit Run.
For Click, look at a button and press the Xbox Controller “A” button or the spacebar on your keyboard.
For Gaze, look at a button for 2 seconds.
To summarize:
1. Drag it onto your EventSystem object.
2. Disable any other Input Modules (eg: StandaloneInputModule & TouchInputModule) as they will fight over selections.
3. Make sure your Canvas is in world space and has a GraphicRaycaster (A canvas created in the Editor gets one automatically).
4. If you have multiple cameras then make sure to drag your center camera into the canvas.
Unity 5 First Person Controller Special Warning:
If you are using the FirstPersonController, be aware that it totally breaks Canvas UI! This is because its mouse code resets the camera orientation every frame during Update(), for both horizontal and vertical mouse rotation, as well as for the head-bobbing effect.
To fix it: first turn off the head bob property in the inspector. Then either disable the mouse code completely in FirstPersonController (comment out the RotateView() call in Update()), or, to keep the mouse turning your body, edit MouseLook.cs and comment out the two places LookRotation() sets camera.localRotation. (Another super-quick fix is to rename Update() to LateUpdate() so the Canvas UI raycasts run before the orientation is changed, but that may upset other timings in your app, so editing MouseLook is best.)
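For reference, the MouseLook.cs edit looks roughly like this (a sketch based on the Unity 5 Standard Assets of the time; the exact member names and structure in your version may differ):

// In MouseLook.cs, inside LookRotation() – sketch of the edit:
character.localRotation = m_CharacterTargetRot; // keep: the mouse still turns the body
// camera.localRotation = m_CameraTargetRot;    // commented out: let VR head tracking own the camera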
3D Scene Objects
What if you want to select not just UI, but 3D scene objects as well?
To do that, add a PhysicsRaycaster to your camera (not the canvas, but directly onto your VR camera). The results from our raycasts will then include anything with a Collider.
Then, for any object you want to interact with, add an Event Trigger component. These are driven by the same pointer events our module sends, which is exactly what we need. Now simply press the “Add New Event Type” button and add any event you want, such as PointerClick (for the click event), or perhaps PointerEnter and PointerExit to handle the game object becoming highlighted.
Here’s mine on a cube:
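If you prefer to wire the same events up from code rather than the inspector, it looks something like this (a sketch assuming Unity 5.3+, where the entry list is called triggers; earlier versions called it delegates):

using UnityEngine;
using UnityEngine.EventSystems;

// Adds the same PointerEnter/PointerExit entries in code that the
// inspector's "Add New Event Type" button creates.
public class CubeEventSetup : MonoBehaviour
{
    void Start()
    {
        EventTrigger trigger = gameObject.AddComponent<EventTrigger>();

        EventTrigger.Entry enter = new EventTrigger.Entry();
        enter.eventID = EventTriggerType.PointerEnter;
        enter.callback.AddListener((data) => { GetComponent<Renderer>().material.color = Color.yellow; });
        trigger.triggers.Add(enter);

        EventTrigger.Entry exit = new EventTrigger.Entry();
        exit.eventID = EventTriggerType.PointerExit;
        exit.callback.AddListener((data) => { GetComponent<Renderer>().material.color = Color.white; });
        trigger.triggers.Add(exit);
    }
}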
Crosshair
Let’s add a crosshair to the above. There is an awesome YouTube video, Oculus Rift DK2 – Unity Tutorial: Reticle, that goes into full detail about how to create a VR crosshair.
I’ve based mine on a couple of lines of code posted by cybereality in the Oculus forums. The key part is to maintain correct stereoscopic depth by moving the crosshair to the same distance as the object being looked at, while crucially keeping the crosshair the same apparent size to the player. This is handled by these lines:
void SetCrossHairAtDistance(float dist)
{
    // Move the crosshair forward to the new distance
    Vector3 position = Crosshair.localPosition;
    Crosshair.localPosition = new Vector3(position.x, position.y, dist);

    // But keep the crosshair the same perceptible size no matter the distance
    Crosshair.localScale = CrosshairOriginalScale * dist;
}
Adding the crosshair image uses standard Unity techniques: simply create a game object with a 3D Quad as a child of the VR camera and position it out in front of the user, then attach a material with the crosshair image (using a transparent unlit shader).
I’ve created a GazeInputModuleCrosshair script that simply asks our GazeInputModule script whether we’re looking at something and, if we are, uses that distance to move our crosshair.
I also added a static DisplayCrosshair property so any script can easily show or hide the crosshair as needed.
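The full script is in the downloads at the bottom of this article; its core is only a few lines. Here is a minimal sketch (Crosshair is the transform of the quad and gazeInputModule the module on the Event System; the downloadable version differs in detail):

using UnityEngine;
using UnityEngine.EventSystems;

public class GazeInputModuleCrosshair : MonoBehaviour
{
    // Any script can show or hide the crosshair through this flag.
    public static bool DisplayCrosshair = true;

    public GazeInputModule gazeInputModule; // the module on the EventSystem
    public Transform Crosshair;             // the quad child of the VR camera

    private Vector3 CrosshairOriginalScale;

    void Start()
    {
        CrosshairOriginalScale = Crosshair.localScale;
    }

    void Update()
    {
        Crosshair.gameObject.SetActive(DisplayCrosshair);
        RaycastResult gaze = gazeInputModule.CurrentRaycast;
        if (DisplayCrosshair && gaze.gameObject != null)
        {
            SetCrossHairAtDistance(gaze.distance); // the method shown above
        }
    }

    void SetCrossHairAtDistance(float dist)
    {
        Vector3 position = Crosshair.localPosition;
        Crosshair.localPosition = new Vector3(position.x, position.y, dist);
        Crosshair.localScale = CrosshairOriginalScale * dist;
    }
}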
In the video above, I don’t show the crosshair for the first half; then I turn it on by pressing the “C” key.
GazeInputModule.cs:
// Gaze Input Module by Peter Koch <peterept@gmail.com>
using UnityEngine;
using UnityEngine.EventSystems;
using System.Collections.Generic;

// To use:
// 1. Drag onto your EventSystem game object.
// 2. Disable any other Input Modules (eg: StandaloneInputModule & TouchInputModule) as they will fight over selections.
// 3. Make sure your Canvas is in world space and has a GraphicRaycaster (should by default).
// 4. If you have multiple cameras then make sure to drag your VR (center eye) camera into the canvas.
public class GazeInputModule : PointerInputModule
{
    public enum Mode { Click = 0, Gaze };
    public Mode mode;

    [Header("Click Settings")]
    public string ClickInputName = "Submit";
    [Header("Gaze Settings")]
    public float GazeTimeInSeconds = 2f;

    public RaycastResult CurrentRaycast;

    private PointerEventData pointerEventData;
    private GameObject currentLookAtHandler;
    private float currentLookAtHandlerClickTime;

    public override void Process()
    {
        HandleLook();
        HandleSelection();
    }

    void HandleLook()
    {
        if (pointerEventData == null)
        {
            pointerEventData = new PointerEventData(eventSystem);
        }
        // fake a pointer always being at the center of the screen
        pointerEventData.position = new Vector2(Screen.width/2, Screen.height/2);
        pointerEventData.delta = Vector2.zero;
        List<RaycastResult> raycastResults = new List<RaycastResult>();
        eventSystem.RaycastAll(pointerEventData, raycastResults);
        CurrentRaycast = pointerEventData.pointerCurrentRaycast = FindFirstRaycast(raycastResults);
        ProcessMove(pointerEventData);
    }

    void HandleSelection()
    {
        if (pointerEventData.pointerEnter != null)
        {
            // if the UI receiver has changed, reset the gaze delay timer
            GameObject handler = ExecuteEvents.GetEventHandler<IPointerClickHandler>(pointerEventData.pointerEnter);
            if (currentLookAtHandler != handler)
            {
                currentLookAtHandler = handler;
                currentLookAtHandlerClickTime = Time.realtimeSinceStartup + GazeTimeInSeconds;
            }
            // if we have a handler and it's time to click, do it now
            if (currentLookAtHandler != null &&
                ((mode == Mode.Gaze && Time.realtimeSinceStartup > currentLookAtHandlerClickTime) ||
                 (mode == Mode.Click && Input.GetButtonDown(ClickInputName))))
            {
                ExecuteEvents.ExecuteHierarchy(currentLookAtHandler, pointerEventData, ExecuteEvents.pointerClickHandler);
                currentLookAtHandlerClickTime = float.MaxValue;
            }
        }
        else
        {
            currentLookAtHandler = null;
        }
    }
}

Download GazeInputModule.cs
Download GazeInputModuleCrosshair.cs
Download - Demo GazeInput.unitypackage
August 19, 2015 at 9:57 pm
Hello Sir,
I tried the attached Unity package file. I opened it in Unity 5, added a quad object and gave it a cursor material.
I placed a camera in the CenterEyeAnchor and used that camera in the canvas.
When I run it in Unity I get no errors, but the cursor texture I assigned does not appear in the scene, and when I use the mouse cursor to click any of the buttons I get no output.
Could you please help me out?
thanks
August 20, 2015 at 12:16 pm
It appears to be because you are using the Oculus OVRPlayerController, whilst I used Unity native VR.
I do see everything works in terms of looking and selecting, except the crosshair is not aligned with where the user is looking and is at the wrong distance.
I’ll investigate this!
August 21, 2015 at 9:56 am
So false alarm, everything above also works fine with OVRPlayerController.
Murali’s issue was that the crosshair was not aligned (set to 0,0,2) in world space in front of the camera, and also that the crosshair used the wrong shader: his image did not contain alpha, so the Unlit/Transparent shader did clipping and z incorrectly. The crosshair fix is to use Unlit/Transparent Cutout.
Good luck!
August 31, 2015 at 6:42 pm
Dear Peter,
Thank you for this useful tutorial. I am also working on a VR project; the client wants a warped UI and I have been successfully extending this example code, but have hit a snag (see the last post by ‘isidro02139’, a.k.a. me!):
http://forum.unity3d.com/threads/warp-bend-the-ui.283066/
Wondering if you have any insight, I think one problem might be that I am making the same call as you do above:
pointerEventData.position = new Vector2(Screen.width/2, Screen.height/2);
..to get the center of the screen, but what I really want is to build pointer event data in the local screen space of the world canvas to which my “MappedGraphicRaycaster” is attached. Here is my reticle class:
https://gist.github.com/arun02139/227361b483a0315ff57c
Thanks in advance for any advice,
Arun
October 3, 2015 at 2:28 pm
So sorry Arun, I missed your post (in a pile of spam). Did you work it out in the end? I love the look of your curved UI!
September 3, 2015 at 6:34 pm
Hello. Thanks for the post.
I have a problem using your script with Google Cardboard. I added the script to the EventSystem object in the scene and disabled the Touch and Standalone modules. After that, I need to return from Cardboard mode to a simple camera with the Touch module enabled, so I disable the Gaze script and enable the Touch module. But after that, touch only works with a second finger. I mean, if I want to push a button I need to touch the screen with one finger and, while touching, push the button with another finger. Do you have any idea why that is and how to solve the problem? I tried deleting the Gaze script, updating the modules, and loading a different scene, but nothing helped.
Thanks.
October 3, 2015 at 2:27 pm
There is some interconnectedness between the gaze and touch input modules because they both use the same event system. If you are going to use both touch and gaze, you may need to create separate event systems (eg: 2 game objects) and enable/disable them as needed. Let me know if that works for you!
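Something along these lines might work (a sketch; GazeEventSystem and TouchEventSystem are hypothetical names for the two EventSystem game objects):

using UnityEngine;

// Swaps between two EventSystem game objects, e.g. when entering or leaving Cardboard mode.
public class InputModeSwitcher : MonoBehaviour
{
    public GameObject GazeEventSystem;  // EventSystem with GazeInputModule
    public GameObject TouchEventSystem; // EventSystem with TouchInputModule

    public void SetGazeMode(bool gaze)
    {
        GazeEventSystem.SetActive(gaze);
        TouchEventSystem.SetActive(!gaze);
    }
}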
September 11, 2015 at 8:47 pm
Hi!
I’m trying to write in an InputField control after looking at it and clicking (or waiting 2 seconds), but I haven’t achieved it yet. Do you know how to do it? When I press the space bar, the cursor appears in a weird place inside the canvas and the InputField control remains highlighted. Letters don’t appear anywhere.
October 3, 2015 at 10:16 am
Did you work it out? Download the sample unitypackage at the bottom of the article.
January 14, 2016 at 3:21 am
First of all, thank you very much for your work; you really helped me. That said, I am having the same problem. I already downloaded the demo, and whenever I tried to write in the InputField (after selecting it, which worked fine) it didn’t write what I pressed on the keyboard.
September 16, 2015 at 3:17 pm
Hi Peter Koch,
First of all, thanks a lot for making this tutorial. It was so nice and useful when I was really stuck and clueless about how to implement this look-based input (eye-picking).
I am developing a VR app for Samsung GearVR. I just want to confirm whether this script and concept will work on GearVR too. As GearVR is also based on Oculus, I guess it will.
Please share any recent tutorials/samples on this if there are any.
thanks
Jeeva
September 16, 2015 at 8:51 pm
Hi Jeeva. Yes, it works perfectly fine in GearVR. I use it in both DK2 and GearVR.
September 18, 2015 at 2:28 pm
Thank you Peter for the clarification!… Means a lot
October 3, 2015 at 10:16 am
Hi Jeeva. Yes, it all works on GearVR too. If you aren’t using the timer-based input, you need to map a button press on the GearVR to be the click – you can use a tap on the touchpad – or the ESC key, which maps to the physical back button.
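For reference, a GearVR touchpad tap arrives in Unity as mouse button 0, and the back button as the Escape key, so a minimal check looks like this (a sketch):

void Update()
{
    if (Input.GetMouseButtonDown(0))
    {
        // GearVR touchpad tap – treat as the click
    }
    if (Input.GetKeyDown(KeyCode.Escape))
    {
        // GearVR physical back button
    }
}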
September 17, 2015 at 7:45 pm
Hi Peter,
I’m having trouble with your GazeInputModuleCrosshair script, when I add the script to my project I get the following error,
“Assets/PeterKoch/GazeInputModuleCrosshair.cs(51,60): error CS1061: Type `GazeInputModule’ does not contain a definition for `CurrentRaycast’ and no extension method `CurrentRaycast’ of type `GazeInputModule’ could be found (are you missing a using directive or an assembly reference?)”
Could this possibly be due to me currently still having to use Oculus 0.4.4 or have I missed something obvious?
Thanks
September 17, 2015 at 8:29 pm
….false alarm
I copy-pasted the GazeInputModule.cs code from the website instead of using the download version. It all appears to be working now.
September 19, 2015 at 1:28 pm
Hi Peter,
I just wanted to say thank you so much for putting this tutorial together and providing the GazeInputModule. I’m attempting to create VR games for Oculus Rift in Unity as a hobby and am really new to game development. I’ve been struggling to work out how people make gaze menus, and your tutorial and script are absolutely perfect!
Best Regards,
Jim Baranoski
September 19, 2015 at 9:47 pm
Thanks for your comment Jim. With Unity a key player in VR, I’m still surprised they haven’t added a look-based input module, or a world-space keyboard!
September 22, 2015 at 4:02 am
Good stuff Peter.
We already have a VR input module in the UI Extensions project, but I’d like to merge it with yours to give the best possible component the community can use (all scripts are added with author attributions)
https://bitbucket.org/ddreaper/unity-ui-extensions
One big comment though: I see your cursor script is using EventTriggers. As an FYI, it is not a good idea to recommend these, as they are overly heavy; it’s better to implement the interface handlers for the events you want to publish.
September 28, 2015 at 6:06 pm
Hi Simon. Your unity-ui-extensions project is fantastic. It is great to see code brought together into a common library. I’ll create some pull requests with some suggested code merges.
Regarding EventTriggers – that’s just in the example. What I’m demonstrating there is that I didn’t break any of the ways users might process input events. They can also use interface handlers. Creating a cube and adding the script below works great:
using UnityEngine;
using UnityEngine.EventSystems;
public class ButtonClickHandler : MonoBehaviour, IPointerClickHandler
{
    public void OnPointerClick(PointerEventData eventData)
    {
        // Set to a random color
        GetComponent<Renderer>().material.color = new Color(Random.value, Random.value, Random.value, 1.0f);
    }
}
September 29, 2015 at 7:13 pm
Hi Peter,
Brilliant tutorial. It’s been very helpful for developing my own projects with this gaze script.
If you don’t mind, I’m struggling with two things though.
1 – On the example scene you supplied, the crosshair disappears if it is not hovering on, or very close to, the four buttons you have.
– How would I make it so the crosshair is visible all of the time, no matter where you look?
2 – When the crosshair is looking at a button, what would be the best way to implement a countdown timer for the duration it takes the person to look at and select the button? Just a simple animated icon or something would be great – what would be the best format for this?
Thanks very much; your tutorial has been very helpful so far.
Benjamin
September 29, 2015 at 11:24 pm
Hi there
I’ve been re-doing this process a few times…
I just got the crosshair working (it doesn’t disappear when I’m off the button and is there all the time, so that’s good).
I’m just wondering how one might incorporate a countdown timer into the crosshair now, as visual feedback to the user that something will happen in 2-3 seconds (when the gaze timer gets to 0).
thanks very much again for a great tutorial.
regards
Ben
thanks
October 3, 2015 at 7:57 am
That’s a great idea. We could use an animated sprite for the crosshair and make it fill up over the time period. I’ll have a look into that!!
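If you want to try it yourself, a minimal sketch uses a UI Image set to Filled / Radial 360 and drives its fillAmount from the gaze timer. This assumes you expose a progress property from GazeInputModule (GazeProgress01 below is a hypothetical addition, shown in the comment):

using UnityEngine;
using UnityEngine.UI;

// Fills a radial UI Image while the gaze timer counts down.
public class GazeProgressRing : MonoBehaviour
{
    public GazeInputModule gazeInputModule; // the module on the EventSystem
    public Image progressImage;             // Image Type: Filled, Fill Method: Radial 360

    void Update()
    {
        // Hypothetical property to add to GazeInputModule:
        // public float GazeProgress01 { get {
        //     return Mathf.Clamp01(1f - (currentLookAtHandlerClickTime - Time.realtimeSinceStartup) / GazeTimeInSeconds);
        // } }
        progressImage.fillAmount = gazeInputModule.GazeProgress01;
    }
}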
October 21, 2015 at 6:15 am
Is it possible to load new scenes in those colourful squares of yours? If so, how?
October 21, 2015 at 11:11 pm
Hi Mark,
Simply add a method to a script (simplest is to put the script on the square itself):
public void StartGame()
{
    Application.LoadLevel("Main");
}
Then, using the inspector, tell the click event to call it, or use the ButtonClickHandler code a few comments up.
HTH
October 23, 2015 at 2:56 am
Hey Peter,
I’m able to load a new scene, but for some reason the UI stops functioning in that loaded scene. Do you know why that is?
October 30, 2015 at 6:33 pm
Hi Peter,
Thanks for this awesome article. I have been experimenting with the OVRPlayerController and the GazeInputModule. Once I add a Physics Raycaster to the CenterEyeAnchor and set up a Cube with IPointerClick/Enter/ExitHandler, implementing OnPointerEnter and OnPointerExit to change the material, I get heavy flickering. When I add logging to GazeInputModule, I can see constant toggling in HandleSelection between pointerEventData.pointerEnter being null and not null, even though I am not moving my pointer. Any suggestions on this? Thanks in advance.
October 30, 2015 at 6:43 pm
Hi,
I figured it out: it is solved when I set the event mask of the Physics Raycaster to only a single layer. When I set it to Everything, the flickering occurs. It would be nice to know why this happens, though.
November 10, 2015 at 8:13 am
Hey there Peter,
First of all, thank you for sharing this 😉
Now I’m working on adapting your script to accommodate “gaze” sliders as well. I’ve checked the article on the Oculus forum and I’m wrapping my head around the event system.
However, I’m having a bit of a hard time figuring out the best approach to achieve what I need. At the moment I’m guessing that:
1 – I need to separate click from drag (events/handlers) in the script
2 – I need somehow to get the handle GameObjects of the sliders
3 – Only then can I set movement (either from gaze or from a controller)
Is this a good approach? Can you give me some pointers on how to achieve this using your work as a base?
Thanks in advance for the help!
November 11, 2015 at 12:07 am
Hi. I kept mine as simple as possible to achieve clicks only, which means it isn’t a drop-in solution for more advanced UI elements that need more than a click – such as sliders. The key is to make sure the UI controls get all the events needed to perform IDragHandler. The latest Google Cardboard Unity SDK has a pretty sophisticated Gaze Input Module; it may have the answers for you: https://github.com/googlesamples/cardboard-unity/blob/master/Cardboard/Scripts/GazeInputModule.cs
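For the adventurous, the general shape of drag support in a gaze module looks something like this. It is only a rough sketch of the event flow (real modules such as the Cardboard one also handle press, drag thresholds and release far more carefully):

// Inside a Process()-style method, after the raycast has filled pointerEventData:
if (Input.GetButtonDown(ClickInputName))
{
    pointerEventData.pressPosition = pointerEventData.position;
    pointerEventData.pointerPressRaycast = pointerEventData.pointerCurrentRaycast;
    pointerEventData.pointerDrag = ExecuteEvents.GetEventHandler<IDragHandler>(pointerEventData.pointerEnter);
    if (pointerEventData.pointerDrag != null)
        ExecuteEvents.Execute(pointerEventData.pointerDrag, pointerEventData, ExecuteEvents.beginDragHandler);
}
if (pointerEventData.pointerDrag != null && Input.GetButton(ClickInputName))
{
    ExecuteEvents.Execute(pointerEventData.pointerDrag, pointerEventData, ExecuteEvents.dragHandler);
}
if (pointerEventData.pointerDrag != null && Input.GetButtonUp(ClickInputName))
{
    ExecuteEvents.Execute(pointerEventData.pointerDrag, pointerEventData, ExecuteEvents.endDragHandler);
    pointerEventData.pointerDrag = null;
}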
November 19, 2015 at 9:14 pm
Hi Peter,
First of all, thanks for sharing your solution.
I’m using Unity 5.2.0f3 and an Oculus DK2. I’ve downloaded your Demo GazeInput.unitypackage. Everything seems to run well except for the action that should run after a fixed time of gazing (for example, your function MyClick after a button is selected). According to your ButtonListener.cs script I should see the selected button’s text in the Console. Unfortunately, I don’t. All the GUI function associations are correct, but while debugging, the function MyClick is never run.
In fact, I have no idea what’s wrong.
Could you identify possible problems?
Regards
bulbka
November 19, 2015 at 9:37 pm
Hi again!
Please forgive my earlier post. Everything works perfectly!
I tried to run it before properly reading your article. Shame on me!
If the Click option is set I need to press the spacebar; if the Gaze option is set, everything works perfectly.
Once again, thanks for this solution.
Regards
bulbka
January 5, 2016 at 12:19 am
Hi,
I’m using the GazeInputModule in my project and it works fine. I use a gamepad to navigate.
I use Oculus Utilities and OVRPlayerController.
Now, when I gaze at an object I’m able to get the selection. How can I make the OVRPlayerController navigate towards that object without the gamepad interaction?
Can you please clarify?
Thanks
January 24, 2016 at 1:16 am
Hello Peter!
Thank you for sharing this.
Did you ever try this with the Gear VR? Unfortunately I can’t seem to get your demo scene working with it.
Do you know any solutions?
Kind regards,
Annika
January 25, 2016 at 11:12 am
Hi Annika – yes, it does work on GearVR. There have been some quirks because it doesn’t fully support all event paths, so sometimes when UI elements are next to each other or overlapping it may not register a click. What version of Unity are you running? This was tested on Unity 5.2.
February 2, 2016 at 8:44 pm
Dear Peter,
Thank you for this great tutorial. I have just begun to study Unity VR.
I tried your GazeInput unitypackage on the 3Glasses* device. Gaze input on buttons works, but the cubes can’t be triggered.
I don’t understand how to drag multiple VR cameras onto the Canvas.
Can you give me some pointers on how to achieve this using your work as a base?
Thanks very much.
* 3Glasses: http://www.3glasses.cn/developer.action
Best regards,
Nicolas
February 5, 2016 at 9:35 am
Hi Nicolas. If the cubes are not triggered, check to make sure you have PhysicsRaycaster on your camera. Let me know how that goes. Peter
March 18, 2016 at 7:24 am
Hi Peter,
I’m attempting to use this tutorial with the gear VR on a samsung s6 edge.
When the input is in gaze mode, it works perfectly. However, when I try it in click mode with the Gear VR’s touchpad, it doesn’t respond. The touchpad works when I use Input.GetMouseButtonUp(0) in the Update function, so I can’t figure out why it doesn’t trigger the onClick functions for the buttons.
Any advice you have would be greatly appreciated!
March 20, 2016 at 9:11 am
Hi Brian. Have you disabled the default input modules? Or at least given them a lower priority than the GazeInputModule?
April 25, 2016 at 6:56 pm
Hi Peter, I have followed your steps, but when I look at a button, nothing changes. What should I do? Could you kindly reply soon, please?
April 27, 2016 at 7:47 am
Hi, is it a 3D object or a Canvas UI? You need to make sure the correct event system and raycaster are in your scene. Feel free to email me a test scene from your project.
May 15, 2016 at 9:19 am
Hey Peter, I have a question. How do I add a script that makes the B button on the Xbox controller act as a back button? Thanks a lot.
June 2, 2016 at 10:03 pm
Hi Christian. You can assign the B button (joystick button 17) in the Input Manager and name it “Back”; then you can check for it being pressed: if (Input.GetButtonDown("Back")) { … }
May 21, 2016 at 2:26 am
Hi Peter. Have you tried this by any chance:
https://developer.oculus.com/blog/unitys-ui-system-in-vr/
When I recreate the gaze pointer using their project file, everything works as it should. However, when I use a blank project and set up the same scene, the gaze pointer vanishes behind gazeable objects. I’ve been trying to figure out what is happening, to no avail.
here’s my test scene:
https://www.dropbox.com/sh/jhd8tqs2cv74up3/AABCs57QnGkzryMk8hC9ve4Ta?dl=0
Thanks for your time.
May 24, 2016 at 10:36 pm
Hi Marc. That runs perfectly in Unity 5.3.4p1. The problem you are seeing must be an issue with Unity 5.4 beta!
May 25, 2016 at 2:08 am
Hi, thanks for your reply. Someone from oculus responded and suggested a fix. Posting it here in case that helps anyone:
Thanks for finding this. It’s a bug in the shader code. You can fix it by finding “Assets\Shaders\Particle Alpha Blend-AfterTrans.shader” and deleting the following lines:
#ifdef SOFTPARTICLES_ON
o.projPos = ComputeScreenPos (o.vertex);
COMPUTE_EYEDEPTH(o.projPos.z);
#endif
and:
#ifdef SOFTPARTICLES_ON
float sceneZ = LinearEyeDepth (SAMPLE_DEPTH_TEXTURE_PROJ(_CameraDepthTexture, UNITY_PROJ_COORD(i.projPos)));
float partZ = i.projPos.z;
float fade = saturate (_InvFade * (sceneZ-partZ));
i.color.a *= fade;
#endif
June 1, 2016 at 3:58 pm
Hi,
How do I make gaze work for a slider?
June 2, 2016 at 9:58 pm
Hi, you’ll need to use a more advanced input method for sliders. My input is really for buttons only!
June 15, 2016 at 7:08 am
Hi Peter, I am developing Gear VR games as a hobby.
I implemented your Gear VR touchpad controls and the gaze input, and it works like a charm!! Next I will install the crosshairs too. I saw you also have the Gazefuse option – absolutely brilliant! I’ll also use the screen fader when shifting scenes. I don’t know how to thank you; you’re the best! Keep up the good work!
Oh yeah, one last question: is it possible to get the crosshair to only show up when looking at or near an object collider/button etc.?
Thanks for your time.
June 17, 2016 at 2:35 am
Dear Peter,
Everything works like a charm – brilliant! Thank you so very much!
One question: is there a possibility to align the crosshair to the normal of the object, perhaps with RaycastHit.normal?
Regards.
July 7, 2016 at 5:23 am
Hey Peter, great tutorial; it has helped me a lot on my project. I have a basic load-level method that runs after staring at a button on the UI Canvas. It works great when testing in Unity; however, when I build it for Gear VR it doesn’t work (the reticle is there but it doesn’t interact with the buttons). The fuse circle from your other tutorial isn’t filling either (on Gear). Any idea what could be causing this?
July 8, 2016 at 5:21 am
I had some issues at first, but after reading the other comments I realized that it didn’t work with the 5.4 beta. I did get it to work successfully, and could customize it, on 5.3. Thanks for the awesome tutorial/documentation.
July 25, 2016 at 9:13 am
Hey Peter,
Very well written post. I’m having trouble implementing this in Unity 5.4.0b24. Not even the included demo works as shown in the video (the buttons are not popping out, and the cubes are not being highlighted). I assume it’s a beta problem. Any help at all would be great.
Regards,
Oliver
July 25, 2016 at 1:01 pm
Whoops, duh
July 30, 2016 at 10:33 pm
Dear Peter,
I’ve tried what you described above and it works fine in 5.3. However, in 5.4 it only partly works, because there seems to be an offset: when I aim at the buttons there is no interaction, but when I aim above the buttons I interact with them while not actually hovering over them.
Could you clear up what exactly happens in Unity 5.4? I think they changed something with the PointerEventData, but although I’ve read all the change logs, I can’t find what exactly changed.
Kind regards,
JDMulti
July 30, 2016 at 10:43 pm
Found the solution:
// fake a pointer always being at the center of the screen
pointerEventData.position = new Vector2(UnityEngine.VR.VRSettings.eyeTextureWidth/2, UnityEngine.VR.VRSettings.eyeTextureHeight / 2);
August 1, 2016 at 1:50 am
Hi there, you saved my day! I had missed your post before I posted about the same problem. Thanks for the fix!
August 11, 2016 at 2:41 am
Where did you put the above code?
August 11, 2016 at 2:52 am
Found it! It’s in the GazeInputModule script, under the HandleLook method. Thanks!
September 8, 2016 at 11:26 pm
Thanks for that – I just ran into that problem and didn’t think about the new VR options. But better yet is to use the following, so it works in the Editor as well:
#if UNITY_EDITOR
pointerEventData.position = new Vector2(Screen.width / 2, Screen.height / 2);
#else
pointerEventData.position = new Vector2(UnityEngine.VR.VRSettings.eyeTextureWidth / 2, UnityEngine.VR.VRSettings.eyeTextureHeight / 2);
#endif
July 30, 2016 at 11:34 pm
After upgrading to Unity 5.4 it looks like something is broken. The gaze input does not work anymore. Any suggestions to fix this?
August 1, 2016 at 2:07 am
Never mind, JDMulti’s fix is working!
September 14, 2017 at 12:43 am
Hi, really cool article.
For those who would like to test using the FPS Controller, a simple fix is posted
here: http://answers.unity3d.com/questions/1262549/cursorlockstate-and-onpointerenter-not-working-tog.html
Essentially, instead of messing around with the FPSController source code, you add this function to GazeInputModule.cs:
protected override void ProcessMove(PointerEventData pointerEvent)
{
    var targetGO = pointerEvent.pointerCurrentRaycast.gameObject;
    HandlePointerExitAndEnter(pointerEvent, targetGO);
}
And if you also replace
(mode == Mode.Click && Input.GetButtonDown(ClickInputName))
with
(mode == Mode.Click && Input.GetMouseButtonDown(0))
in GazeInputModule, then you also have mouse input instead of gamepad input.
May 22, 2018 at 12:50 am
Hello,
I can’t get the official package to work in Unity 2017.3.1. Everything works in the editor, but when I try GearVR the input module does not work (regardless of gaze/click mode). Thanks in advance!