R34 Airship Project - Week 7 - Testing Interaction Methods

This week I tested different methods of interacting with the UI and the world around the player: simply looking at things, touching objects or buttons using the hand tracking, and pointing at things to interact.

Touch Interaction

I found that the easiest method to implement was having the player touch a button: all I had to do was attach a collider to the hand models so that when it overlaps a collider on a button, an event is triggered. If I had decided to go with the hand-based UI, this could have been a good method. However, because I am going with a UI anchored in 3D space in the world, it's not certain that the UI would be within reach of the player, especially because this will be a seated game and the player will not be able to walk towards the buttons.
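Roughly, the setup looks something like the Unity C# sketch below. The class name, the "Hand" tag and the UnityEvent wiring are illustrative placeholders rather than the project's actual code, and Unity also needs a Rigidbody on at least one of the two objects for the trigger callback to fire.

using UnityEngine;
using UnityEngine.Events;

public class TouchButton : MonoBehaviour
{
    // Event assigned in the Inspector; fired when a hand touches the button.
    public UnityEvent onPressed;

    private void OnTriggerEnter(Collider other)
    {
        // Only react to the hand colliders, identified here by a "Hand" tag.
        if (other.CompareTag("Hand"))
        {
            onPressed.Invoke();
        }
    }
}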

Look-Based Interaction

The look-based interaction was also easy to code. I simply cast a ray from the centre of the camera every frame; if that ray hits a button, it triggers the event linked to that button. This is good because no matter where you are, you will always be able to interact with the buttons.
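A rough sketch of the idea, reusing the placeholder TouchButton component from above and assuming the buttons sit on their own physics layer:

using UnityEngine;

public class GazeInteractor : MonoBehaviour
{
    public Camera headCamera;       // the VR camera to cast from
    public LayerMask buttonLayer;   // layer containing the button colliders
    public float maxDistance = 20f;

    private void Update()
    {
        // Cast a ray straight out of the centre of the camera every frame.
        Ray gaze = new Ray(headCamera.transform.position, headCamera.transform.forward);

        if (Physics.Raycast(gaze, out RaycastHit hit, maxDistance, buttonLayer))
        {
            // Trigger the button as soon as the ray lands on it (no dwell timer yet).
            TouchButton button = hit.collider.GetComponent<TouchButton>();
            if (button != null)
            {
                button.onPressed.Invoke();
            }
        }
    }
}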

A problem I found with this, however, was that while moving my head around trying to look at different buttons, I would accidentally look at a button I didn't want to trigger. To fix this I added a timer that starts when the player looks at a button; if the player looks away before it finishes, the timer resets and the event is not triggered. Even with this, it became quite annoying when I was simply trying to look around and it kept starting timers for button interactions.
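The same gaze raycast with the dwell timer bolted on might look something like this (the one-second threshold is just an example value, not a tuned one):

using UnityEngine;

public class DwellGazeInteractor : MonoBehaviour
{
    public Camera headCamera;
    public LayerMask buttonLayer;
    public float maxDistance = 20f;
    public float dwellTime = 1f;    // how long the player must hold their gaze

    private TouchButton currentButton;
    private float timer;

    private void Update()
    {
        Ray gaze = new Ray(headCamera.transform.position, headCamera.transform.forward);
        TouchButton hitButton = null;

        if (Physics.Raycast(gaze, out RaycastHit hit, maxDistance, buttonLayer))
        {
            hitButton = hit.collider.GetComponent<TouchButton>();
        }

        if (hitButton != null && hitButton == currentButton)
        {
            // Still looking at the same button: keep counting up.
            timer += Time.deltaTime;
            if (timer >= dwellTime)
            {
                currentButton.onPressed.Invoke();
                timer = 0f; // stop the event from firing again every frame
            }
        }
        else
        {
            // Looked away or moved to a different button: restart the countdown.
            currentButton = hitButton;
            timer = 0f;
        }
    }
}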


Pointing Interaction

The third interaction option was the best of both worlds: the user must point at the button they want to trigger. This is good because the player can trigger a button no matter how far away it is. It also fixes the problem with the look-based interaction where buttons get triggered unintentionally, because the player has to deliberately point their finger at the button they want to interact with. This method also reuses the timer mechanic from the looking method, to make sure that interactions are intentional.

The pointing interaction was harder to make than the others. I cast a ray from the centre of the camera towards the tip of the hand model, and only cast it while the player has their index finger extended. This not only stops unintentional button interactions but also improves performance slightly, because the computer is not constantly raycasting when it isn't needed.
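A sketch of how that could fit together, again reusing the placeholder TouchButton and dwell-timer logic from above. The finger-extension check is stubbed out because the exact call depends on the hand-tracking SDK being used:

using UnityEngine;

public class PointInteractor : MonoBehaviour
{
    public Camera headCamera;
    public Transform indexFingerTip;   // fingertip joint from the tracked hand model
    public LayerMask buttonLayer;
    public float maxDistance = 20f;
    public float dwellTime = 1f;

    private TouchButton currentButton;
    private float timer;

    private void Update()
    {
        // Skip the raycast entirely while the index finger is not extended.
        if (!IsIndexFingerExtended())
        {
            currentButton = null;
            timer = 0f;
            return;
        }

        // Cast from the head through the fingertip, so the player aims by pointing.
        Vector3 direction = (indexFingerTip.position - headCamera.transform.position).normalized;
        Ray pointRay = new Ray(headCamera.transform.position, direction);

        TouchButton hitButton = null;
        if (Physics.Raycast(pointRay, out RaycastHit hit, maxDistance, buttonLayer))
        {
            hitButton = hit.collider.GetComponent<TouchButton>();
        }

        // Same dwell-timer logic as the gaze version, so pointing must be held briefly.
        if (hitButton != null && hitButton == currentButton)
        {
            timer += Time.deltaTime;
            if (timer >= dwellTime)
            {
                currentButton.onPressed.Invoke();
                timer = 0f;
            }
        }
        else
        {
            currentButton = hitButton;
            timer = 0f;
        }
    }

    private bool IsIndexFingerExtended()
    {
        // Placeholder: in practice this would query the hand-tracking SDK,
        // e.g. by comparing finger joint rotations or using a built-in gesture check.
        return true;
    }
}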

What will I be using?

For this project, I have decided to use the more advanced pointing interaction as it is the most user-friendly, as well as the most accurate.
