R34 Airship Project - Week 4 - Testing Hand Tracking

This week I started work on creating a VR story experience that could be shown at the R34 Event.

Over the past few weeks I spent my time looking at AR, and I feel like I now have a good idea of the possibilities available, as well as how I would go about implementing them.

Because we're able to have some space for our project at the event, we decided that we would create a VR experience for people to use. There are lots of interesting stories about the R34 Airship that I personally enjoyed reading, and the others in the team enjoyed them too, so we decided on a VR storytelling experience.

The user will be able to select a story, and the game will then play it out in VR with elements of it happening all around the player. We also want the controls to feel intuitive; we don't want to rely on a standard console controller like some VR games do, so we have decided to use Leap Motion hand tracking alongside the VR headset, so that the user can see their hands in the game and interact with the world around them.

I started by looking up how to set up Leap Motion in Unity, and it turns out that they provide an asset pack that includes everything you need to get Leap Motion working in your Unity games. The asset pack includes a scene with the headset and Leap Motion controls already set up, so that people can focus on making their games rather than on getting the hardware to work.
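
To sanity-check that hand data was actually coming through, I wrote a tiny script that reads the current frame from the provider and logs what it sees. This is only a rough sketch assuming the Leap Motion Unity assets are imported and a LeapServiceProvider component (like the one in their example scene) is present; exact class and property names can differ between SDK versions.

```csharp
using Leap;
using Leap.Unity;
using UnityEngine;

// Rough sanity check: log every hand the Leap Motion can currently see.
// Assumes the Leap Motion Unity assets are imported and a
// LeapServiceProvider exists in the scene (drag it into the field).
public class HandTrackingCheck : MonoBehaviour
{
    [SerializeField] private LeapServiceProvider provider;

    void Update()
    {
        Frame frame = provider.CurrentFrame;

        foreach (Hand hand in frame.Hands)
        {
            string side = hand.IsLeft ? "Left" : "Right";
            // GrabStrength goes from 0 (open hand) to 1 (closed fist).
            Debug.Log(side + " hand tracked, grab strength: "
                      + hand.GrabStrength.ToString("F2"));
        }
    }
}
```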

Once I had the hardware working with Unity, I decided to start looking at the physical limitations of the Leap Motion, so that I would have a better idea of what is and isn't possible. The field of view of the hand tracking is very wide, and the sensor attaches to the front of the headset, so as long as your hands are somewhere in front of you, they're unlikely to drop out of the scene entirely.

One limitation I did notice is that the sensor can only see so much. Because it's a single sensor looking from one direction, there are certain hand positions where it cannot see your fingers at all. The software does a good job of predicting where they are a lot of the time, but sometimes it can look like your hand is glitching around even though it's completely still.
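
One small thing I tried to hide the worst of the jitter was to low-pass filter the palm position before using it to drive anything in the scene. This is my own sketch rather than anything from the asset pack, and it assumes the Leap Unity assets' ToVector3() extension for converting Leap vectors to Unity ones.

```csharp
using Leap;
using Leap.Unity;
using UnityEngine;

// My own sketch (not part of the asset pack): smooth the palm position
// with a simple lerp so small tracking glitches don't make an attached
// object jump around. Assumes ToVector3() from the Leap Unity assets.
public class SmoothedPalmFollower : MonoBehaviour
{
    [SerializeField] private LeapServiceProvider provider;
    [SerializeField] private float smoothing = 10f; // higher = snappier, less smoothing

    void Update()
    {
        Frame frame = provider.CurrentFrame;
        if (frame.Hands.Count == 0)
        {
            return; // no hands tracked, hold the last good position
        }

        Vector3 target = frame.Hands[0].PalmPosition.ToVector3();
        transform.position = Vector3.Lerp(
            transform.position, target, smoothing * Time.deltaTime);
    }
}
```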


The asset pack also comes with a useful tool that lets you visualise exactly what the sensor can see. This is helpful because it means fewer surprises later on when something doesn't track the way you expected. One thing I found interesting is that the mug in that gif actually has a design on it, but because the sensor uses infrared it doesn't see the pattern at all. This means that if a player is wearing gloves, or has something like paint on their hands, there's a chance the sensor will have real difficulty tracking that hand.

I also briefly looked into interaction with objects in the world this week. By default, the hands can push around any rigidbody in Unity, but it's not the best experience: objects don't behave the way you'd expect, sometimes flying off into the distance when touched. To fix this, the asset pack comes with scripts that give much better interaction with any rigidbody that has the script attached, letting you pick up, move, or grab objects with your hands.
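
Just to understand roughly what those scripts must be doing under the hood, here's the naive version I'd write myself: carry an object with the palm while the hand is closed near it, and hand it back to the physics engine when the hand opens. The threshold and pickup radius below are numbers I made up, and the real interaction scripts in the asset pack are far more robust than this.

```csharp
using Leap;
using Leap.Unity;
using UnityEngine;

// Naive grab sketch, just to illustrate the idea; the asset pack's own
// interaction scripts do this much more robustly. The 0.8 threshold and
// 0.1 m pickup radius are values I picked, not anything from the SDK.
[RequireComponent(typeof(Rigidbody))]
public class NaiveGrabbable : MonoBehaviour
{
    [SerializeField] private LeapServiceProvider provider;
    [SerializeField] private float grabThreshold = 0.8f;
    [SerializeField] private float pickupRadius = 0.1f;

    private Rigidbody body;

    void Start()
    {
        body = GetComponent<Rigidbody>();
    }

    void Update()
    {
        bool held = false;

        foreach (Hand hand in provider.CurrentFrame.Hands)
        {
            Vector3 palm = hand.PalmPosition.ToVector3();
            bool closeEnough =
                Vector3.Distance(palm, transform.position) < pickupRadius;

            if (closeEnough && hand.GrabStrength > grabThreshold)
            {
                // While a closed fist is near the object, carry it with the palm.
                held = true;
                transform.position = palm;
            }
        }

        // Switch physics off while held, back on the moment the hand opens.
        body.isKinematic = held;
    }
}
```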

Not much visible progress was made on the project itself this week, but I feel like my understanding of the process and limitations of hand tracking and VR has grown massively.
