What it took to achieve hand presence in UVRF

The idea of using motion controllers for VR has been around for some time; even in DK1 days, people were experimenting with Razer Hydra controllers. But ever since the release of Oculus Touch, we expect to see a virtual representation of our hands, and we expect it to behave just like our real hands. We at iNFINITE Production decided to take a look at the problem and went on to create a multi-platform toolkit for hand presence in UE4, called UVRF (Universal VR Framework). Here are some things we learned along the road.

But why?

Before we go into what it took to build the above-mentioned kit, let's answer another question: why do we care about it in the first place? When it comes to VR, we're seeing a lot of interest in development from people who often have little to no prior experience in the field. And while Unreal (and Unity as well) make development more accessible than ever before, it's still not for everyone. Enthusiasts can be stumped and discouraged by the technical hardships of development and struggle to bring their innovative ideas to life. We believe that by making VR development more accessible, we can help accelerate the growth of VR. Our release is simply an attempt to make newcomers' lives easier.

Pay attention to your hand

Most of the actions we do with our hands are automated, so we don't even think about them. We grab an object, flip a switch or drive a car without paying much attention to the movement itself. But in order to understand how our hands interact with the world, we need to start paying attention. To begin with, I simply observed how I, and other people, perform ordinary tasks. To make sure the hand poses were natural, I adopted the habit of throwing fruit (mostly citrus, because we all know vitamin C is good for you) at my colleagues in the office and observing how they caught it. That allowed us to create an interaction model covering most cases. Of course, not all of them.

Big and small objects

The thumb and index finger are our dominant fingers, the ones we use the most. To hold bigger objects we may use all five fingers, or use the thumb and/or index finger for interaction. Think of a gun: index finger on the trigger, thumb to cock or release the hammer. With cellphones we mostly use our thumbs, the same goes for a lighter, and I could go on. So we decided to create three basic actions: grab, thumb and index finger.

The bigger problem, ironically enough, is small objects: we usually hold them with just the thumb and index finger. But because Oculus Touch, HTC Vive and Windows Mixed Reality controllers all have the grip button under the middle finger, this creates a conflict. The brain of an experienced VR user has already created a shortcut that equates the middle-finger button with a gripping motion (regardless of the fingers actually used), so when presented with an object like a fidget spinner, they will automatically press the middle-finger button. New VR users, on the other hand, will very often squeeze the index finger and thumb against the controller. After some experiments with an alternate thumb-index interaction for small objects, we decided to bind the interaction to the middle finger and follow the model used for bigger objects. We wonder how this will be solved on the upcoming Knuckles controllers, where the gripping motion is not bound to a button while index finger and thumb tracking is limited. We'll take a look once/if we get our hands on those.
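For the C++-inclined, here is a minimal sketch of how those three input channels might be wired up in UE4. All class, action and axis names here are hypothetical illustrations, not the toolkit's actual API: the grip (middle-finger) button drives the grab, while the trigger and thumb stay free for per-object actions.

```cpp
// HandPawn.h: hypothetical pawn, used only to illustrate the binding scheme.
#pragma once
#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "HandPawn.generated.h"

UCLASS()
class AHandPawn : public APawn
{
    GENERATED_BODY()
public:
    virtual void SetupPlayerInputComponent(UInputComponent* PlayerInputComponent) override;
    void StartGrabRight();                 // close the hand / attach the object
    void StopGrabRight();                  // open the hand / release the object
    void SetIndexCurlRight(float Value);   // drives the index-finger pose and trigger actions
    void SetThumbCurlRight(float Value);   // drives the thumb pose and thumb actions
};

// HandPawn.cpp
#include "Components/InputComponent.h"

void AHandPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);

    // Middle-finger grip button = grab, for big and small objects alike,
    // matching the shortcut experienced VR users have already learned.
    PlayerInputComponent->BindAction("Grab_R", IE_Pressed,  this, &AHandPawn::StartGrabRight);
    PlayerInputComponent->BindAction("Grab_R", IE_Released, this, &AHandPawn::StopGrabRight);

    // Trigger and thumb are separate channels, so a held object (a gun, say)
    // can still expose index-finger and thumb interactions while grabbed.
    PlayerInputComponent->BindAxis("IndexCurl_R", this, &AHandPawn::SetIndexCurlRight);
    PlayerInputComponent->BindAxis("ThumbCurl_R", this, &AHandPawn::SetThumbCurlRight);
}
```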

Single vs. multiple ways to grab an object

Another problem we had to deal with along the way was the multitude of ways in which you can grab a single object. While some objects can be picked up and held in many ways, even those mostly have one way that is predominant. Think of a PC mouse, fidget spinner, VR motion controller, fork, tea cup or kettle: the grip is based on intended use. Many objects can also slide along one axis (bottles, for instance, can be held by the thick or the thin end) or rotate freely (all round objects, etc.), but supporting all of that would add considerable complexity to the toolkit. And since it's intended for newcomers, we decided to go with just one way to grab each object.
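To make that concrete, here is what a one-pose-per-object setup could look like as a component. This is a hypothetical sketch, not UVRF's actual implementation: each grabbable object carries exactly one authored hand pose and one attach transform.

```cpp
// GrabbableComponent.h: hypothetical component illustrating "one grab per object".
#pragma once
#include "CoreMinimal.h"
#include "Animation/AnimSequence.h"
#include "Components/ActorComponent.h"
#include "GrabbableComponent.generated.h"

UCLASS(ClassGroup=(VR), meta=(BlueprintSpawnableComponent))
class UGrabbableComponent : public UActorComponent
{
    GENERATED_BODY()
public:
    // The single authored hand pose played while this object is held.
    UPROPERTY(EditAnywhere, Category="Grab")
    UAnimSequence* HandPose = nullptr;

    // The single transform the hand snaps to, relative to the object.
    // Supporting sliding or rotating grips would turn this into a list of
    // poses plus solver logic, which is exactly the complexity we avoided.
    UPROPERTY(EditAnywhere, Category="Grab")
    FTransform GripOffset;
};
```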

Multi-object interaction

While the toolkit works fine across platforms and is easy to use, the controllers themselves still exist in the real world, so we need to take that into consideration when two held objects are supposed to interact: we have to make sure the controllers don't smash into each other. This is an issue on every platform. Often it can be solved simply by choosing a grab pose that keeps the controllers apart, but that is not always possible. The safety pin on a hand grenade is a nice example: the overlap while both objects are grabbed is too large to allow for a realistic solution.

Proper hand position and rotation

Another thing critical to hand presence is having the hand properly positioned and rotated in space. The correct hand position is of course different for each platform, with each controller having a different tracking point and a different way in which it's held. There is a simple solution, however: the 3D models of the controllers are usually provided with the same pivot point as the tracking origin, so it's easy to place the model on exactly the same spot in VR as in real life. With that, all that was left was to adjust the hand offset per platform to align the hand position in VR with real life.
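A rough sketch of that alignment step in UE4 C++ follows; the function, the device-name checks and especially the offset numbers are placeholders (the real values have to be measured per controller):

```cpp
// Hypothetical helper illustrating the alignment trick described above.
#include "HeadMountedDisplayFunctionLibrary.h"
#include "MotionControllerComponent.h"
#include "Components/SkeletalMeshComponent.h"
#include "Components/StaticMeshComponent.h"

void AlignHandToController(UMotionControllerComponent* MC,
                           UStaticMeshComponent* ControllerMesh,
                           USkeletalMeshComponent* HandMesh)
{
    // Vendor controller models share their pivot with the tracking origin,
    // so the mesh sits at the component's origin with no offset at all.
    ControllerMesh->AttachToComponent(MC, FAttachmentTransformRules::KeepRelativeTransform);
    ControllerMesh->SetRelativeTransform(FTransform::Identity);

    // Only the hand needs a per-platform offset. These numbers are made-up
    // placeholders; device names match UE4 4.18+ ("OculusHMD", "SteamVR").
    const FName HMD = UHeadMountedDisplayFunctionLibrary::GetHMDDeviceName();
    FTransform HandOffset = FTransform::Identity;
    if (HMD == FName(TEXT("OculusHMD")))
    {
        HandOffset = FTransform(FRotator(-40.f, 0.f, 0.f), FVector(-4.f, 0.f, -2.f));
    }
    else if (HMD == FName(TEXT("SteamVR")))
    {
        HandOffset = FTransform(FRotator(-60.f, 0.f, 0.f), FVector(-8.f, 0.f, -3.f));
    }

    HandMesh->AttachToComponent(MC, FAttachmentTransformRules::KeepRelativeTransform);
    HandMesh->SetRelativeTransform(HandOffset);
}
```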

And while I would argue that this should be standard practice, many apparently don't do it. Of the recent releases, Microsoft's "Halo" VR experience comes to mind, where the vertical position of the virtual hands is off by around 20 cm. We also had a feature that showed the controller in your hand, but we've removed it for now due to a licensing conflict. We've reached out to Valve and Oculus for permission, so hopefully we'll be able to add it back at a later date.

Testing

Testing of such an application is best done with people who've never experienced VR. Let them test the experience without telling them what each button does ("do things as you would in real life") and see whether they figure it out or struggle. Our testing found that most had no problem orienting themselves with the controls.

What’s next?

As I mentioned at the beginning, our main focus is to help VR newcomers, from small teams to one-man armies, bring their projects to life. I'm surprised that companies like Oculus, Valve or HTC haven't built something like this (and to my knowledge there is nothing similar publicly available for Unreal Engine 4). So we at iNFINITE decided to fill the void, build the template and share it in a way that permits creativity: under the CC0 license, so people don't have to worry about licensing and can focus on their projects instead.

We hope that fellow developers will find it useful (and considering the initial reaction, it seems they do) and follow suit when it comes to updates and new features. Since it's free, we're not making any money on it, but if it helps VR grow a little bit, it was worth our time.