Any tips to get started developing for Vive before hardware arrives?

I’ve got a project I want to start developing that will take advantage of the Vive’s motion controllers, but I won’t have my Vive for a month or so (I never got hold of a developer Pre edition). I’d like to start building the framework now so that once the Vive arrives, I’m pretty much ready to sync up with the motion controller positions.

The project mostly involves interacting with physical objects (grabbing, rotating, translating them, etc.), and I’d like to support multiple input devices so people could interact either via Leap Motion or by using a mouse to move a grabber around in space (think Surgeon Simulator).

However, without a Vive, I don’t really know how to lay out my input and interaction framework to work with it. If it’s just as simple as getting the controller’s position and rotation, then that’s not much of a concern, but I figured I’d ask before I waste a lot of time developing down a path that ends up not being viable.

In addition to this, are there any other tips that current Vive developers can share with those of us who haven’t gotten ours yet? Any hiccups you ran into, things you wish you knew before you started, etc.?

There are a lot of new wrinkles that the Vive introduces. Actually, maybe it’s not Vive-specific, but more general to developing room-scale VR experiences. We’re doing one of the big challenges in VR that most devs don’t attempt: full-body avatars. That adds challenges most other devs probably don’t face, but it also gives us more design opportunities. Anyway, I’ll hit the wave tops here instead of going deep into it.

  1. Players come in different heights. The Vive’s Z-axis value is the HMD’s height above the floor, so different players will have different height offsets. You either have to keep this in mind in your level design (the lazy, bad approach) or calibrate the HMD to account for player height and map it to the eye height of your in-game avatar.

  2. The motion controllers have the same axis orientation for both hands: X is forward, Z is up, Y is right. If you’re creating skeletal meshes and using motion controllers to drive bone rotations, you want to replace the bone rotation with the motion controller orientation, and the bone rotations for both hands should match the orientation of the motion controllers. If you’re using a full-body avatar, you’ll want two-bone IK to place the hand at the motion controller position, but you should also consider what happens when the player moves their hand through a virtual obstacle. In reality, the object pushes back and stops your hand. VR objects can’t do that to our physical hands, but they can block the avatar’s hands.

  3. Players can walk around a room and clip through blocking obstacles. Most VR demos don’t account for this. Is this a potential problem for your game?

  4. If you wire your Leap Motion events into game responses, integrating motion controller support is really easy: just drag an execution wire into the same event the Leap Motion uses. You don’t actually need the Vive to add motion controller support in UE4, though you won’t be able to test it.

  5. Locomotion is going to be a design challenge. I don’t know what your approach will be, but I’m working on this in my game at the moment. Most people use teleportation. I’m inventing something new, but I don’t know if it’ll work yet.
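To make point 1 concrete, here’s a minimal, engine-agnostic sketch of the calibration idea: sample the player’s standing eye height from the HMD, then either offset the camera rig or scale the world to match the avatar. All names here (`EyeCalibration`, etc.) are illustrative, not a Vive or UE4 API.

```cpp
#include <cmath>

// Hypothetical calibration step: sample the HMD's height above the
// tracked floor (which the Vive reports) while the player stands
// upright, then reconcile it with the avatar's fixed eye height.
struct EyeCalibration {
    float PlayerEyeHeight;  // metres, sampled from the HMD
    float AvatarEyeHeight;  // metres, fixed by the avatar rig

    // Vertical offset to apply to the camera rig so the in-game
    // eye height matches the avatar regardless of player height.
    float HeightOffset() const {
        return AvatarEyeHeight - PlayerEyeHeight;
    }

    // Alternative approach: uniformly scale the tracking space so
    // the player's proportions map onto the avatar's.
    float WorldScale() const {
        return AvatarEyeHeight / PlayerEyeHeight;
    }
};
```

Offsetting keeps the player’s real-world motion 1:1 but can make tall players feel short; scaling preserves proportions but changes perceived movement speed, so which one fits depends on the game.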
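The two-bone IK mentioned in point 2 reduces to the law of cosines: given the shoulder-to-controller distance and the two bone lengths, there is one elbow angle that puts the hand on the controller. This is a self-contained sketch of just that core step (a real solver also orients the shoulder and picks a pole vector for the elbow direction); the struct and names are illustrative, not an engine API.

```cpp
#include <algorithm>
#include <cmath>

// Analytic two-bone IK core: solve the interior elbow angle that
// places the hand at a target `Dist` metres from the shoulder.
struct TwoBoneIK {
    float UpperLen;  // shoulder -> elbow
    float LowerLen;  // elbow -> hand

    // Returns the elbow angle in radians. Out-of-reach targets are
    // clamped: fully extended (pi) when too far, folded (0) when
    // closer than the bones allow.
    float ElbowAngle(float Dist) const {
        const float MaxReach = UpperLen + LowerLen;
        const float MinReach = std::fabs(UpperLen - LowerLen);
        Dist = std::clamp(Dist, MinReach, MaxReach);
        // Law of cosines: d^2 = a^2 + b^2 - 2ab*cos(elbow)
        float CosElbow =
            (UpperLen * UpperLen + LowerLen * LowerLen - Dist * Dist)
            / (2.0f * UpperLen * LowerLen);
        return std::acos(std::clamp(CosElbow, -1.0f, 1.0f));
    }
};
```

The clamping also gives you a cheap hook for the obstacle problem above: instead of clamping to arm reach, clamp the target to the blocking surface and the avatar hand stops there even though the real hand keeps going.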
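On point 4 and the multi-device question: the Blueprint trick of routing every device into one execution wire has a simple C++ analogue — reduce every input device to one “grabber pose plus grab state” and write interactions against only that. This is a hypothetical sketch (all class and member names are made up, not UE4 or Leap Motion API); a mouse backend lets you build and test before the Vive arrives.

```cpp
// Device-agnostic interaction layer: Vive wand, Leap Motion hand,
// and mouse-driven grabber all reduce to this one struct.
struct GrabberState {
    float Pos[3];     // world-space position of the grab point
    float Rot[4];     // orientation quaternion (x, y, z, w)
    bool  bGrabbing;  // trigger pulled / pinch detected / mouse held
};

// Each device backend implements this; the game polls it per frame.
// Adding motion-controller support later is just one more subclass.
class IGrabberSource {
public:
    virtual ~IGrabberSource() = default;
    virtual GrabberState Poll() = 0;
};

// Stand-in backend usable with no VR hardware: drives the grabber
// from the mouse cursor projected onto a fixed-depth plane in front
// of the camera (the Surgeon Simulator style mentioned above).
class MouseGrabberSource : public IGrabberSource {
public:
    MouseGrabberSource(float CursorX, float CursorY, bool bButtonDown)
        : X(CursorX), Y(CursorY), bDown(bButtonDown) {}

    GrabberState Poll() override {
        // Identity rotation; a mouse has no meaningful orientation.
        return GrabberState{{X, Y, 1.0f}, {0.0f, 0.0f, 0.0f, 1.0f}, bDown};
    }

private:
    float X, Y;
    bool bDown;
};
```

With this shape, the grab/rotate/translate logic never knows which device is active, which is exactly the property that makes the later Vive integration “drag a wire into the same event.”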

Thankfully, most of my project’s design will revolve around standing in an open field, staying stationary, or just rotating on the spot. I may be cheating my way out of the more complex intricacies of room-scale design :), but it does mean level design, locomotion, and clipping are non-issues for me.

I never actually thought about the headset being at different heights for different users. That’s an interesting point, though, at least when it comes to game balancing and collision systems.

I just got my Leap Motion today, so I’ll start fussing with that — it sounds good if it’s a similar execution path!

Awesome stuff, thanks!