RFC: Motion/body controller mapping to skeletal mesh for VR

I work in an academic lab where we research the psychology/cogSci of VR. In the coming years we expect to do a lot of work involving body tracking and novel input formats for VR. I'm new to proper UE development, but so far I've had really positive experiences with UE4's tools, and the C++ coding, full source access and Oculus support make it really appealing as a base for experiments.

However, for it to work we need to integrate motion/body tracking controllers into UE4, and I'm looking for feedback on my planned approach. To state the basic problem: we must map the movements of external input devices to drive a skeletal mesh. Over the course of our work we will certainly iterate through different input devices (think Kinect, PrioVR, custom sensors, etc.) and experiment with how those movements are represented and how they interact with the avatar. Hopefully these devices and interactions can share some common implementation details.

The following block diagram is what I’ve planned so far:

The first step will be a C++ hardware layer that captures input updates and does any low-level post-processing; it seems this is exactly what engine plug-ins were made for. One thing I haven't looked into is UE4's threading model. In our homebrew apps we often put the hardware on a dedicated thread that samples as fast as possible, which lets us do any smoothing/prediction/processing across samples if desired. Post-processing may also involve coordinate-space conversions, depending on the axis/rotation conventions of the device versus UE4.
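To make the threading idea concrete, here's a minimal sketch in plain C++ (no engine types; `PoseSampler`, `Smooth` and the poll callback are all hypothetical names of mine, not UE API). A worker thread polls the device as fast as it can, applies a cheap low-pass filter across samples, and publishes the latest pose under a mutex for the game thread to pick up once per tick:

```cpp
#include <atomic>
#include <chrono>
#include <cmath>
#include <functional>
#include <mutex>
#include <thread>
#include <vector>

// Hypothetical raw device sample: one quaternion per tracked joint.
struct Quat { float x = 0, y = 0, z = 0, w = 1; };

// Cheap component-wise low-pass filter between two quaternions
// (a stand-in for a proper slerp-based filter; fine for a sketch).
Quat Smooth(const Quat& prev, const Quat& next, float alpha)
{
    Quat q{ prev.x + alpha * (next.x - prev.x),
            prev.y + alpha * (next.y - prev.y),
            prev.z + alpha * (next.z - prev.z),
            prev.w + alpha * (next.w - prev.w) };
    // Renormalise so downstream code always sees a unit quaternion.
    float len = std::sqrt(q.x * q.x + q.y * q.y + q.z * q.z + q.w * q.w);
    q.x /= len; q.y /= len; q.z /= len; q.w /= len;
    return q;
}

// Dedicated sampling thread: polls the device as fast as possible and
// publishes the latest filtered pose. The game thread calls LatestPose()
// once per tick.
class PoseSampler
{
public:
    explicit PoseSampler(std::function<std::vector<Quat>()> poll,
                         float alpha = 0.2f)
        : Poll(std::move(poll)), Alpha(alpha), Running(true),
          Worker([this] { Run(); }) {}

    ~PoseSampler() { Running = false; Worker.join(); }

    std::vector<Quat> LatestPose()
    {
        std::lock_guard<std::mutex> lock(Lock);
        return Pose;
    }

private:
    void Run()
    {
        while (Running)
        {
            std::vector<Quat> raw = Poll();   // blocking device read
            std::lock_guard<std::mutex> lock(Lock);
            if (Pose.size() != raw.size())
                Pose = raw;                   // first sample seen
            else
                for (size_t i = 0; i < raw.size(); ++i)
                    Pose[i] = Smooth(Pose[i], raw[i], Alpha);
        }
    }

    std::function<std::vector<Quat>()> Poll;
    float Alpha;
    std::atomic<bool> Running;
    std::mutex Lock;
    std::vector<Quat> Pose;
    std::thread Worker;   // declared last so it starts after other members
};
```

Inside the engine the worker would presumably be an FRunnable instead of a raw std::thread, and the coordinate-space conversion would slot into Run() before publishing, but the shape is the same.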

The processed controller input then needs to get to someone who cares (APlayerController), and from what I can tell this is the role of the UPlayerInput class. Digging through it, there are methods for mouse, key, axis and touch input, but nothing quite suited to our needs. My intuition is to derive a subclass (or eventually modify UPlayerInput directly?) to add our new input format. Most devices we've been exposed to dump out a list of quaternions, so a good mapping for this would seem to be a TArray of FRotators. The deeper mechanics of how a hardware plugin actually sends data to UPlayerInput is something I haven't dug into just yet, but I imagine it's straightforward.
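To make the shape concrete, here's roughly what I imagine the subclass looking like. Everything beyond UPlayerInput itself (the class name, ProcessMotionInput, MotionBoneRotations) is hypothetical naming on my part, not existing engine API, and this is a sketch rather than compiling code:

```cpp
// Sketch only: a UPlayerInput subclass that stores the latest device pose.
#pragma once
#include "MotionPlayerInput.generated.h"

UCLASS()
class UMotionPlayerInput : public UPlayerInput
{
    GENERATED_UCLASS_BODY()

public:
    // Called by the hardware plugin each frame with the latest pose,
    // already converted into UE4's coordinate space.
    void ProcessMotionInput(const TArray<FRotator>& BoneRotations)
    {
        MotionBoneRotations = BoneRotations;
    }

    // Latest per-bone rotations, in whatever order the device reports them.
    TArray<FRotator> MotionBoneRotations;
};
```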

(As the diagram suggests, for future work it would be nice to have gesture/posture/motion recognition that maps body movements to commands instead of direct skelmesh movement. A hybrid approach, where some motions are directly mapped to the avatar while others get converted to locomotion/interactions/etc., seems desirable, but I'm happy to get the basic case working first.)

The PlayerController doesn't need substantial change, except that we want to expose the array of FRotators to the blueprint system. Again, a simple subclass with an exposed TArray of FRotators will hopefully be the only change needed (plus maybe gesture-related utilities in the future). I'd like to go through the blueprint system for a number of reasons. First, each hardware controller will likely expose a different number of bones, and some may cover only a subset of an actual skeleton (i.e. only above the waist). Almost all will present fewer bones than the skeleton actually driving the mesh (e.g. fewer spine bones, no fingers). Rather than writing hard mappings in C++ from a specific controller to a specific skeleton, blueprints seem ideal: any hardware can be paired with any skeleton in a flexible way, and the tuning of things like slerp'ing for the spine bones, feathering, filtering and so on can be iterated on more quickly. Second, accessing the skeleton through blueprints exposes the fine tuning and experimentation to artists/designers (or in my case, grad students) without them needing to learn to code or deal with the recompile cycle.
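As an example of the kind of tuning I mean by "slerp'ing for the spine bones": when a device reports one chest rotation but the mesh has several spine bones, the sensor rotation can be feathered along the chain by giving each bone a slerp fraction of the total, so the chained bones together reproduce the full rotation. A plain C++ sketch (Quat, Mul, Slerp and FeatherAcrossBones are my own stand-ins, not engine types):

```cpp
#include <cmath>
#include <vector>

struct Quat { float x, y, z, w; };

// Hamilton product: rotation b followed by rotation a.
Quat Mul(const Quat& a, const Quat& b)
{
    return { a.w * b.x + a.x * b.w + a.y * b.z - a.z * b.y,
             a.w * b.y - a.x * b.z + a.y * b.w + a.z * b.x,
             a.w * b.z + a.x * b.y - a.y * b.x + a.z * b.w,
             a.w * b.w - a.x * b.x - a.y * b.y - a.z * b.z };
}

// Standard spherical linear interpolation between unit quaternions.
Quat Slerp(Quat a, Quat b, float t)
{
    float dot = a.x * b.x + a.y * b.y + a.z * b.z + a.w * b.w;
    if (dot < 0.0f) { b = { -b.x, -b.y, -b.z, -b.w }; dot = -dot; }
    if (dot > 0.9995f)
    {
        // Nearly parallel: fall back to a normalised lerp.
        Quat q{ a.x + t * (b.x - a.x), a.y + t * (b.y - a.y),
                a.z + t * (b.z - a.z), a.w + t * (b.w - a.w) };
        float len = std::sqrt(q.x * q.x + q.y * q.y + q.z * q.z + q.w * q.w);
        return { q.x / len, q.y / len, q.z / len, q.w / len };
    }
    float theta = std::acos(dot);
    float sa = std::sin((1.0f - t) * theta) / std::sin(theta);
    float sb = std::sin(t * theta) / std::sin(theta);
    return { sa * a.x + sb * b.x, sa * a.y + sb * b.y,
             sa * a.z + sb * b.z, sa * a.w + sb * b.w };
}

// Distribute one sensor rotation evenly across N chained spine bones:
// each bone gets Slerp(identity, target, 1/N), so composing all N
// reproduces the full rotation.
std::vector<Quat> FeatherAcrossBones(const Quat& target, int numBones)
{
    const Quat identity{ 0, 0, 0, 1 };
    std::vector<Quat> out;
    for (int i = 0; i < numBones; ++i)
        out.push_back(Slerp(identity, target, 1.0f / numBones));
    return out;
}
```

In practice the weights wouldn't have to be uniform; unequal fractions give the "feathering" (more bend in the lower spine, say), and that's exactly the kind of knob I'd want exposed in a blueprint rather than hard-coded.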

As a small technical note, I've been describing this exposure as a TArray of FRotators, when a TMap of {bone name, FRotator} pairs would be ideal. Unfortunately TMap doesn't directly support replication, which is something we want; telepresence is a very exciting aspect of VR, so replicating motion is vital. Perhaps it's worth just biting the bullet and writing our own serializer. Maybe this isn't hard; I haven't really dug into the mechanics of replication yet.
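The serializer itself doesn't look scary in principle: flatten the map to a count-prefixed byte stream and rebuild it on the other side. Here's a plain C++ sketch of that round trip (std::map/std::string standing in for TMap/FName, and a raw byte buffer standing in for whatever the engine's archive/net machinery actually wants):

```cpp
#include <cstdint>
#include <cstring>
#include <map>
#include <string>
#include <vector>

struct Rotator { float Pitch, Yaw, Roll; };

// Flatten {bone name -> rotator} into a byte buffer:
// [count][len][name bytes][pitch,yaw,roll] ... repeated per bone.
std::vector<uint8_t> Serialize(const std::map<std::string, Rotator>& bones)
{
    std::vector<uint8_t> buf;
    auto put = [&buf](const void* p, size_t n) {
        const uint8_t* b = static_cast<const uint8_t*>(p);
        buf.insert(buf.end(), b, b + n);
    };
    uint32_t count = static_cast<uint32_t>(bones.size());
    put(&count, sizeof count);
    for (const auto& kv : bones)
    {
        uint32_t len = static_cast<uint32_t>(kv.first.size());
        put(&len, sizeof len);
        put(kv.first.data(), len);
        put(&kv.second, sizeof(Rotator));
    }
    return buf;
}

// Rebuild the map from the flattened buffer.
std::map<std::string, Rotator> Deserialize(const std::vector<uint8_t>& buf)
{
    std::map<std::string, Rotator> bones;
    size_t pos = 0;
    auto get = [&buf, &pos](void* p, size_t n) {
        std::memcpy(p, buf.data() + pos, n);
        pos += n;
    };
    uint32_t count = 0;
    get(&count, sizeof count);
    for (uint32_t i = 0; i < count; ++i)
    {
        uint32_t len = 0;
        get(&len, sizeof len);
        std::string name(reinterpret_cast<const char*>(buf.data() + pos), len);
        pos += len;
        Rotator r{};
        get(&r, sizeof r);
        bones[name] = r;
    }
    return bones;
}
```

An alternative that dodges custom serialization entirely might be to replicate two parallel TArrays (bone names and rotators) and zip them back into a map on the receiving side, since TArray does replicate; I haven't verified which approach is cleaner in practice.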

After that there shouldn't be any need for code changes further down the chain (I hope!). It seems we can cast to the new AMotionPlayerController class in an animation event blueprint to pull out the rotators, and then use the existing anim blueprint nodes for setting bones. This may involve a lot of manual mapping; if there is a better strategy for looping/translating the pose of a partial/lower-LOD skeleton onto a higher-detail one, please let me know.
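Worst case, the manual mapping is just a lookup table from controller bone names to mesh bone names, with unmapped mesh bones keeping their animated pose. A plain C++ sketch of that idea (all names here, `ApplyControllerPose` included, are hypothetical, and the bone names are made up for illustration):

```cpp
#include <map>
#include <string>
#include <utility>
#include <vector>

struct Rotator { float Pitch = 0, Yaw = 0, Roll = 0; };

// Overwrite only the mesh bones that the controller actually reports,
// via a {controller bone -> mesh bone} mapping table. Everything else
// keeps whatever pose the animation already gave it.
std::map<std::string, Rotator> ApplyControllerPose(
    const std::map<std::string, Rotator>& controllerPose,
    const std::vector<std::pair<std::string, std::string>>& boneMap,
    std::map<std::string, Rotator> meshPose)  // current mesh pose, by bone
{
    for (const auto& m : boneMap)
    {
        auto it = controllerPose.find(m.first);
        if (it != controllerPose.end())
            meshPose[m.second] = it->second;  // mapped bones only
    }
    return meshPose;
}
```

In blueprint terms this would presumably become a loop over the mapping table feeding the existing bone-transform nodes, with the table itself being the per-device/per-skeleton data that gets swapped out.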

So that's my brain dump so far. Comments, suggestions, etc. are all very welcome, especially on the general strategy. I'm new to both UE and skeletal animation, so I'm sure more experienced community members can offer some advice. Hopefully at least the heuristics, if not the actual implementation, can be something the community at large can reuse and extend.