Hi everyone! I’m new to XR development and recently started experimenting with VR in Unreal Engine (currently using UE 5.3 with the OpenXR framework). My goal is to create a basic VR experience where players can see and move their hands using hand tracking (no controllers, just real hands).
I’ve enabled hand tracking using the OpenXR plugin and can get some tracking data (like joint positions), but I’m having trouble getting my in-game skeletal mesh hands to align properly with the tracked data. The issues I’m facing:
- The hand mesh seems offset or rotated incorrectly compared to the real-world position.
- There’s noticeable jittering or delay in movement.
- I’m not sure how to properly map the joint data to the bone structure of the hand mesh.
I’ve looked into using the LiveLink system and the MotionControllerComponent, but I’m a bit confused about which workflow is recommended for hand tracking specifically.
Has anyone implemented a working hand-tracking setup using OpenXR? Should I be retargeting animations or driving the bones manually via blueprints/C++?
Would really appreciate any tips, example setups, or even common mistakes to avoid. Thanks in advance for any help!
I’ve only used the hand tracking system with Index controller finger tracking, but the OpenXR extension should expose it the same way as full hand tracking. This was a few years ago with UE 5.0, but the hand tracking system doesn’t seem to have changed much since then. The hand tracking support that ships with the engine is very bare-bones.
The way I got it working was using a MotionControllerComponent for the base motion, a skeletal mesh parented to it, and an animation blueprint for driving the joints. Using the MotionControllerComponent is probably the best way to do it, since it receives properly predicted and late-updated data from the runtime for minimal latency.
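In C++ terms, that hierarchy looks roughly like this. This is a minimal sketch with made-up names (AVRHandsPawn and the component variables are illustrative, not from my actual project); only the left hand is shown, the right mirrors it:

```cpp
// VRHandsPawn.h -- hypothetical pawn showing the component hierarchy.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "MotionControllerComponent.h"
#include "Components/SkeletalMeshComponent.h"
#include "VRHandsPawn.generated.h"

UCLASS()
class AVRHandsPawn : public APawn
{
	GENERATED_BODY()

public:
	AVRHandsPawn()
	{
		RootComponent = CreateDefaultSubobject<USceneComponent>(TEXT("Root"));

		// The controller component receives predicted, late-updated poses
		// from the OpenXR runtime, so it handles the base motion.
		LeftController = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("LeftController"));
		LeftController->SetupAttachment(RootComponent);
		LeftController->MotionSource = TEXT("Left");

		// The hand mesh rides on the controller component; its joints are
		// driven separately by the animation blueprint.
		LeftHandMesh = CreateDefaultSubobject<USkeletalMeshComponent>(TEXT("LeftHandMesh"));
		LeftHandMesh->SetupAttachment(LeftController);
	}

	UPROPERTY(VisibleAnywhere)
	UMotionControllerComponent* LeftController;

	UPROPERTY(VisibleAnywhere)
	USkeletalMeshComponent* LeftHandMesh;
};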
My hand mesh is set up with the wrist bone at the origin. I move the mesh to the correct location on the controller component programmatically by getting the tracked wrist pose, transforming it into the controller component’s space, and setting the mesh position to match.
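A sketch of that alignment step using UHeadMountedDisplayFunctionLibrary::GetMotionControllerData (the engine API calls are real; the AlignHandMesh function and the surrounding pawn are hypothetical, and this assumes the keypoint data comes back in world space, which it does for that function):

```cpp
// Hypothetical helper: offset the hand mesh so the tracked wrist lands in the
// right place relative to the MotionControllerComponent. Sketch only.
#include "HeadMountedDisplayFunctionLibrary.h"
#include "HeadMountedDisplayTypes.h"

void AVRHandsPawn::AlignHandMesh()
{
	FXRMotionControllerData Data;
	UHeadMountedDisplayFunctionLibrary::GetMotionControllerData(
		this, EControllerHand::Left, Data);

	if (!Data.bValid || Data.HandKeyPositions.Num() == 0)
	{
		return;
	}

	// Wrist pose in world space, taken from the tracked hand keypoints.
	const int32 WristIndex = static_cast<int32>(EHandKeypoint::Wrist);
	const FTransform WristWorld(
		Data.HandKeyRotations[WristIndex],
		Data.HandKeyPositions[WristIndex]);

	// Transform the wrist pose into the controller component's space and use
	// it as the mesh's relative transform (the wrist bone is at the origin).
	const FTransform WristRelative =
		WristWorld.GetRelativeTransform(LeftController->GetComponentTransform());
	LeftHandMesh->SetRelativeTransform(WristRelative);
}
```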
I then drive the joints using an animation blueprint. I only use the joint rotations, not the positions. I initially passed the data from the Get Motion Controller Data node to the animation blueprint, but setting up the nodes for this was very cumbersome, so I ended up modifying the engine to add a new animgraph node that pulls the data directly from the XR hand tracking interface instead.
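If you don’t want to modify the engine, the Get Motion Controller Data route can also be done in a native AnimInstance instead of blueprint nodes. A rough sketch (class and property names are made up; how the AnimGraph consumes the array depends on your skeleton):

```cpp
// Hypothetical AnimInstance that caches tracked joint rotations each frame.
// The AnimGraph then applies them to the matching bones. Rotations only;
// the joint positions are ignored.
#pragma once

#include "CoreMinimal.h"
#include "Animation/AnimInstance.h"
#include "HeadMountedDisplayFunctionLibrary.h"
#include "HeadMountedDisplayTypes.h"
#include "VRHandAnimInstance.generated.h"

UCLASS()
class UVRHandAnimInstance : public UAnimInstance
{
	GENERATED_BODY()

public:
	// One rotation per EHandKeypoint, read by the AnimGraph.
	UPROPERTY(BlueprintReadOnly, Category = "Hand Tracking")
	TArray<FRotator> JointRotations;

	virtual void NativeUpdateAnimation(float DeltaSeconds) override
	{
		Super::NativeUpdateAnimation(DeltaSeconds);

		FXRMotionControllerData Data;
		UHeadMountedDisplayFunctionLibrary::GetMotionControllerData(
			GetOwningActor(), EControllerHand::Left, Data);

		if (!Data.bValid)
		{
			return;
		}

		// Note: these rotations come back in world space; converting them
		// into the space each bone node expects is left to the graph.
		JointRotations.SetNum(Data.HandKeyRotations.Num());
		for (int32 i = 0; i < Data.HandKeyRotations.Num(); ++i)
		{
			JointRotations[i] = Data.HandKeyRotations[i].Rotator();
		}
	}
};
```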
I still have the blueprint gore from the original method here:
https://blueprintue.com/blueprint/bgcr5jx4/
https://blueprintue.com/blueprint/zdjdb8fi/
Having the joints ordered the same way as in the OpenXR spec helps simplify any programmatic solution.
https://registry.khronos.org/OpenXR/specs/1.0/html/xrspec.html#convention-of-hand-joints
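The engine mirrors that ordering in the EHandKeypoint enum, so if your skeleton’s bones are named and ordered to match, the joint-to-bone mapping stops needing a hand-written lookup table. For example, you can derive the expected bone names straight from the enum (illustrative sketch; it assumes your bones are literally named after the keypoints):

```cpp
// Sketch: derive bone names from the EHandKeypoint enum so the skeleton and
// the tracked joint array stay in the same (OpenXR) order.
#include "HeadMountedDisplayTypes.h"

TArray<FName> MakeOrderedBoneNames()
{
	TArray<FName> BoneNames;
	const UEnum* Enum = StaticEnum<EHandKeypoint>();

	// NumEnums() - 1 skips the autogenerated _MAX entry.
	for (int32 i = 0; i < Enum->NumEnums() - 1; ++i)
	{
		// "Palm", "Wrist", "ThumbMetacarpal", ... matching the spec order,
		// so tracked joint i maps straight to bone i.
		BoneNames.Add(FName(Enum->GetNameStringByIndex(i)));
	}
	return BoneNames;
}
```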
Engine modifications:
https://github.com/EpicGames/UnrealEngine/pull/9748
https://github.com/EpicGames/UnrealEngine/pull/9747
Ancient tweet of it in operation:
https://x.com/rectus_sa/status/1479817838835679234