I’ve only used the hand tracking system with Index controller finger tracking, but the OpenXR extension should expose it the same way as full hand tracking. This was a few years ago on UE 5.0, but the hand tracking system doesn’t seem to have changed much since then. The hand tracking support that ships with the engine is very bare-bones.
The way I got it working was using a MotionControllerComponent for the base motion, a skeletal mesh parented to it, and an animation blueprint for driving the joints. Using the MotionControllerComponent is probably the best way to do it, since it receives properly predicted and late-updated data from the runtime for minimal latency.
My hand mesh is set up with the wrist bone at the origin. I move the mesh to the correct location on the controller component programmatically by getting the pose of the wrist bone, transforming it to the controller component space, and setting the mesh position.
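The math for that alignment step is just a world-to-local transform. In actual UE code it collapses to `InverseTransformPosition` on the controller component's transform followed by `SetRelativeLocation` on the mesh; here is a minimal self-contained sketch of the same computation with stand-in types (the structs and function names are illustrative, not engine API):

```cpp
#include <cmath>

// Minimal 3D types standing in for UE's FVector/FQuat (illustrative only).
struct Vec3 { double x, y, z; };

struct Quat {
    double w, x, y, z;

    static Quat fromAxisAngleZ(double angle) {
        return { std::cos(angle / 2), 0.0, 0.0, std::sin(angle / 2) };
    }
    Quat conjugate() const { return { w, -x, -y, -z }; }

    // Rotate a vector by this unit quaternion: v' = q * v * q^-1.
    Vec3 rotate(const Vec3& v) const {
        // t = 2 * (q.xyz x v)
        Vec3 t = { 2 * (y * v.z - z * v.y),
                   2 * (z * v.x - x * v.z),
                   2 * (x * v.y - y * v.x) };
        // v' = v + w*t + q.xyz x t
        return { v.x + w * t.x + (y * t.z - z * t.y),
                 v.y + w * t.y + (z * t.x - x * t.z),
                 v.z + w * t.z + (x * t.y - y * t.x) };
    }
};

// World-space wrist position -> position relative to the controller
// component, i.e. the offset to set as the mesh's relative location.
Vec3 worldToControllerSpace(const Vec3& ctrlPos, const Quat& ctrlRot,
                            const Vec3& wristPos) {
    Vec3 d = { wristPos.x - ctrlPos.x,
               wristPos.y - ctrlPos.y,
               wristPos.z - ctrlPos.z };
    return ctrlRot.conjugate().rotate(d);
}
```

Since the mesh has the wrist bone at its origin, setting the mesh's relative location to this result puts the wrist bone exactly where the runtime reports it.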
I then drive the joints using an animation blueprint. I only use the joint rotations, not the position. I initially passed the data from the Get Motion Controller Data node to the animation blueprint, but setting up nodes for this was very cumbersome, so I ended up modifying the engine to add a new animgraph node that pulls the data directly from the XR hand tracking interface instead.
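One detail worth spelling out: OpenXR reports each joint's orientation in tracking space, while a bone in the animation blueprint is usually posed relative to its parent bone. Converting is one quaternion operation, sketched here with a stand-in type (illustrative, not engine code):

```cpp
#include <cmath>

// Minimal quaternion standing in for UE's FQuat (illustrative only).
struct Quat {
    double w, x, y, z;
    Quat conjugate() const { return { w, -x, -y, -z }; }
    Quat operator*(const Quat& r) const {  // Hamilton product
        return { w * r.w - x * r.x - y * r.y - z * r.z,
                 w * r.x + x * r.w + y * r.z - z * r.y,
                 w * r.y - x * r.z + y * r.w + z * r.x,
                 w * r.z + x * r.y - y * r.x + z * r.w };
    }
};

// Tracking-space orientations of a joint and its parent -> the joint's
// rotation relative to the parent (unit quaternions assumed).
Quat parentRelativeRotation(const Quat& parentWorld, const Quat& childWorld) {
    return parentWorld.conjugate() * childWorld;
}
```

Feeding these parent-relative rotations to per-bone transform nodes (rotation only, positions left alone) is what keeps the mesh's own bone lengths intact.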
I still have the blueprint gore from the original method here:
https://blueprintue.com/blueprint/bgcr5jx4/
https://blueprintue.com/blueprint/zdjdb8fi/
Having the joints ordered the same way as in the OpenXR spec helps simplify any programmatic solution.
https://registry.khronos.org/OpenXR/specs/1.0/html/xrspec.html#convention-of-hand-joints
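To illustrate, the spec defines 26 joints (XrHandJointEXT values 0 through 25), and if your skeleton's bones are laid out in the same order, mapping tracked joints onto bones becomes a single loop instead of per-bone wiring. The ordering below is from the spec; the lowercase bone names are just my illustration, so match them to your own skeleton's naming:

```cpp
#include <array>
#include <string_view>

// The 26 hand joints in XR_EXT_hand_tracking order (XrHandJointEXT 0..25).
constexpr std::array<std::string_view, 26> kOpenXRJointOrder = {
    "palm", "wrist",
    "thumb_metacarpal", "thumb_proximal", "thumb_distal", "thumb_tip",
    "index_metacarpal", "index_proximal", "index_intermediate",
    "index_distal", "index_tip",
    "middle_metacarpal", "middle_proximal", "middle_intermediate",
    "middle_distal", "middle_tip",
    "ring_metacarpal", "ring_proximal", "ring_intermediate",
    "ring_distal", "ring_tip",
    "little_metacarpal", "little_proximal", "little_intermediate",
    "little_distal", "little_tip",
};
```

Note the spec says "little", not "pinky"; renaming skeleton bones to match saves a remapping table.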
Engine modifications:
https://github.com/EpicGames/UnrealEngine/pull/9748
https://github.com/EpicGames/UnrealEngine/pull/9747
Ancient tweet of it in operation:
https://x.com/rectus_sa/status/1479817838835679234