What I am trying to do is author hand poses at editor time using live hand tracking in PIE or VR Preview, instead of manually rotating each bone of the hand Skeletal Mesh for every object that can be picked up.
Dilmer Valecillos does exactly this with Unity and Meta’s SDK in the video here.
I’m still not clear on whether hand tracking can be enabled over Link so that it works at editor time. Apparently this is possible with the Meta XR plugin starting with version 59 / UE 5.3, but unfortunately my project is on 5.1.
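To make that part of the question concrete: the check I would run in PIE over Link is something like the sketch below, which just polls the stock engine hand-tracking query and looks for keypoint data. GetMotionControllerData and FXRMotionControllerData are the engine's own API (HeadMountedDisplay module); the function wrapping them is my own placeholder.

// Rough PIE-time probe: is the runtime actually delivering hand joint data over Link?
// Requires the "HeadMountedDisplay" module in Build.cs and an OpenXR hand tracking
// plugin (OpenXRHandTracking or Meta XR) enabled.
#include "CoreMinimal.h"
#include "HeadMountedDisplayFunctionLibrary.h"
#include "HeadMountedDisplayTypes.h"
#include "InputCoreTypes.h"

static bool IsHandTrackingDataAvailable(UObject* WorldContext, EControllerHand Hand)
{
    FXRMotionControllerData Data;
    UHeadMountedDisplayFunctionLibrary::GetMotionControllerData(WorldContext, Hand, Data);

    // With controllers only (no hand tracking), the keypoint arrays stay empty.
    const bool bHasJoints = Data.bValid && Data.HandKeyRotations.Num() > 0;
    UE_LOG(LogTemp, Log, TEXT("%s hand: valid=%d keypoints=%d"),
        Hand == EControllerHand::Left ? TEXT("Left") : TEXT("Right"),
        Data.bValid ? 1 : 0, Data.HandKeyRotations.Num());
    return bHasJoints;
}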
There is a node
UHandRecognitionFunctionLibrary::RecordHandPose
in the HandsTools plugin, but my understanding is that it only records a sequence of poses, each an encoded string of bone rotations. Even if I used this node, the output goes to the Quest's Unreal log file, so getting those poses back into the Unreal editor is cumbersome. After that I would still have to create an animation sequence for the pose, with the hand Skeletal Mesh bone rotations set to the values decoded from the string. Editor-time hand tracking would streamline all of this. Any ideas?
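To illustrate the capture step I have in mind (and to avoid fishing poses out of the Quest log), here is a rough sketch that samples the hand keypoint rotations via the engine's GetMotionControllerData while the pose is held and writes them to a CSV under the project's Saved folder. The component and function names are my own placeholders, not part of HandsTools; also note these are tracking-space joint rotations from OpenXR, not the Skeletal Mesh's local bone rotations, so they would still need to be remapped onto the hand skeleton before keying an animation sequence.

// HandPoseCaptureComponent.h -- placeholder capture component, not a plugin API.
// Dumps the current hand keypoint rotations to Saved/<PoseName>.csv.
#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "HeadMountedDisplayFunctionLibrary.h"
#include "HeadMountedDisplayTypes.h"
#include "InputCoreTypes.h"
#include "Misc/FileHelper.h"
#include "Misc/Paths.h"
#include "HandPoseCaptureComponent.generated.h"

UCLASS(ClassGroup = (Custom), meta = (BlueprintSpawnableComponent))
class UHandPoseCaptureComponent : public UActorComponent
{
    GENERATED_BODY()

public:
    // Bind this to an input action or console command while holding the pose in PIE.
    UFUNCTION(BlueprintCallable, Category = "Hand Pose Capture")
    void CaptureHandPose(EControllerHand Hand, const FString& PoseName)
    {
        FXRMotionControllerData Data;
        UHeadMountedDisplayFunctionLibrary::GetMotionControllerData(this, Hand, Data);

        if (!Data.bValid || Data.HandKeyRotations.Num() == 0)
        {
            UE_LOG(LogTemp, Warning, TEXT("No hand tracking data for pose %s"), *PoseName);
            return;
        }

        // One CSV row per keypoint: index, pitch, yaw, roll (tracking space, not SKM bone space).
        FString Out = FString::Printf(TEXT("Pose,%s\n"), *PoseName);
        for (int32 i = 0; i < Data.HandKeyRotations.Num(); ++i)
        {
            const FRotator R = Data.HandKeyRotations[i].Rotator();
            Out += FString::Printf(TEXT("%d,%.3f,%.3f,%.3f\n"), i, R.Pitch, R.Yaw, R.Roll);
        }

        const FString Path = FPaths::ProjectSavedDir() / (PoseName + TEXT(".csv"));
        FFileHelper::SaveStringToFile(Out, *Path);
        UE_LOG(LogTemp, Log, TEXT("Saved %d keypoints to %s"), Data.HandKeyRotations.Num(), *Path);
    }
};

With something like this, the file lands on the PC when running over Link in PIE (or under the app's Saved dir on device), and the values can then be keyed onto the hand skeleton in an animation sequence, manually or with an editor utility.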