Hey there! I’ve been diving into Hand Tracking for almost 2 years now. Recently, I finally got the time and the right projects to implement these systems.
I used the tutorials from Just2Devs to learn how to implement it, and my hand tracking setup is basically the same as the video you referenced: Just2Devs Hand Tracking Grabbing Objects.
For the grabbing part, instead of their setup, I created a hybrid grabbing system that leverages the original VR Grab System/Component from the VR Template.
This setup allows me to seamlessly switch between hands and controllers (though not simultaneously, although I’ve heard that’s now possible, with hand tracking and controllers acting as separate tracked objects in the scene, which is pretty cool), and to apply the same grabbing system to both scenarios.
To implement this setup, I changed the input of the “Get Grab Component Near Motion Controller” function from the “Motion Controller” (GREEN) to a Sphere Collision component (RED).
-
This means I changed where grab detection happens: instead of being driven by the motion controller, it is done by custom sphere collisions I created for the hands (for both hand tracking and controllers).
-
I did this because with hand tracking the motion controller sits on the wrist, while with controllers it sits on the palm of the hand mesh.
-
If you are only using hand tracking, you can simply create a sphere collision on each palm, and that’s it.
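To make the idea concrete, here is a minimal standalone C++ sketch (not engine code) of what a sphere-based grab query boils down to: find the nearest grab component whose location falls inside the palm sphere. The function name and `Vec3` struct are mine for illustration; the actual VR Template does this through the Blueprint “Get Grab Component Near Motion Controller” function.

```cpp
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

float Dist(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Returns the index of the nearest grab component inside the palm sphere,
// or -1 if none is within sphereRadius. This mirrors the query the VR
// Template does against the motion controller, just centered on the
// palm sphere instead.
int FindNearestGrabComponent(const Vec3& sphereCenter, float sphereRadius,
                             const std::vector<Vec3>& grabComponentLocations) {
    int best = -1;
    float bestDist = sphereRadius;
    for (int i = 0; i < (int)grabComponentLocations.size(); ++i) {
        float d = Dist(sphereCenter, grabComponentLocations[i]);
        if (d <= bestDist) { bestDist = d; best = i; }
    }
    return best;
}
```

The point is that the query only cares about a center and a radius, so swapping the motion controller for a palm sphere doesn’t change the grab logic at all, only where it is measured from.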
Similar to how Just2Devs does it, in the Pawn’s Begin Play: if hand tracking is detected, spawn the tracked hands and hide the controller hands; if controllers are detected, despawn the tracked hands and turn visibility back on for the controller hands.
- Either way, I set the “SphereGrab - Left (or Right)” object to be used in place of the previous “Motion Controller” from the earlier image.
That makes it possible to feed the correct position into “Get Grab Component Near Motion Controller”, depending on whether I’m using hand tracking or controllers.
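As a rough sketch of that switch, here is the selection logic in plain C++ (again, illustrative only; the type and function names are hypothetical, and in the project this is a Blueprint branch in the Pawn):

```cpp
struct Vec3 { float x, y, z; };

enum class EActiveInput { HandTracking, Controllers };

// Locations of the "SphereGrab - Left (or Right)" spheres for each mode:
// one on the tracked-hand palm, one on the controller hand mesh.
struct GrabSources {
    Vec3 sphereGrabHandTracking;
    Vec3 sphereGrabController;
};

// Picks which sphere location feeds "Get Grab Component Near Motion
// Controller", based on the input mode detected at Begin Play.
Vec3 SelectGrabOrigin(const GrabSources& sources, EActiveInput mode) {
    return mode == EActiveInput::HandTracking
        ? sources.sphereGrabHandTracking
        : sources.sphereGrabController;
}
```

Everything downstream of this point (the grab query, attach, release) stays identical for both input modes, which is what makes the hybrid setup cheap to maintain.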
This is my Index + Thumb (basically a pinch) setup to detect the distance between the fingertips, similar to the video, but with added logic to hook it into the grab system.
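The fingertip-distance check can be sketched like this in standalone C++. The class and threshold values are my own assumptions, not from the project; I have added hysteresis (a lower engage threshold and a higher release threshold), which is a common trick to keep the grab from flickering when the fingertips hover right at the boundary.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

float Dist(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Pinch detector with hysteresis: engages when the thumb and index tips
// come within kGrabThreshold, and releases only once they separate past
// kReleaseThreshold.
class PinchDetector {
public:
    // Call once per frame with the current fingertip locations;
    // returns whether the pinch (and thus the grab) is active.
    bool Update(const Vec3& thumbTip, const Vec3& indexTip) {
        const float d = Dist(thumbTip, indexTip);
        if (!bPinching && d < kGrabThreshold)         bPinching = true;
        else if (bPinching && d > kReleaseThreshold)  bPinching = false;
        return bPinching;
    }

private:
    static constexpr float kGrabThreshold = 2.5f;    // cm, assumed value
    static constexpr float kReleaseThreshold = 4.0f; // cm, assumed value
    bool bPinching = false;
};
```

The pinch state then drives the same grab/release events that the controller grip button would, so the grab system itself never needs to know which input produced it.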
However, I’ve hit a snag when it comes to creating grabbable door handles (or any handle, for that matter) using the original grab system from the VR Template, especially when these doors/handles are driven by Physics Constraints.
I’ve come across some videos showcasing alternative ways to tackle this in VR, and based on their impressive results they seem like a better solution. Unfortunately, I haven’t had the time to set this up myself yet.
Here are the videos for your reference:
1 - Video 1
2 - Video 2
3 - Video 3
Despite the alternative approaches, I’m inclined to stick with the original Grab System. Since I’ve already made it functional with both controllers and hand tracking, I think it might be “easier” to combine this grab system with a door setup similar to the ones in the videos.
I understand it’s not a definitive solution, but that’s where my thoughts are leading me in terms of achieving the common objective.
If it’s confusing to anyone or if I missed any key aspects, reach out and I’ll try to help!
Hope this sheds some light on the situation. Thanks for your attention!