Using Oculus Hand Tracking to interact with UI directly?

I’m trying to build a touch screen in a VR project that uses Oculus Hand Tracking and interaction, essentially an in-game touch screen menu. All the tutorials I’ve found so far use a laser pointer or a mouse cursor for the interaction; what I’m trying to achieve is for the hand mesh itself to press the screen (I thought of using a Widget for the UI here, but I’m open to other ideas) and have the buttons work that way.

Any ideas?

Currently using UE 4.27

EDIT: One important note: the hands live inside the VRPawn. I didn’t do the setup myself; a colleague did, and I’m only using what worked for him. Hand tracking and interaction work fine with the hands in the Pawn, but every widget interaction tutorial I’ve tried doesn’t work for me.


I gave up on trying to use a Widget and used colliders and event dispatchers to simulate button presses. Works like a charm.
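In case it’s useful to anyone, here’s a rough C++ sketch of that setup for 4.27. All the names (`ATouchButton`, `PressVolume`, the `"Hand"` component tag) are placeholders I made up, and the hand-mesh filtering is an assumption; adapt it to however your hands are rigged in the Pawn:

```cpp
// TouchButton.h -- minimal sketch; class and property names are placeholders
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "TouchButton.generated.h"

// Event dispatcher the menu screen binds to.
DECLARE_DYNAMIC_MULTICAST_DELEGATE(FOnButtonPressed);

UCLASS()
class ATouchButton : public AActor
{
    GENERATED_BODY()

public:
    ATouchButton();

    // BlueprintAssignable so the menu actor can bind to it in BP as well.
    UPROPERTY(BlueprintAssignable, Category = "Touch")
    FOnButtonPressed OnButtonPressed;

protected:
    virtual void BeginPlay() override;

    // Overlap volume standing in for the widget button.
    UPROPERTY(VisibleAnywhere)
    class UBoxComponent* PressVolume;

    UFUNCTION()
    void HandlePressOverlap(UPrimitiveComponent* OverlappedComp, AActor* OtherActor,
                            UPrimitiveComponent* OtherComp, int32 OtherBodyIndex,
                            bool bFromSweep, const FHitResult& SweepResult);
};

// TouchButton.cpp
#include "TouchButton.h"
#include "Components/BoxComponent.h"

ATouchButton::ATouchButton()
{
    PressVolume = CreateDefaultSubobject<UBoxComponent>(TEXT("PressVolume"));
    RootComponent = PressVolume;
    PressVolume->SetCollisionEnabled(ECollisionEnabled::QueryOnly);
    PressVolume->SetGenerateOverlapEvents(true);
}

void ATouchButton::BeginPlay()
{
    Super::BeginPlay();
    PressVolume->OnComponentBeginOverlap.AddDynamic(this, &ATouchButton::HandlePressOverlap);
}

void ATouchButton::HandlePressOverlap(UPrimitiveComponent* OverlappedComp, AActor* OtherActor,
                                      UPrimitiveComponent* OtherComp, int32 OtherBodyIndex,
                                      bool bFromSweep, const FHitResult& SweepResult)
{
    // Filter so only the hand mesh counts as a press -- here via a
    // component tag ("Hand") that you'd add to the hand mesh yourself.
    if (OtherComp == nullptr || !OtherComp->ComponentHasTag(FName("Hand")))
    {
        return;
    }
    OnButtonPressed.Broadcast();
}
```

The Blueprint version is the same idea: a Box Collision on the button actor, its OnComponentBeginOverlap event, and a custom Event Dispatcher the menu binds to. You can still render a widget behind the colliders for the visuals; the colliders just replace the widget’s own hit testing.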