How do I interact with UI using OpenXR hand tracking in Unreal Engine?

Hi everyone,

I am working on an open-world VR project in Unreal Engine that relies on hand tracking.

I first implemented hand tracking using the Meta XR plugin. It works perfectly, and I can interact with UI widgets as well. However, this setup only works when running through Meta Horizon Link or as an APK installed on the headset.
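For reference, my working Meta XR interaction was essentially a `WidgetInteractionComponent` attached to the tracked hand, with the pinch gesture driving the pointer click. A simplified sketch (the component and variable names here are from my project, not anything standard):

```cpp
// In my hand pawn class (simplified; RightHandMesh is my tracked hand component).
#include "Components/WidgetInteractionComponent.h"

// Constructor: attach a widget-interaction "virtual cursor" to the hand.
WidgetInteraction = CreateDefaultSubobject<UWidgetInteractionComponent>(TEXT("HandWidgetInteraction"));
WidgetInteraction->SetupAttachment(RightHandMesh);
WidgetInteraction->InteractionDistance = 500.0f; // ray length toward the widget

// Fired when Meta XR reports a pinch start:
WidgetInteraction->PressPointerKey(EKeys::LeftMouseButton);

// Fired when the pinch is released:
WidgetInteraction->ReleasePointerKey(EKeys::LeftMouseButton);
```

The pinch-start/pinch-end events came from the Meta XR plugin, which is exactly the part I no longer have under plain OpenXR.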

I want to run my project as a standalone PC build (.exe) played over Virtual Desktop. However, when I try this with Meta XR, the project crashes with a fatal error. From my research, Virtual Desktop does not support the Meta XR plugin's hand-tracking features.

So I switched to the OpenXR plugin. Now hand tracking works in Virtual Desktop, and the hands are tracked properly.

But now I have another problem: I cannot interact with UI widgets using my hands. The interaction that worked under Meta XR does not work under OpenXR.

So my question is:
How can I interact with UI (buttons, menus, etc.) using hand tracking when using OpenXR in Unreal Engine?

Is there a simple way to set this up?

Thanks!