Hello,
I recently attended Unreal Dev Days, and it sounds like support for motion-controller meshes provided by OpenXR is in the works. This is great news, but I have questions about how to take advantage of that information when we don’t know about the player’s devices ahead of time.
What I’m wondering is whether it’s possible to look up a bound action, find the device buttons and inputs bound to that action, and then locate those buttons on the OpenXR-provided meshes to do something with them, like highlight them. This would really help simplify tutorialization, and it would let us guide the user even when they have hardware we never knew existed, have remapped their inputs for those specific actions, or have changed bindings for accessibility reasons.
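For reference, here’s roughly the flow I’m imagining at the raw OpenXR level. This is a non-compilable sketch, not working code: `xrEnumerateBoundSourcesForAction` and `xrGetInputSourceLocalizedName` are real core-spec calls, but `teleportAction` is a placeholder for one of our actions, and the final highlight step is the hypothetical part I’m asking about.

```
// Sketch only -- assumes a valid XrSession and an XrAction (e.g. "Teleport").

// 1. Ask the runtime which physical input sources are bound to the action.
XrBoundSourcesForActionEnumerateInfo info = { XR_TYPE_BOUND_SOURCES_FOR_ACTION_ENUMERATE_INFO };
info.action = teleportAction;
xrEnumerateBoundSourcesForAction(session, &info, capacity, &count, sources);

// 2. For each bound source, get a user-facing name ("Right Trigger", etc.).
XrInputSourceLocalizedNameGetInfo nameInfo = { XR_TYPE_INPUT_SOURCE_LOCALIZED_NAME_GET_INFO };
nameInfo.sourcePath = sources[i];
nameInfo.whichComponents = XR_INPUT_SOURCE_LOCALIZED_NAME_USER_PATH_BIT |
                           XR_INPUT_SOURCE_LOCALIZED_NAME_COMPONENT_BIT;
xrGetInputSourceLocalizedName(session, &nameInfo, bufferCapacity, &bufferCount, buffer);

// 3. (The missing piece) map that source path to the matching sub-mesh or
//    node on the runtime-provided controller model and highlight it.
HighlightControllerPart(sources[i]); // hypothetical -- no such API that I know of
```

Steps 1 and 2 are already possible today; it’s step 3, tying a bound source back to geometry on the runtime-provided mesh, that I haven’t found any way to do.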
Right now, the closest thing I’ve found is using SteamVR to show its controller binding diagram.