Has hand gesture input been implemented for Vision Pro?

I’m curious to know if it’s possible to get hand input or tracking on AVP or if there are plans to implement this.

Hi!

Hand tracking works through our OpenXR Hand Tracking API: it gives you access to each joint transform, and you can build gesture input from that data. However, the visionOS gesture API is tied to the RealityKit renderer and is not accessible to us when rendering with Metal. We don’t know if or when it will become available to us.
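To give an idea of what "building gesture input from joint data" can look like, here is a minimal sketch of a pinch check (thumb tip close to index tip) written directly against the standard XR_EXT_hand_tracking extension. It assumes your app already has an XrInstance, XrSession, XrSpace, and per-frame XrTime from its normal OpenXR setup; the exact wrapper API you call through in the engine may differ, so treat this as illustrative rather than a drop-in.

```c
// Pinch-gesture sketch on top of OpenXR XR_EXT_hand_tracking.
// Assumes XR_EXT_hand_tracking was enabled at instance creation and that
// instance/session/baseSpace/time come from the app's existing OpenXR setup.
#include <math.h>
#include <stdbool.h>
#include <openxr/openxr.h>

static PFN_xrCreateHandTrackerEXT pfnCreateHandTracker;
static PFN_xrLocateHandJointsEXT  pfnLocateHandJoints;

// Resolve the extension entry points once.
static void load_hand_tracking_fns(XrInstance instance)
{
    xrGetInstanceProcAddr(instance, "xrCreateHandTrackerEXT",
                          (PFN_xrVoidFunction *)&pfnCreateHandTracker);
    xrGetInstanceProcAddr(instance, "xrLocateHandJointsEXT",
                          (PFN_xrVoidFunction *)&pfnLocateHandJoints);
}

// One tracker per hand (XR_HAND_LEFT_EXT or XR_HAND_RIGHT_EXT).
static XrHandTrackerEXT create_tracker(XrSession session, XrHandEXT hand)
{
    XrHandTrackerCreateInfoEXT info = {XR_TYPE_HAND_TRACKER_CREATE_INFO_EXT};
    info.hand = hand;
    info.handJointSet = XR_HAND_JOINT_SET_DEFAULT_EXT;

    XrHandTrackerEXT tracker = XR_NULL_HANDLE;
    pfnCreateHandTracker(session, &info, &tracker);
    return tracker;
}

// True when thumb tip and index tip are within ~2 cm of each other,
// i.e. a simple "pinch" built purely from the joint transforms.
static bool is_pinching(XrHandTrackerEXT tracker, XrSpace baseSpace, XrTime time)
{
    XrHandJointLocationEXT joints[XR_HAND_JOINT_COUNT_EXT];
    XrHandJointLocationsEXT locations = {XR_TYPE_HAND_JOINT_LOCATIONS_EXT};
    locations.jointCount = XR_HAND_JOINT_COUNT_EXT;
    locations.jointLocations = joints;

    XrHandJointsLocateInfoEXT locateInfo = {XR_TYPE_HAND_JOINTS_LOCATE_INFO_EXT};
    locateInfo.baseSpace = baseSpace;
    locateInfo.time = time;

    if (XR_FAILED(pfnLocateHandJoints(tracker, &locateInfo, &locations)) ||
        !locations.isActive)
        return false;

    const XrHandJointLocationEXT thumb = joints[XR_HAND_JOINT_THUMB_TIP_EXT];
    const XrHandJointLocationEXT index = joints[XR_HAND_JOINT_INDEX_TIP_EXT];
    if (!(thumb.locationFlags & XR_SPACE_LOCATION_POSITION_VALID_BIT) ||
        !(index.locationFlags & XR_SPACE_LOCATION_POSITION_VALID_BIT))
        return false;

    const float dx = thumb.pose.position.x - index.pose.position.x;
    const float dy = thumb.pose.position.y - index.pose.position.y;
    const float dz = thumb.pose.position.z - index.pose.position.z;
    return sqrtf(dx * dx + dy * dy + dz * dz) < 0.02f;
}
```

The same pattern extends to other gestures: grab the joint poses you care about each frame and compare distances, angles, or velocities against thresholds.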

How about getting input from a game controller connected to the Vision Pro, or from the PS VR2 controllers? Have you had a chance to try those with the Vision Pro?

Game controllers work, but we’ve yet to implement support for the PS VR2 controllers. It’s on the roadmap, but we have no timeline to share.

Got it. Thanks