I’m starting to work my way through the tutorials for UE4. I have seen how the Rift can be used to drive the position and orientation of the camera.
Is there any support for gesture recognition from the Rift? On iPhones the API has gestures that are recognised and triggered — sorry, that are recognised and fire events: swipe left, swipe right, pinch, etc. How easy would it be to incorporate gesture recognition into the movement controller for the Rift?
In addition to the standard stuff, like knowing whether the user nodded “yes” or shook “no” during dialogue, I would also be interested in a head-nod gesture that supplied the direction nodded as a vector. Why? Suppose I was writing a game that wanted to use a head nod as a game mechanic. For example, in a squad-based FPS, when you give a command your support guys might ask where to position themselves, and your head nod could indicate (roughly) where they should stand, just like you do in real life, pointing with your head when your hands are full.
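To illustrate roughly what I mean, here is a minimal sketch of nod detection with a direction, written as plain C++ with no UE4 types. It assumes per-frame head pitch/yaw samples in degrees (in UE4 these would presumably come from the HMD rotation each tick); the thresholds and the struct/function names are all hypothetical, just to show the idea:

```cpp
#include <vector>

// Hypothetical per-frame head pose sample (degrees).
struct HeadSample { float PitchDeg; float YawDeg; };

struct NodResult { bool Detected; float DirectionYawDeg; };

// A nod = pitch dips below DipThresholdDeg, then recovers above
// RecoverThresholdDeg. The yaw at the bottom of the dip is taken
// as the direction the player "pointed" with their head.
NodResult DetectNod(const std::vector<HeadSample>& Samples,
                    float DipThresholdDeg = -15.0f,
                    float RecoverThresholdDeg = -5.0f)
{
    bool InDip = false;
    float MinPitch = 0.0f;
    float DipYaw = 0.0f;
    for (const HeadSample& S : Samples)
    {
        if (!InDip && S.PitchDeg < DipThresholdDeg)
        {
            InDip = true;            // head has pitched down past the threshold
            MinPitch = S.PitchDeg;
            DipYaw = S.YawDeg;
        }
        else if (InDip)
        {
            if (S.PitchDeg < MinPitch)
            {
                MinPitch = S.PitchDeg; // track the bottom of the dip
                DipYaw = S.YawDeg;     // yaw there is the pointed direction
            }
            if (S.PitchDeg > RecoverThresholdDeg)
            {
                return { true, DipYaw }; // pitch came back up: call it a nod
            }
        }
    }
    return { false, 0.0f };
}
```

So feeding it a pitch trace like 0, -10, -20, -18, -2 would register a nod, with the yaw at the -20 sample as the direction. A real version would obviously also need a time window and filtering so that ordinary head movement doesn’t trigger it.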
Or perhaps it’s an “on my command”-type trigger initiated by a particular head movement.
So, that’s the background context — does UE4 give us any gesture support for the Rift in the API, and can we get a direction vector from the gesture?