VR head gestures, suggestions?

Hey there,

I’m working on a system which will be able to recognise head gestures with low latency in games. To really put the system to the test, I need enough gestures to distinguish between: the more gestures there are, the harder recognition gets.
Two gestures are obvious: Nodding and shaking the head.
So far we have thought of three more: dodging to the right and left, and ducking.

What kind of head gestures would you like to feature in your game? Or which gesture do you think could be useful for other developers?

The recognition will be based on the gyro and accelerometer readings of the DK2.

I guess it would depend again on people’s tolerance to the weight of the rig (I think it’s a bit light). Rotations (turning) are useful, but you’d want a system where a meter comes up on the HUD and shows how far you are turning; otherwise it would be difficult to replicate from a player’s point of view. I hope this helps. I’m more of a VR demo player than a developer.

Hey there Kepakiano, I’ve been wanting to put together a head nod/shake mechanic in VR. I was just curious whether you have any advice or tips for achieving this effectively?
Thanks!

Hey, yes, I found a solution which works really well. However, it is a rather big system, because I developed it during my master’s thesis. Here is the short version: using UE4, I created a VR demo app in which the participants had to execute some head gestures, which I recorded (gyro and accel). Using time series magic (Google “dynamic time warping” and “DTW barycenter averaging”) I extracted a representative example of each gesture (the five mentioned in the first post). So for every gesture I have six curves (3 gyro, 3 accel) in a coordinate system.
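In case a concrete starting point helps, here is a minimal standalone sketch of plain DTW on a single channel. This is not the thesis code: the real system compares all six channels, and for the barycenter averaging you also need the alignment path, which a proper DTW library will give you.

```cpp
// Classic dynamic time warping between two 1-D traces (e.g. one gyro axis).
// In a multi-channel setup you would run this per channel and sum the distances.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <limits>
#include <vector>

double DtwDistance(const std::vector<double>& a, const std::vector<double>& b)
{
    const size_t n = a.size(), m = b.size();
    const double inf = std::numeric_limits<double>::infinity();
    // cost[i][j] = cheapest alignment of a[0..i) with b[0..j)
    std::vector<std::vector<double>> cost(n + 1, std::vector<double>(m + 1, inf));
    cost[0][0] = 0.0;
    for (size_t i = 1; i <= n; ++i)
    {
        for (size_t j = 1; j <= m; ++j)
        {
            const double d = std::fabs(a[i - 1] - b[j - 1]);
            cost[i][j] = d + std::min({cost[i - 1][j],       // repeat sample of b
                                       cost[i][j - 1],       // repeat sample of a
                                       cost[i - 1][j - 1]}); // advance both
        }
    }
    return cost[n][m];
}

int main()
{
    // Two fake "nod" traces recorded at slightly different speeds:
    // DTW stays small because it aligns them non-linearly in time.
    std::vector<double> slow = {0.0, 0.4, 0.9, 1.0, 0.9, 0.4, 0.0};
    std::vector<double> fast = {0.0, 0.9, 1.0, 0.4, 0.0};
    std::printf("DTW distance: %f\n", DtwDistance(slow, fast));
    return 0;
}
```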
I then used particle filters to recognise the gestures. Explaining particle filters would take a little too long now, but the basic idea is this: every particle lives in the coordinate system of one of the representative gestures and has a timestamp. When a measurement is taken, each particle is rated: if the measurement matches the curve from the point where the particle sits, the rating is high. Particles with a higher rating are more likely to make it into the next population.
If enough particles have gathered around one point in time in one gesture, this gesture is recognised.
Because the particles constantly move forward in time, new particles are inserted into the population every once in a while.
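To make that loop a bit more concrete, here is a rough standalone sketch of the idea. It is not the thesis code: all names and numbers (kSigma, the fresh-particle rate, the recognition threshold) are made up for illustration and would need tuning against recorded data.

```cpp
#include <array>
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

using Sample = std::array<double, 6>;  // 3 gyro + 3 accel channels
using Template = std::vector<Sample>;  // one representative gesture curve

struct Particle
{
    int gesture;    // which template this particle lives in
    int t;          // position (timestamp) along that template
    double weight;
};

// Rate a particle: high when the current measurement matches the template
// at the particle's position (Gaussian likelihood with a made-up sigma).
double Rate(const Particle& p, const std::vector<Template>& templates, const Sample& z)
{
    const double kSigma = 0.5;
    const Sample& ref = templates[p.gesture][p.t];
    double sq = 0.0;
    for (int c = 0; c < 6; ++c)
        sq += (z[c] - ref[c]) * (z[c] - ref[c]);
    return std::exp(-sq / (2.0 * kSigma * kSigma));
}

// One step: advance all particles in time, rate them against the new
// measurement, resample, and re-seed a fraction of fresh particles.
void Step(std::vector<Particle>& particles, const std::vector<Template>& templates,
          const Sample& z, std::mt19937& rng)
{
    std::vector<double> weights;
    for (Particle& p : particles)
    {
        p.t = (p.t + 1) % (int)templates[p.gesture].size();  // move forward in time
        p.weight = Rate(p, templates, z) + 1e-12;            // epsilon keeps resampling valid
        weights.push_back(p.weight);
    }
    // Particles with a higher rating are more likely to make the next population.
    std::discrete_distribution<int> pick(weights.begin(), weights.end());
    std::vector<Particle> next;
    const size_t kFresh = particles.size() / 10;  // 10% fresh particles per step
    while (next.size() + kFresh < particles.size())
        next.push_back(particles[pick(rng)]);
    // Fresh particles at random positions so newly started gestures get picked up.
    std::uniform_int_distribution<int> g(0, (int)templates.size() - 1);
    for (size_t i = 0; i < kFresh; ++i)
    {
        const int idx = g(rng);
        std::uniform_int_distribution<int> pos(0, (int)templates[idx].size() - 1);
        next.push_back({idx, pos(rng), 1.0});
    }
    particles.swap(next);
}

// A gesture counts as recognised once a majority of particles has gathered
// near the end of its template.
int Recognised(const std::vector<Particle>& particles, const std::vector<Template>& templates)
{
    std::vector<int> votes(templates.size(), 0);
    for (const Particle& p : particles)
        if (p.t > (int)templates[p.gesture].size() * 3 / 4)
            ++votes[p.gesture];
    for (size_t i = 0; i < votes.size(); ++i)
        if (votes[i] > (int)particles.size() / 2)
            return (int)i;
    return -1;  // nothing recognised yet
}

int main()
{
    Template nod;  // tiny fake "nod": a swing on the pitch gyro channel
    for (int i = 0; i < 20; ++i)
        nod.push_back({0.0, std::sin(i * 0.3), 0.0, 0.0, 0.0, 0.0});
    std::vector<Template> templates = {nod};

    std::mt19937 rng(42);
    std::vector<Particle> particles(200, {0, 0, 1.0});
    for (int i = 0; i < 20; ++i)  // feed the template back in as "measurements"
    {
        Step(particles, templates, nod[i], rng);
        const int hit = Recognised(particles, templates);
        if (hit >= 0)
            std::printf("recognised gesture %d at step %d\n", hit, i);
    }
    return 0;
}
```

In the real system you would of course reset the population after a recognition; this sketch just keeps running.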

The system works really well. New users only had to practise for about a minute and could then easily interact with it. I made a sample game not unlike Tetris, in which users could move and turn the stones left and right and drop them with the gestures mentioned above, and it was a lot of fun.

I cannot give you all the details right now, since I am travelling. I will be back home in the second-to-last week of May. If you still need help then, I would be happy to assist. Just reply to this thread; I have subscribed to it.

You will probably need to implement machine learning with some kind of filter, but gestures are tricky. You’re probably better off using spherical shells for an interface.

Gestures sound awesome. Consider expanding support to arbitrary position points you can feed in, so the gesture library can also be used for making shapes with, e.g., your motion controllers or other IMUs, and not just the HMD point.

Gestures I find common:
head: nod/shake, tilt
hands: circle, swipes

Wow, very in-depth stuff. I banged together a quick nod/shake gesture check based on detecting rotation velocity in one direction, setting a timer, then checking for a matching axis rotation in the opposite direction, roughly like the sketch below. It’s like the caveman version of the ideas above, but for my purposes it works well enough. It would be awesome to put something more robust and versatile together in the future. You may want to consider bundling up your solution and selling it on the market! Thanks for the tips!!
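For reference, the check boils down to something like this (a standalone sketch; the threshold and window length are made-up values that need tuning per axis, and in practice you would feed in the HMD’s per-frame angular velocity):

```cpp
// Timer-based nod/shake check: one fast swing, then a counter-swing
// on the same axis within a short time window.
#include <cstdio>

struct GestureCheck
{
    double threshold = 1.5;  // rad/s needed to trigger (made up)
    double window    = 0.4;  // seconds allowed for the counter-swing (made up)
    double timer     = 0.0;  // counts down once the first swing was seen
    int    firstSign = 0;    // +1 or -1: direction of the first swing

    // Call once per frame with the angular velocity of one axis
    // (pitch for nod, yaw for shake). Returns true when the gesture fires.
    bool Update(double velocity, double dt)
    {
        if (timer > 0.0)
        {
            timer -= dt;
            // Matching rotation in the opposite direction within the window?
            if ((velocity > threshold && firstSign < 0) ||
                (velocity < -threshold && firstSign > 0))
            {
                timer = 0.0;
                return true;
            }
        }
        else if (velocity > threshold || velocity < -threshold)
        {
            firstSign = velocity > 0.0 ? 1 : -1;  // first swing seen, start timer
            timer = window;
        }
        return false;
    }
};

int main()
{
    GestureCheck nod;
    // Fake pitch velocities at 90 Hz: a quick swing down, then back up.
    const double frames[] = {0.0, -2.0, -1.0, 0.0, 1.8, 2.0, 0.0};
    for (double v : frames)
        if (nod.Update(v, 1.0 / 90.0))
            std::printf("nod detected\n");
    return 0;
}
```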
-p

Hey patch24, I’m trying to hack something similar together as well. Did you do this all in Blueprints? Any further information would be great! Thanks!