HEADTRIP Menu: Hands-free UI demo for Oculus Rift

As soon as I tried an Oculus demo that implemented some kind of gaze-based interaction, I was intrigued by the potential. I had some ideas about how gaze controls could be expanded into a menu interface, so I put together a short demo. It’s all done with Blueprints. UE4 was a great way to hash this out, though the odd rotator behavior in Blueprints was a major headache for the 360° menu rotations you see in this demo.

This video will give you the basic idea:

You can download the demo here:
www.mostancient.com/headtrip/
(~60MB, Windows only, only tested on DK2)

Screenshots attached. Feedback would be great. Thanks!

Neat idea, looking forward to seeing where you take it.

Looks great!

Just played this. The UI was most excellent. I’ve been experimenting with using ray-casts to actuate (damage) Blueprint objects for UI, and I’m not entirely happy with how it turned out.

Would you be willing to share a little bit about how your cursor/buttons are set up?

Very well done
I like it for menus - for game play it would strain my neck - if I played for long.
I like the “Look straight” to recentre in the beginning - not needing to find a keyboard key is nice. I did manage to activate it one time while the rift was just on my desk
It could be interesting to see how you would implement sliders or drop downs, and please put in a FPS number just for info
I second Alexotronic - I would love to see you blue script / or the hole project on github

Hi Alexotronic. I’d be happy to share some Blueprint screenshots. It’ll take some tidying up, but I’ll try to take some presentable shots within the next few days. Thanks for checking it out.

Yeah, I was definitely worried about neck strain and tried to make things require as little head movement as possible. I do think this sort of interaction could be used for gameplay, but sparingly. Higher resolution would let me condense the layout a lot more, which would reduce the amount of head wagging you have to do. So hopefully this whole approach can be vastly improved with the next hardware iteration.

Shoot. It’s a tough thing to perfect. Ideally HMD calibration/positioning would be part of a launcher or an OS, so having it in front of an application is kind of cumbersome. But I’ll see if I can make that part more foolproof.

I’ve got some ideas there, I think it could be done. Next VR thing I release will have an FPS counter for sure. I agree that that’s essential.

I’ll post some blueprint stuff within the next couple days. Let me know if there’s anything in particular you’re curious about. I don’t really have time right now to polish it enough for a public repo. I kind of got carried away on this menu thing and need to focus more on the actual content of our first VR project. I would love to see this approach incorporated into an open source UI project for UE4 VR though. I think UI is a crucial piece of the puzzle for mainstream VR adoption, so initiating some sort of concerted effort to get it right would be great.

Sorry for the delay. Finally got around to taking some screens. I’ll just go over the main aspects of the gaze/nudge functionality for the sake of closure on this thread.

[See “HT_PlayerController” Blueprint screen]
–First, I set up a trace in the Player Controller Blueprint. I saw in a video on the UE4 YouTube channel (can’t remember which one) how to trigger functionality on an object using interfaces, so that’s what happens here. Whenever the trace hits an object that has an “OnGaze” event, it calls it; if the object doesn’t have that event, nothing happens.
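If it helps to see the same pattern as code, here’s a minimal C++ sketch. The demo itself is pure Blueprint, so everything named here (IGazeTarget, TickGazeTrace, the 5000-unit trace length) is illustrative, not from the project:

```cpp
// Rough C++ equivalent of the Blueprint gaze trace described above.
// The interface and class names are stand-ins for the Blueprint
// interface with the "OnGaze" event; only the pattern matches the demo.

#include "CoreMinimal.h"
#include "GameFramework/PlayerController.h"
#include "UObject/Interface.h"
#include "GazeTarget.generated.h" // interface part belongs in its own header

// Hypothetical gaze interface; the Blueprint version exposes an "OnGaze" event.
UINTERFACE(MinimalAPI)
class UGazeTarget : public UInterface
{
    GENERATED_BODY()
};

class IGazeTarget
{
    GENERATED_BODY()
public:
    virtual void OnGaze(const FHitResult& Hit) = 0;
};

// Called every tick from the player controller.
void AHT_PlayerController::TickGazeTrace()
{
    FVector CamLoc;
    FRotator CamRot;
    GetPlayerViewPoint(CamLoc, CamRot); // where the HMD is looking

    const FVector TraceEnd = CamLoc + CamRot.Vector() * 5000.f;

    FHitResult Hit;
    if (GetWorld()->LineTraceSingleByChannel(Hit, CamLoc, TraceEnd, ECC_Visibility))
    {
        // Only actors implementing the interface respond; anything else is
        // silently ignored, just like a Blueprint interface message.
        if (IGazeTarget* Target = Cast<IGazeTarget>(Hit.GetActor()))
        {
            Target->OnGaze(Hit);
        }
    }
}
```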

[See “Snag_LowerRing_Arrow_Left_BP” Blueprint screens]
–To get the reticule to nudge a gaze-sensitive trigger, I start tracking the camera rotation difference per-tick as soon as a gaze-sensitive trigger is hit by the gaze trace. I only check for differences in a single rotator component (yaw in the case of the navigational arrows in my demo), which makes the trigger sensitive to only a single axis of head movement. Once I’ve found the rotational difference, I use it to adjust the look-at-rotation between the trigger and the player location (which is the same as the menu location since it’s a sphere centered on the player). With the look-at-rotation adjusted I can then calculate a new vector at a fixed length, which gives me the new trigger location. This way the trigger always stays the same distance from the player, and its movement follows the curvature of the menu sphere.
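In rough C++ terms, the per-tick nudge step looks something like the sketch below. Again, the demo does this in Blueprint, and the member names (CameraManager, PlayerLocation, MenuRadius, LastYaw) are just illustrative:

```cpp
#include "GameFramework/Actor.h"
#include "Camera/PlayerCameraManager.h"

// Assumed members on the trigger actor (illustrative, not from the project):
//   APlayerCameraManager* CameraManager; // cached from the local player
//   FVector PlayerLocation;              // centre of the menu sphere
//   float   MenuRadius;                  // fixed distance to the player
//   float   LastYaw;                     // camera yaw on the previous tick

// Runs per tick while the trigger is being gazed at.
void ANudgeTrigger::TickNudge()
{
    // Single-axis sensitivity: only yaw is compared, so the navigational
    // arrow responds to horizontal head movement only.
    const float CurrentYaw = CameraManager->GetCameraRotation().Yaw;
    const float YawDelta = FRotator::NormalizeAxis(CurrentYaw - LastYaw);
    LastYaw = CurrentYaw;

    // Look-at rotation from the player (= menu sphere centre) to this
    // trigger, nudged by the head's yaw delta.
    FRotator LookAt = (GetActorLocation() - PlayerLocation).Rotation();
    LookAt.Yaw += YawDelta;

    // Re-project at a fixed radius: the trigger keeps its distance from the
    // player, so its movement follows the curvature of the menu sphere.
    SetActorLocation(PlayerLocation + LookAt.Vector() * MenuRadius);
}
```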

Impressive! Thanks for sharing!

Thanks so much for sharing. I was really looking forward to that. It’s a really nice setup you’ve built.

That’s amazing. I was trying to figure out exactly this over the last few days; almost every part of it, haha.
I’ll give it a try and hope it gives me a hint on what to do.

Thanks a lot!

Hi there, I just stumbled upon your project and it’s amazing. Alas, I can’t download the demo from your website as it’s not there anymore.

Could you upload it somewhere again, if you still want to share it?

Thank you again.