Mixed Reality Slider: A natural user interface to blend between VR and AR using Meta Quest 3 and UE5

Hi everyone!

I love working in XR. It feels magical. I think this is how David Copperfield and other magicians must have felt while performing their tricks in front of an awestruck crowd. You just make a hand gesture and a wall disappears, or something else unexpected happens.
I had a lot of that feeling while working on my latest natural user interface, the Reality Slider, built with UE 5.3 and the Meta Quest 3, which I would like to present in the YouTube video below.

In my line of work, I often need to switch between VR and AR. Sometimes you want to be in AR, sometimes you want to be in VR. And this has to be intuitive and feel natural: no controllers, just your own hands. The system currently works very well with Meta Quest Hand Tracking, but I am planning to move it over to OpenXR.
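To give a feel for the idea, here is a minimal sketch of the core logic such a slider might use: project the tracked hand's position onto a track, normalize it to a blend factor between full VR (0.0) and full passthrough AR (1.0), and smooth it per tick so the transition feels continuous. All names and values here are illustrative assumptions, not the actual project's code; in Unreal the smoothed value would then drive the passthrough layer's opacity.

```cpp
#include <algorithm>
#include <cmath>

// Hypothetical sketch of a hand-driven reality slider.
// Blend = 0.0 -> full VR, Blend = 1.0 -> full passthrough AR.
struct RealitySlider {
    float TrackStart = 0.0f;   // world-space start of the slider track (metres, assumed)
    float TrackEnd   = 0.5f;   // world-space end of the track
    float Blend      = 0.0f;   // current smoothed blend value in [0, 1]
};

// Raw target: the hand's position projected onto the track axis,
// normalized and clamped to [0, 1].
inline float TargetBlend(const RealitySlider& Slider, float HandPos) {
    float T = (HandPos - Slider.TrackStart) / (Slider.TrackEnd - Slider.TrackStart);
    return std::clamp(T, 0.0f, 1.0f);
}

// Per-frame update: exponential smoothing so the blend chases the hand
// instead of snapping. Speed controls how quickly it catches up.
inline void TickSlider(RealitySlider& Slider, float HandPos,
                       float DeltaSeconds, float Speed = 8.0f) {
    float Target = TargetBlend(Slider, HandPos);
    float Alpha  = 1.0f - std::exp(-Speed * DeltaSeconds);
    Slider.Blend += (Target - Slider.Blend) * Alpha;
}
```

In an actual UE project the resulting `Blend` value would be fed to whatever controls the passthrough composition (for example a material parameter or the Quest passthrough layer's opacity), but that wiring is engine- and SDK-specific.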


YouTube: Reality Slider

If there is interest, I will write a high-level technical article about how it works, which Quest SDKs are used, and how to set up the project to compile in Unreal Engine.

If you would like to connect, are looking for a freelancer to support your XR project in Unreal, or just want to chat, feel free to reach out:

ArtStation: ArtStation - Zoltan Ferenczi, PhD
LinkedIn: https://www.linkedin.com/in/dr-zoltan-ferenczi-5b2ba884/

Zoltan Ferenczi (BreakMaker)