My onboarding experience with Unreal, using it for roomscale VR

I just finished my first day with Unreal. I’m an experienced dev/programmer (10 years AAA, 6 years indie) checking out UE4 for the first time. I’m doing it because I set up roomscale VR with the Vive in my living room last week, and I’m ready to start prototyping some ideas with the intent of shipping a real VR game on Steam. I thought you might like to know what it looked like for this game programmer to onboard onto UE4 with the intent of using it primarily for VR. I imagine plenty of people come to Unreal just for VR and want to get up and running quickly!

First off, UE4 has full source and the ability to write native code for perf; that’s why I’m checking out UE4. Serious developers should be attracted to Unreal, given how many engine choices don’t offer these benefits. I want to build great-looking games with 12ms frames, get under the hood and make improvements, and be able to profile every last corner of the engine so shipping and compatibility are smooth sailing. I think Unreal will help me serve customers better than any alternative.

All that said, onboarding via Vive VR development was a bit rough and may be creating friction that scares away new devs. Here’s how it went for me:

  1. Read all the basic engine tutorials: art, level design, Blueprint, native code. Great. Sync up and read engine source. Awesome.

  2. Discover there is a garbage collector that collects all UObjects in C++ code. Okay – garbage collectors aren’t inherently evil, and their implementations vary widely. But there is very little easily accessible information on how this one works under the hood. Knowing I’m investing in VR tools that can pause and clean up objects is a bit concerning when you’re targeting 90fps, and there is very little information on why this is a good thing or how to manage it (see the sketch after this list). Honestly, I would rather just free the memory myself to avoid time lost walking the heap. Community response seems to be split between “dunno, it works for me” and “yeah, I had trouble with that and played around until it worked.” Neither is a good substitute for quantified data and experienced advice applied at scale. (Any Coalition developers want to weigh in on this after you ship Gears?)

  3. Read the SteamVR Quick Start docs. This was a breeze and got me going.

  4. Read the “Motion Controller Component Setup” doc. This is where I spent most of my time. A day later, I have at-scale, rendered controllers in a properly set up VR room space. Along the way, I ran into the following:

    • Can pawns receive keyboard/trigger input? (I still don’t know.) The tutorial didn’t work until I re-parented VR_Pawn to Character instead of Pawn; then my Blueprint just worked.
    • The video links at the bottom were broken, so I went it alone without them. I later discovered they’re simply broken YouTube links, but I haven’t watched the videos yet.
    • The further I got into it, the more I realized that getting a motion controller to behave like the one in the Vive dashboard is an involved process. For instance, the trigger animates properly there, and there is a dot on the thumbpad that mirrors the touch of your thumb. It would be fantastic if there were a derivable class for this specific controller, rather than building it from scratch to mirror what Valve does. (And if there is, it’d be great to know where it is.)
    • The floor moved back up to its default of 10cm a few times during development (it should be 0cm for roomscale). Because I’m a total newbie, I didn’t realize I had to load a level when I re-opened the project. Eventually I figured this out, but it could be mentioned earlier in the VR docs, because VR should be an onboarding path for UE4, imo!
    • The subtraction node in the “remove objects” step of the MotionController tutorial was pretty confusing, since it isn’t labelled and I wasn’t sure how to replicate it. This could easily be solved by posting a template of the finished example. I was able to deduce what it was after staring at it for a bit.
    • The “Vive” controller model in the engine’s art package was rotated 90 degrees, was 10x too small, and is from the original developer-edition Vive. (It’s out of date and should be deprecated.)
    • I had a heck of a time getting the controllers to orient correctly (their logical positions, not the visual models). I ended up parenting them to a Scene component at the origin. This was not explicitly listed in the Motion Controller Component Setup doc and caused one of the longest troubleshooting sessions of the day. Prior to this, they were translated away from the origin and rotated somehow, in spite of everything being set to the model origin.
    • When I went to import the Vive controller model myself, the Unreal documentation for Blender imports suggested specifying the rotation of the model at export time. This did nothing to my model; I had to set the “Import rotation” yaw to 90 degrees and re-import, which worked.
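
As a footnote to the GC point in item 2: the collector only follows UObject references it can see, so the standard advice is to mark raw UObject pointers with UPROPERTY (or root the object) and to control when purges happen. A minimal sketch of what that looks like, with placeholder class and member names:

```cpp
#include "GameFramework/Pawn.h"
#include "VRPawnExample.generated.h"

UCLASS()
class AVRPawnExample : public APawn
{
    GENERATED_BODY()

public:
    // UPROPERTY makes this pointer visible to the GC: the target stays alive
    // while referenced here, and the pointer is nulled if the target is destroyed.
    UPROPERTY()
    class UStaticMesh* ControllerMesh;

    // A raw pointer without UPROPERTY is invisible to the GC, so the object it
    // points at can be collected out from under it:
    // UStaticMesh* UnsafeControllerMesh;

    void PinForLifetime(UObject* Object)
    {
        // Rooted objects are never collected until RemoveFromRoot() is called.
        Object->AddToRoot();
    }
};
```

From what I can tell, the purge cadence can also be tuned (gc.TimeBetweenPurgingPendingKillObjects) or a collection forced during a load screen with GEngine->ForceGarbageCollection(), which seems like the sane place to pay that cost when chasing 90fps, but I’d still love hard numbers.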

Anyway, that’s the end of my first day with Unreal and my first day developing for VR. I’m in a properly scaled room with properly positioned and scaled controllers. It’s possible I missed doing something the easy way, but that’s part of onboarding feedback. :slight_smile:

Looking forward to day two.

Labbe

Hey, thanks for all this great and detailed feedback. I’ll see if I can help with some of these.

The problem here may have been possession and whether keyboard input was enabled. By default, all actors/pawns/thingys have “Auto Receive Input” turned off, but that can be changed in the class’s default values under the “Input” settings.
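
If you’d rather wire it up in native code, here’s a rough sketch of the equivalent. AVRPawnExample, OnTriggerPressed, and the “TriggerPress” action mapping are placeholders; the mapping would be defined under Project Settings > Input.

```cpp
AVRPawnExample::AVRPawnExample()
{
    // Same effect as ticking "Auto Receive Input" in the class defaults:
    // route player 0's input to this pawn.
    AutoReceiveInput = EAutoReceiveInput::Player0;
}

void AVRPawnExample::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);

    // Keyboard/trigger bindings fire once the pawn is possessed or auto-receives input.
    PlayerInputComponent->BindAction("TriggerPress", IE_Pressed, this,
                                     &AVRPawnExample::OnTriggerPressed);
}
```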

That’s a known issue that the docs team is working on, sorry about that.

That’s going into my Feature Requests list. Probably makes sense to do a couple of these for the major controllers. I’ve seen people with their own versions of this, so I know it’s not too complicated to make yourself, but I can see how it would be nice to have pre-made.

Alright, I’ll see what can be done there.
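
On the floor height itself, one related knob: the engine exposes the HMD tracking origin, and forcing it to floor level at startup keeps 0cm at the physical floor for roomscale. A small sketch, assuming the HeadMountedDisplay module is in your Build.cs and the same placeholder pawn class as above:

```cpp
#include "HeadMountedDisplayFunctionLibrary.h"

void AVRPawnExample::BeginPlay()
{
    Super::BeginPlay();

    // Floor-level tracking keeps 0cm at the physical floor for roomscale,
    // instead of an eye-level origin with a height offset.
    UHeadMountedDisplayFunctionLibrary::SetTrackingOrigin(EHMDTrackingOrigin::Floor);
}
```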

Ah, yes, it’s context sensitive to the Array output, so I can see how that wouldn’t be obvious at first look. If you ever get confused by a node like that, try unchecking “Context Sensitive” in the search menu and see if it comes up. Class-specific functions will also only ever show up when pulling from an instance of that class, so be aware of that too.

That sounds like it needs to be swapped out, then. I’ve informed our developers of that.

Can you give some more details on the setup here? A screenshot maybe of your components? I’d like to take a look at it and see if that was intentional or not.

That sounds like a bug with the importer tools. Can you report it on answers.unrealengine.com? That would help us a lot.

I hope some of that helped! Thanks so much for the feedback :smiley:

Thanks for the pointers!

Sure! Here’s what ended up working for me. The “Scene” component was essential to the correct logical position of the controllers.

working_setup.png
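
For anyone doing this setup in native code rather than Blueprint, here’s a rough sketch of the same hierarchy. Class and member names are placeholders, and the attachment/hand-assignment APIs have shifted a bit between engine versions.

```cpp
#include "Camera/CameraComponent.h"
#include "MotionControllerComponent.h"

AVRPawnExample::AVRPawnExample()
{
    // A plain scene component at the actor origin acts as the VR tracking origin;
    // this is the piece that fixed the controllers' logical positions.
    VROrigin = CreateDefaultSubobject<USceneComponent>(TEXT("VROrigin"));
    RootComponent = VROrigin;

    // The camera and both motion controllers attach to that origin so their
    // tracked transforms stay relative to it.
    Camera = CreateDefaultSubobject<UCameraComponent>(TEXT("Camera"));
    Camera->SetupAttachment(VROrigin);

    LeftController = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("LeftController"));
    LeftController->SetupAttachment(VROrigin);
    LeftController->Hand = EControllerHand::Left;

    RightController = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("RightController"));
    RightController->SetupAttachment(VROrigin);
    RightController->Hand = EControllerHand::Right;
}
```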

Yep, that was wrong; the docs should have said to add an extra scene component. I spoke with Sam and he fixed the docs. Thanks for bringing that to our attention :slight_smile: