I just had my first day with Unreal. I’m an experienced dev/programmer (10 years AAA, 6 years indie), checking out UE4 for the first time. I’m doing it because I set up roomscale VR via the Vive in my living room last week, and I’m ready to start prototyping ideas with the intent of shipping a real VR game on Steam. I thought you might like to know what onboarding onto UE4 looked like for this game programmer, coming in with the intent of using it primarily for VR. I imagine plenty of people come to Unreal just for VR and want to get up and running quickly!
First off, UE4 offers full source and the ability to write native code for perf. That’s why I’m checking out UE4. Serious developers should be attracted to Unreal, given how many engine choices don’t offer these benefits. I want to build great-looking games with 12ms frames, get under the hood and make improvements, and be able to profile every last corner of the engine so shipping and compatibility are smooth sailing. I think Unreal will help me serve customers better than any alternative.
All that said, onboarding via Vive VR development was a bit rough and may be creating friction that scares away new devs. Here’s how it went for me:
- Read all the basic engine tutorials: art, level design, blueprint, native code. Great. Sync up and read the engine source. Awesome.
- Discover there is a garbage collector that collects all UObjects in C++ code. Okay – garbage collectors aren’t inherently evil, and their implementations vary widely. But there is very little easily accessible information on how this one works under the hood. Knowing I am investing in VR tools that can pause to clean up objects is a bit concerning when you are targeting 90fps, and there is very little information on why this is a good thing or how to manage it. Honestly, I would rather just free the memory myself and avoid the time lost walking the heap. The community response seems split between “dunno, it works for me” and “yeah, I had trouble with that and played around until it worked.” Neither is a good substitute for quantified data and experienced advice applied at scale. (Any Coalition developers want to weigh in on this after you ship Gears?)
- Read the SteamVR Quick Start docs. This was a breeze and got me going.
- Read the “Motion Controller Component Setup” doc. This is where I spent most of my time. A day later, I have at-scale, rendered controllers in a properly set up VR room space. Along the way, I ran into the following:
- Can pawns receive keyboard/trigger input? (I still don’t know.) This tutorial didn’t work until I re-subclassed VR_Pawn as a Character instead of a Pawn; then my blueprint just worked.
- The video links at the bottom were broken, so I went it alone without any of that. I later discovered they’re simply broken YouTube links, but I haven’t watched the videos yet.
- The further I got into it, the more I realized that getting a motion controller to behave like the one in the Vive dashboard is an involved process. In the dashboard, for instance, the trigger animates properly and a dot on the thumbpad mirrors the touch of your thumb. It would be fantastic if there were a derivable class for this specific controller, rather than having to rebuild what Valve does from scratch. (And if there is one, it’d be great to know where it lives.)
- The floor moved back up to its default of 10cm a few times during development (it should be 0cm for roomscale). Because I’m a total newbie, I didn’t realize that I had to load a level when I re-loaded the project. I eventually figured this out, but it could be mentioned earlier in the VR docs, because VR should be an onboarding path for UE4, imo!
- The subtraction node in the “remove objects” section of the MotionController tutorial was pretty confusing, since it isn’t labelled and I wasn’t sure how to replicate it. I was able to deduce what it was after staring at it for a bit, but this could be easily solved by posting a template of the finished example.
- The “Vive” controller model in the engine’s art package was rotated 90 degrees, was 10x too small, and was modeled on the original developer-edition Vive. (It’s out of date and should be deprecated.)
- I had a heck of a time getting the controllers to orient correctly (their logical positions, not the visual models). I ended up parenting them to a Scene component at the origin. This step was not explicitly listed in the Motion Controller Component Setup doc and caused one of the longest troubleshooting sessions of the day. Before the fix, the controllers were translated and rotated away from the origin, in spite of everything being set to model origin.
- When I went to import the Vive controller model myself, the Unreal Blender import documentation suggested specifying the rotation of the model at export. That did nothing to my model; instead, I had to set the “Import Rotation” yaw to 90 degrees and re-import. That worked.
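To make the garbage-collector point above concrete, here’s the rule I eventually pieced together: the collector only traces references it can see, so a UObject pointer member must be marked with UPROPERTY() (or held via one of the engine’s managed pointer types) to keep its target alive; raw C++ pointers are invisible to it and can end up dangling. A minimal sketch (AMyActor and the member names are mine; this only compiles inside an engine project):

```cpp
UCLASS()
class AMyActor : public AActor
{
    GENERATED_BODY()

    // Tracked by the GC: this reference keeps the component alive
    // for as long as the actor is alive.
    UPROPERTY()
    UStaticMeshComponent* TrackedMesh;

    // NOT tracked: the GC cannot see this reference, so the pointee
    // may be collected out from under it, leaving a dangling pointer.
    UStaticMeshComponent* UntrackedMesh;
};
```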
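On the pawn-input question above: as far as I can tell, a plain Pawn *can* receive keyboard/trigger input, provided a PlayerController possesses it and it binds actions in SetupPlayerInputComponent; Character just ships with more of this wiring (plus movement) set up by default, which may be why re-subclassing “fixed” it for me. A hedged sketch, assuming a hypothetical “TriggerPull” action mapping defined in Project Settings → Input:

```cpp
UCLASS()
class AVR_Pawn : public APawn
{
    GENERATED_BODY()
public:
    AVR_Pawn()
    {
        // Have player 0's controller possess this pawn automatically.
        AutoPossessPlayer = EAutoReceiveInput::Player0;
    }

    virtual void SetupPlayerInputComponent(UInputComponent* InInput) override
    {
        Super::SetupPlayerInputComponent(InInput);
        // "TriggerPull" is a hypothetical action mapping; bind it to a handler.
        InInput->BindAction("TriggerPull", IE_Pressed, this, &AVR_Pawn::OnTrigger);
    }

    void OnTrigger() { /* react to the trigger press */ }
};
```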
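For anyone hitting the same orientation problem, the fix described above looks roughly like this in native code: a plain SceneComponent at the origin as the root, with one MotionControllerComponent per hand attached under it. (Class and member names are mine, the member declarations in the header are omitted, and the Hand property is the 4.x-era API.)

```cpp
AVR_Pawn::AVR_Pawn()
{
    // Root the pawn at a plain scene component so the motion controllers
    // are offset relative to the tracking origin, not to some rotated parent.
    VROrigin = CreateDefaultSubobject<USceneComponent>(TEXT("VROrigin"));
    RootComponent = VROrigin;

    LeftController = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("LeftController"));
    LeftController->SetupAttachment(VROrigin);
    LeftController->Hand = EControllerHand::Left;

    RightController = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("RightController"));
    RightController->SetupAttachment(VROrigin);
    RightController->Hand = EControllerHand::Right;
}
```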
Anyway, that’s the end of my first day with Unreal and my first day developing for VR. I’m in a properly scaled room with properly positioned and scaled controllers. It’s possible I missed doing something the easy way, but that’s part of onboarding feedback.
Looking forward to day two.
Labbe