Virtual Reality and Unreal Engine: Where We Are & Where We're Going - Apr. 16, 2015

It’s been a long while since we’ve discussed VR on the stream, and with our new video equipment we’re able to virtually drag our Lead VR Programmer, Nick Whiting, into the studio from Seattle. Also joining us is Sr. Designer Nick Donaldson, who has been instrumental in all of the VR initiatives at Epic as well. Hang out with us to hear about the current VR capabilities in Unreal Engine, get a quick look at what’s ahead, and learn how you can get started developing your own great VR experiences today. As always, we’ll be taking questions here and on the stream, so if VR of *any* sort is your thing, you won’t want to miss it!

Thursday, April 16th @ 2PM ET - Countdown


Chance Ivey - Community Manager - @iveytron
Nick Donaldson - Sr. Designer
Nick Whiting - Lead Programmer, VR

Questions about VR? Drop ’em below!

Edit: The YouTube archive is available here

Hey Chance!

Here are a few questions I have regarding UE4 and VR.

Not sure if this is currently supported, but if not, are there any plans to add VR Editor support so we can actually edit with the headset on? Right now it’s kind of a pain to go back and forth between the headset and the screen to see how changes look in the headset.

I also noticed that (with the DK2 at least) it does not seem to render refractive materials, but if you switch to a different render mode and then back to Lit, it will render them for a few moments. Is this an engine bug or something to do with the DK2 software, and will it be addressed any time soon?

Hello Chance
I’m really looking forward to the VR stream!
A question: could the stream please shed some light on the state of, and plan for, UE4 integration of the Oculus Audio SDK? Especially the use of, or need for, second-party middleware.
And another question: what is the state of DX12 and VR, and what is the timeline? (Will there be a branch of UE4 for the Windows 10 preview with the DX12 drivers working for VR?)

Hi Chance,
A few questions for the guys:

  1. Have the suggestions from Valve’s Alex Vlachos’s GDC 2015 presentation (talk here, slides here) been incorporated into UE4, or will they be in future? In particular, the use of a stencil mesh/optimized warp mesh to stencil out the pixels you can’t actually see through the lenses, which he cites as giving a ~30% performance improvement and which would obviously be incredibly useful for VR.

  2. Will we see expanded Blueprint objects giving easy access to currently unavailable data (e.g. the player’s real-world height, gender, IPD, etc. from the Oculus config utility), and to data which currently has to be derived but would be more useful to access directly (e.g. the world-space location of the player’s eyes, the point midway between them, and the location of that point when the HMD was last recentered, along with the world-space orientation of the HMD and the world-space location and orientation of the tracking camera)?

  3. Are there any planned improvements to the rendering of mirrors in VR? (or if there is a way to get these working perfectly already, could we get a content example?)

  4. Are we going to see some simple official VR templates with suitable rendering settings and examples of options for things like independent head/body facing, getting in and out of vehicles, 3D UI, etc.?

Many thanks,

Dedicated VR hardware is expensive and uncommon; the ‘invention’ of Google Cardboard has put virtual reality within reach of a whole lot more people.
The major drawback of this development is that UE4’s Android performance and compatibility leave a lot to be desired.
So the question:
are there any plans to improve this very important area, and if so, will it be within a reasonable timeframe?

Thanks, I hope somebody can answer.

Hyped for this stream! I have a few questions concerning Gear VR specifically. I am using UE4 for the Mobile VR Jam so any wisdom you can impart is much appreciated.

  1. Lighting
    Can you go over what you’ve learned about lighting and Gear VR? There are three areas I’m curious about:
  • Fake lighting. Any tips for achieving nice results through material trickery alone?

  • Static lighting. Lightmass has been a pain point for our team. Any general tips or considerations for mobile?

  • Dynamic lighting. Do we have this available on Gear VR? Is it feasible to combine small instances of dynamic lighting alongside static and/or fake?

  2. Profiling
    Can you explain some best practices for determining performance bottlenecks on Gear VR?
    Most of the ‘stat’ commands are unreadable. So far we have been using ‘logcat’ wirelessly to watch debug messages, and dabbling with ‘stat startfile / stopfile’ to get a profile dump from the device.

  3. Performance
    Any particular performance recommendations to maintain 60 FPS, for example drawcalls, number of triangles/verts, etc? The Oculus Mobile SDK has some guidelines but they seem specific to other engines and may not be applicable. For example, we don’t have explicit batching in UE4, which makes these numbers harder to follow. It may be worth mentioning the CpuLevel and GpuLevel settings.

  4. Transitions
    What is the best way to transition from Splash Screen -> Main Menu -> Level 1 with minimal hiccups from the engine? I assume we should be using level streaming for this, but I don’t have much experience in that area.

Very hyped for this stream!

Not necessarily about VR, however, it will, or should, be extremely important for VR in general:

  • DX12 support?
  • Build for Win7/8/10 rather than Vista… is there a reason behind this madness?
  • Can we please, pretty please, get 64bit architecture w/o having to do a full engine build?
  • Real-time dynamic (insert xyz here, namely lighting) improvements / optimization

Now, with those being out of the way, here are some questions directly about VR:

  • UMG not playing nice with VR, is there an ETA for fixing this?
  • It seems that most PC/console HMDs will be going with dual-screen displays… will we start seeing optimizations for this?
  • Render viewport (or even the entire engine? :wink: ) in VR
  • Support for dual, or should I say tri, rendering… one view on the screen, another in the HMD
  • VR Template ETA
  • A player controller Blueprint/class/whatever in the “Starter Content”
  • Is there a chance for a “Camera to Texture” Blueprint to allow easy access for a camera pass-through?
  • There seems to be a scaling/perspective issue: basically, if someone is, say, 6 ft tall and the character is 5 ft tall, that person will perceive everything as big, and the opposite holds true as well. Another issue is perceiving scale while sitting vs. standing… is there a way around this? The Oculus desk demo does a good job of this. How do we fix individual perception of scale?
  • LiquidVR implementation? expected performance?
  • Oculus audio?
  • Normal maps don’t hold up when we are near an object; what is the most performant way to avoid flat-looking surfaces in VR?

Bonjour!
It’s now possible to capture a 360 image.
Any plans for 360 sequences?
And beyond that: 360 stereoscopic sequences?

Cheers, see you tomorrow !

  1. Are there any plans to make profiling in VR easier? i.e. profiling data shown on the desktop as opposed to 6″ from your face in VR.

  2. Due to the downsampling involved in VR, which leads to a massive G-Buffer: are there any plans for a forward renderer?

  3. Any other plans for rendering improvements, such as a unified camera?

Looks like almost everyone is concerned about performance in one way or another.

Hi, my question: VR controllers such as the Vive’s and the Sixense. When will UE4 include the plugins as a standard part, like the Rift support?

Thank you

Awesome, Can’t wait for this.


1. I understand the transition to VR takes work, and that we’re all still improving on the implementations. But is there any particular limitation in Unreal Engine 4 in regards to VR, or are the general limitations the same as anywhere else in Unreal Engine 4? I would just like to understand where I can cheat and where I can’t: performance vs. fidelity.

2. People want to play back footage and audio within UE4. That is currently possible, but will there be any push for 360 stereoscopic VR support?

3. 3D position tracking grants 3D positional audio, but will there be a push for 3D positional audio specifically for VR? As an example, physical recording rigs create binaural audio by accounting for head shadow (the head augments sound waves) and ear separation. Is there anything that can be done specifically?

4. Also, I understand Unreal Engine 4 can only read MP4 at 25 FPS. Is there any general limit for video playback? If I wanted to play 7K 360 footage recorded at 75 FPS in a VR game environment, what kind of performance hit am I looking at?
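On question 3: one of those binaural cues is cheap to model directly. As a minimal sketch (this is not how any particular audio SDK does it, and the head radius is just an assumed average), the Woodworth approximation gives the interaural time difference for a distant source:

```python
import math

# A minimal sketch of one binaural cue: interaural time difference (ITD)
# via the Woodworth approximation. Head radius is an assumed average.
HEAD_RADIUS = 0.0875    # metres
SPEED_OF_SOUND = 343.0  # m/s

def itd_seconds(azimuth_rad):
    """Arrival-time difference between the two ears for a distant
    source at the given azimuth (0 = straight ahead)."""
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (azimuth_rad + math.sin(azimuth_rad))

# Source directly to one side gives the maximal delay, roughly 0.66 ms:
print(round(itd_seconds(math.pi / 2) * 1000, 2))  # → 0.66
```

Head shadow (frequency-dependent attenuation) is the harder half; full solutions use measured HRTFs rather than closed-form approximations like this.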


1) Have you considered optimizing the engine for VR by rendering parts of the scene that are far away, and therefore have no stereo disparity, only once? I believe that in big open worlds like the “kite demo” this technique could provide massive savings.
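For a rough sense of why this could pay off, here is a back-of-envelope sketch of where disparity becomes negligible; all of the numbers are assumptions (roughly DK2-class optics), not engine values:

```python
import math

# Assumed, DK2-class figures; tweak for your HMD.
ipd = 0.064          # interpupillary distance in metres
h_res_per_eye = 960  # horizontal pixels per eye
h_fov_deg = 100.0    # horizontal field of view per eye, degrees

px_per_radian = h_res_per_eye / math.radians(h_fov_deg)

def disparity_px(distance_m):
    """Approximate on-screen stereo disparity, in pixels, of a point
    at the given distance straight ahead (small-angle approximation)."""
    return (ipd / distance_m) * px_per_radian

# Distance beyond which disparity drops below a single pixel:
mono_distance = ipd * px_per_radian
print(round(mono_distance, 1))  # → 35.2
```

Beyond roughly that distance a point shifts by less than one pixel between the two eyes, so rendering the distant scene once and sharing it between eyes should be visually lossless.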

2) Is there currently a way to enable asymmetric gameplay using only one PC? For instance, having one player in FPS mode inside VR using a controller, while a second one watches the action on a monitor using mouse and keyboard? If not, would you consider adding this option?

Thanks a lot! Looking forward to this stream!

A few questions :slight_smile:

  1. What are the plans for a dedicated/optimised VR rendering path? Other performance related improvements?
  2. Native integration of the Oculus 3D Audio SDK? (i.e. not as an FMOD plugin)
  3. What improvements are being worked on for GearVR?
  4. Official GearVR gamepad support in UE4?
  5. Official Leap Motion plugin availability for Mac?

Really looking forward to the stream!

I am also interested in **Performance** in VR.
300 fps non-VR sometimes drops to ~60 fps in VR.
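For what it’s worth, part of that drop is just arithmetic. A back-of-envelope comparison with assumed numbers (a 1080p/60 monitor vs. a 75 Hz DK2-class HMD with a ~1.35x per-axis render-target scale for the distortion pass; real scales vary per title):

```python
# Pixel throughput: flat monitor vs. a VR headset (all figures assumed).
monitor_px_per_s = 1920 * 1080 * 60  # 1080p at 60 Hz

oversample = 1.35  # per-axis render-target scale for the distortion pass
vr_px_per_s = 1920 * 1080 * oversample**2 * 75  # both eyes together, 75 Hz

print(round(vr_px_per_s / monitor_px_per_s, 2))  # → 2.28
```

And beyond raw pixel count, the scene is submitted twice (once per eye) and vsync caps you at the HMD’s refresh rate, so the gap widens further in practice.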

Looking fwd to this stream.

Most questions have been asked already. Many thanks again to JJ and Dave for making the engine “ready” for the Jam.
…can we expect a 4.7 “hotfix” or update soon that will deliver all this via the launcher?

Carmack and Valve have recently spoken on the importance of good anti-aliasing for VR, and specifically mentioned a preference for MSAA using forward rendering.

UE4 currently has two desktop AA solutions that I know of: FXAA, which does not give good quality, and temporal anti-aliasing, which is prone to errors (movement from pixel shaders that can’t be accounted for by the temporal process, movements which are accounted for but are too fast, and translucent surfaces).

Are there any plans to improve temporal anti-aliasing to match the quality of MSAA in real-world use? Are there any current plans (I think it was discussed last year?) to allow for forward rendering on desktop?

Cool, thanks again for doing this. Lots of interesting things coming up on 4.8 and beyond.

Feeling good about choosing UE4 for VR development! :slight_smile: