Rendering High Quality Backgrounds on a Mobile VR Project

I have a workflow question about rendering high quality skies and distant backgrounds on a project for mobile VR (Quest 2).

My base project is set up for VR and optimized for performance: Forward Rendering, no HDR, no post-processing, baked lighting, etc. I would like to render some skies and distant scenery as cubemaps to use as static backgrounds in the game, and ideally I’d like to render them with high quality settings enabled: bloom, atmospheric fog, HDR, high quality anti-aliasing, and possibly Lumen and Nanite.

My question is about workflow and whether it’s possible to have a single project where I could switch between these two quality configurations. Since the backgrounds would use the same assets as the game, I’d rather not maintain two separate projects, given the confusion of having to keep them synchronized and up to date. Preferably, I’d like to build and render the backgrounds in the same project as the main game.

How do people typically handle this?

I’m wondering if it would be possible to, say, have two sets of ini config files for the two quality levels. I realize switching between them might be a bit slow, since the engine would have to recompile shaders and that sort of thing. Another approach could be scalability settings, but I’m not very familiar with those.

Any ideas on what might be a good workflow to render high quality stills and keep everything in one project, or to keep two projects synchronized with one another?

You need to look into parallax.

If you set up a couple of spheres and the code to move them correctly, you can generate a nearly accurate long-distance panorama from an image.

It’s a bit unrealistic when you move towards it, though, and there isn’t much you can do about that, as you’d need to change the perspective of the image based on distance.
You can achieve something by letting it zoom in, so to speak, but it’s still noticeable, probably more so in VR.
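Roughly, how much each sphere should follow the camera comes out of simple geometry. Here’s a small sketch of just the math (not engine code; the layer names, radii and distances below are made-up examples):

```python
# Rough illustration of the "follow factor" for background sphere layers.
# Assumption: a sphere of radius R centred on the player stands in for scenery
# at a real distance D. Moving the sphere with the camera by a factor of
# (1 - R / D) reproduces, to first order, the angular shift the real scenery
# would show when the camera moves sideways.

# name: (sphere_radius_m, represented_distance_m), made-up example values
LAYERS = {
    "mountains": (500.0, 5_000.0),
    "clouds":    (800.0, 20_000.0),
    "sky":       (1_000.0, float("inf")),
}

def layer_offset(camera_delta_m: float, sphere_radius_m: float, distance_m: float) -> float:
    """How far a sphere layer should translate when the camera moves camera_delta_m."""
    follow_factor = 1.0 - sphere_radius_m / distance_m  # 1.0 means glued to the camera
    return camera_delta_m * follow_factor

if __name__ == "__main__":
    camera_delta = 10.0  # the player walks 10 m sideways
    for name, (radius, distance) in LAYERS.items():
        print(f"{name:9s} moves {layer_offset(camera_delta, radius, distance):6.2f} m "
              f"(follow factor {1.0 - radius / distance:.3f})")
```

The sky layer at effectively infinite distance gets a factor of 1.0, i.e. it stays glued to the camera like a normal sky sphere, while closer layers lag behind slightly, which is what produces the parallax. Note the factor only accounts for sideways angular shift; it does nothing about apparent size, which is why walking towards the backdrop still looks off.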

As far as the workflow for baking goes:

You can have as many levels as you want, and you can manually select which levels get packaged in the game (Project Settings > Packaging > list of maps to include in a packaged build). So if you take the time to set up a level just for getting your sphere map out, you can then tell the engine not to cook it into the release build.

Personally, though, I’d do completely separate projects, mostly because I’d want custom project settings with rendering-specific options that won’t be available in the forward-rendered VR project.

As far as synchronization goes, you need to manually (or automatically) copy the output of the render into the VR project.
You can do this in a lot of ways: a custom NPM package to keep the files synced, version control, or rsync. Or just do it manually, which is probably the safest way to make sure you have exactly what you want available.
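If you do script it, it doesn’t need to be anything fancy. A minimal sketch (all paths below are hypothetical placeholders, adjust to your own layout):

```python
# Minimal sketch: copy exported cubemaps from the "render" project into the
# VR project's Content folder. All paths here are hypothetical placeholders.
import shutil
from pathlib import Path

SRC = Path(r"D:/Projects/SkyRenderProject/Export/Cubemaps")  # where the renders land
DST = Path(r"D:/Projects/VRGame/Content/Backgrounds")        # VR project's Content subfolder

DST.mkdir(parents=True, exist_ok=True)
for asset in SRC.glob("*.*"):
    shutil.copy2(asset, DST / asset.name)  # copy2 keeps timestamps, handy for spotting stale files
    print(f"copied {asset.name}")
```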

PS:
Why sphere vs cube?
Because sphere mapping won’t distort a mountain range sitting around the middle of the sphere at all, and it’s easier to set up so it fits something similar to the BP_SkySphere.

Thanks for the feedback @MostHost_LA

Neat! I’ve never heard of using parallax effects for static backgrounds. If you know of any documentation for achieving this effect, I’d be interested.

Regarding baking, I just tried something hacky and it seems to work: I overwrote the ini files in my VR project with a set of new ones from a Film and Video project. I was able to launch the project with all the high quality render settings enabled, and Forward Rendering can be disabled.

Here is what I did:

  • Made a new project with high quality render settings (from the Film and Video preset).
  • Moved all of the config ini files from the VR project to a temp folder.
  • Copied the ini files from the Film and Video project to the VR project.
  • Deleted the Intermediate and Saved folders from the project root.

It seems like I can now switch between the two quality settings by swapping out the ini files.
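In case it’s useful to anyone, the swap is easy to script. A rough sketch, assuming two saved config sets in made-up folders next to the project (close the editor before running it):

```python
# Sketch: swap between two saved sets of Config/*.ini files and clear the
# derived folders so the editor rebuilds with the new settings.
# Folder and project names are placeholders, adjust to your own layout.
import shutil
from pathlib import Path

PROJECT = Path(r"D:/Projects/VRGame")
CONFIG_SETS = {
    "vr":   PROJECT / "ConfigSets" / "VR",            # forward rendering, no HDR, etc.
    "film": PROJECT / "ConfigSets" / "FilmAndVideo",  # bloom, HDR, Lumen, etc.
}

def activate(set_name: str) -> None:
    src = CONFIG_SETS[set_name]
    dst = PROJECT / "Config"
    shutil.rmtree(dst, ignore_errors=True)
    shutil.copytree(src, dst)
    # Wipe derived data so nothing from the previous settings lingers.
    for folder in ("Intermediate", "Saved"):
        shutil.rmtree(PROJECT / folder, ignore_errors=True)
    print(f"Activated '{set_name}' config set.")

if __name__ == "__main__":
    activate("film")
```

The first launch after a swap triggers a long shader recompile, as expected.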

Regarding Sphere vs Cube, I used a Scene Capture Cube and a Cube Render Target to capture the background and it seems pretty flawless in terms of distortion when used in the material. One thing to note is that I’m using the Camera Vector instead of UV coords to map the image, so the image is mapped cleanly regardless of the skybox mesh topology.
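For reference, the capture itself can also be kicked off from an editor Python script along these lines. This is only a sketch: the asset path and location are placeholders, it assumes the Python Editor Script Plugin is enabled, and the property/function names are from the editor scripting API as I understand it, so double-check them:

```python
# Sketch: spawn a Scene Capture Cube and capture into a Cube Render Target
# from the editor's Python console. Asset path and location are placeholders.
import unreal

# A TextureRenderTargetCube asset created beforehand in the Content Browser.
rt_cube = unreal.load_asset("/Game/Backgrounds/RT_SkyCapture")

# Spawn the capture actor at (or near) the spot the player will stand.
capture_actor = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.SceneCaptureCube, unreal.Vector(0.0, 0.0, 180.0), unreal.Rotator(0.0, 0.0, 0.0))

capture_comp = capture_actor.get_component_by_class(unreal.SceneCaptureComponentCube)
capture_comp.set_editor_property("texture_target", rt_cube)
capture_comp.capture_scene()  # fills the render target with the cube capture
```

From there the render target can be baked down to a static cubemap asset (right-click the render target in the Content Browser > Create Static Texture, if I remember correctly) and sampled in the sky material with the Camera Vector as described above.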

Some useful stuff on skyboxes in UE:
https://www.stevestreeting.com/2021/04/06/skyboxes-in-ue4/