As designed, camera fade (using the Set Manual Camera Fade and Start Camera Fade nodes) doesn’t work without Mobile HDR.
There are known workarounds, like placing a transparent plane/cube in front of the camera or using HUD widgets, but they cause overdraw, which hurts performance. We’re targeting Gear VR, so every additional draw call counts, and these hacks cause full-screen overdraw of every fragment on screen.
What we need is simply the ability to multiply the texture containing the rendered scene by a given color - for the GPU this costs practically nothing, and it would let users implement cheap scene tinting and fades.
8 months later, and I’m still looking for a good solution to this. I’m also making a game for the Gear VR, and right now there’s no way to fade the camera without enabling HDR, which simply isn’t a good option (maybe that will change when Vulkan gets pushed via Nougat?). In the meantime I’m trying to figure out how to even get an actor to fade its opacity, just to have something that works. Hopefully we get a decent solution soon.
I haven’t tried this yet, but I plan on creating a master material that all other materials in the scene derive from, and adding a fade parameter that multiplies the base and emissive colors.
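For reference, here’s a rough C++ sketch of how that global fade parameter could be driven through a Material Parameter Collection, so every material derived from the master picks it up at once. The actor class, asset, and parameter names ("FadeController", "Fade") are all hypothetical; UKismetMaterialLibrary::SetScalarParameterValue is the real engine call for updating a collection parameter.

```cpp
// FadeController.h -- hypothetical helper actor; names are illustrative.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Kismet/KismetMaterialLibrary.h"
#include "Materials/MaterialParameterCollection.h"
#include "FadeController.generated.h"

UCLASS()
class AFadeController : public AActor
{
    GENERATED_BODY()

public:
    // Collection asset holding a scalar parameter named "Fade";
    // assign it in the editor.
    UPROPERTY(EditAnywhere, Category = "Fade")
    UMaterialParameterCollection* FadeCollection = nullptr;

    // 1.0 = scene fully visible, 0.0 = fully black.
    UFUNCTION(BlueprintCallable, Category = "Fade")
    void SetSceneFade(float FadeAlpha)
    {
        if (FadeCollection)
        {
            // Every material derived from the master material reads this
            // value through a CollectionParameter node and multiplies it
            // into its base and emissive colors.
            UKismetMaterialLibrary::SetScalarParameterValue(
                this, FadeCollection, TEXT("Fade"), FadeAlpha);
        }
    }
};
```

The downside of this approach is that every material in the scene has to opt in to the multiply, so anything that misses the master material (third-party assets, sky, UI) won’t fade.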
One option is to use Color Scaling, which is in the Oculus Function Library. The Async Timewarp Compositor can take this scale value and multiply each pixel by it. So you can set up a Blueprint with a Linear Color and a Timeline with an alpha float, play it forwards and backwards, and set the color scale value on each update. I can confirm this works on the Oculus Go, and you do not need to have any compositor layers specified (see the Oculus Layer Example); it applies to all pixels about to be rendered.
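For anyone who prefers C++ over the Blueprint/Timeline setup, here’s a minimal sketch of the same idea. It assumes the OculusVR plugin’s UOculusFunctionLibrary::SetColorScaleAndOffset (the C++ side of the color-scale call described above, available in recent UE4 plugin versions); the Timeline is replaced with simple Tick interpolation, and the class and variable names are illustrative.

```cpp
// ColorScaleFader.h -- hypothetical actor that fades the whole view by
// scaling the color of every rendered pixel in the compositor.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "OculusFunctionLibrary.h"
#include "ColorScaleFader.generated.h"

UCLASS()
class AColorScaleFader : public AActor
{
    GENERATED_BODY()

public:
    AColorScaleFader() { PrimaryActorTick.bCanEverTick = true; }

    // Seconds for a full fade in either direction.
    UPROPERTY(EditAnywhere, Category = "Fade")
    float FadeDuration = 1.0f;

    // Call with true to fade to black, false to fade back in.
    UFUNCTION(BlueprintCallable, Category = "Fade")
    void StartFade(bool bFadeOut) { bFadingOut = bFadeOut; }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        // Move Alpha toward the target end of the fade.
        const float Step = DeltaSeconds / FadeDuration;
        Alpha = FMath::Clamp(Alpha + (bFadingOut ? -Step : Step), 0.0f, 1.0f);

        // The compositor multiplies every pixel by this scale, so the
        // fade costs no extra draw calls and no overdraw.
        UOculusFunctionLibrary::SetColorScaleAndOffset(
            FLinearColor(Alpha, Alpha, Alpha, 1.0f),   // ColorScale
            FLinearColor::Transparent,                 // ColorOffset (zero)
            /*bApplyToAllLayers=*/ true);
    }

private:
    float Alpha = 1.0f;   // 1 = fully visible, 0 = black
    bool bFadingOut = false;
};
```

A Timeline as described in the post gives you easing curves for free, but the linear Tick interpolation above is enough to verify the technique on device.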