This is a hard one to explain, so I tried to capture it in a screen recording. I've noticed this before in Unreal projects: when you do a light bake, the reflections don't always behave as they should as you move through the scene. I know real-time engines aren't always physically correct, but I thought maybe someone could give me some tips to improve this result. I can't use raytracing in this project since it's an Android build for VR. The distracting thing is that, depending on the view, the reflection on the floor makes the chair and other objects appear to be floating and gamelike, because the reflection washes out the shadow. Hopefully this makes some sort of sense.
I’d just increase the roughness on the floor material, actually
What you are looking for is known as specular occlusion. I know that distance field AO offers some specular occlusion, which you can read about here: Distance Field Ambient Occlusion in Unreal Engine | Unreal Engine 5.0 Documentation
I don’t know off the top of my head whether baked lighting offers specular occlusion, but I do know that specular occlusion can also be achieved through a pre-computed “bent normal map”.
Bent Normal Maps | Unreal Engine 4.27 Documentation
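To make the idea concrete, here's a rough sketch (in Python, just for illustration) of how bent-normal specular occlusion is commonly approximated. The function name and the roughness blend are my own assumptions, not Unreal's actual shader code: the bent normal stores the average unoccluded direction, and the reflection vector is tested against it, so reflections pointing into blocked geometry get darkened.

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def specular_occlusion(bent_normal, reflection, ao, roughness):
    """Hypothetical approximation of bent-normal specular occlusion.

    The reflection vector is compared against the bent normal (the
    average unoccluded direction). Reflections pointing away from the
    open hemisphere get darkened. Rough surfaces lean on the broad AO
    term; glossy surfaces lean on the directional test.
    """
    alignment = max(dot(normalize(bent_normal), normalize(reflection)), 0.0)
    # Blend between directional visibility and plain AO by roughness.
    return alignment * (1.0 - roughness) + ao * roughness

# A floor point under a chair: the bent normal tilts away from the
# blocked region, so a reflection ray aimed straight up is dimmed
# compared to a fully open point.
open_point = specular_occlusion((0, 0, 1), (0, 0, 1), 1.0, 0.2)
shadowed = specular_occlusion((1, 0, 1), (0, 0, 1), 0.5, 0.2)
```

This is exactly the failure mode in the screenshot: without the occlusion term, the shadowed floor under the chair still receives the full reflection capture, so the object looks like it's floating.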
I assume you’re using the forward renderer since you say this is an android VR project. By default, the forward renderer uses the nearest reflection capture and projects it at an infinite distance from the camera. Obviously this is going to look like crap on any glossy surface.
You can enable “high quality reflections” in your floor material to get parallax-corrected reflections, but they’re still not going to be amazing quality and it will cost more. For higher-end hardware you could use a planar reflection to get perfect reflections, but that’s most likely going to be way too heavy for mobile VR.
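For anyone curious what the parallax correction actually does, here's a sketch (in Python, just to show the math; the function name and box-shaped capture volume are my assumptions) of the common box-projection scheme: instead of sampling the cubemap by the raw reflection direction as if it were infinitely far away, the reflection ray is intersected with the capture's bounds and the lookup direction is rebuilt toward that hit point, so the reflection "sticks" to the room.

```python
def box_project_reflection(pos, refl, box_min, box_max, capture_pos):
    """Parallax-correct a reflection direction against a box-shaped
    capture volume (a common scheme; a sketch, not Unreal's code).

    pos:         shaded point, inside the box
    refl:        reflection direction (components may be zero)
    box_min/max: capture volume extents
    capture_pos: position the cubemap was captured from
    """
    # Distance along the ray to each slab of the box, per axis.
    ts = []
    for i in range(3):
        if refl[i] > 0:
            ts.append((box_max[i] - pos[i]) / refl[i])
        elif refl[i] < 0:
            ts.append((box_min[i] - pos[i]) / refl[i])
        else:
            ts.append(float("inf"))
    t = min(ts)  # nearest box face the ray exits through
    hit = tuple(pos[i] + refl[i] * t for i in range(3))
    # Sample the cubemap toward the hit point instead of the raw direction.
    return tuple(hit[i] - capture_pos[i] for i in range(3))

# A point at the origin reflecting straight up in a 10-unit box:
# the corrected lookup aims at the ceiling hit point, not at infinity.
corrected = box_project_reflection(
    (0, 0, 0), (0, 0, 1), (-5, -5, -5), (5, 5, 5), (0, 0, 0)
)
```

Without this correction (the forward renderer's default), the lookup direction never changes as you walk around, which is why the reflection appears to slide across glossy floors.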
Your best option is to just do what @ClockworkOcean suggested and make the material rougher. At the end of the day you’re just asking a lot of the engine for the hardware you’re targeting.
Hello ClockworkOcean, for some reason I am not getting notifications for responses in my post, so I apologize for not responding earlier. I will try this, but I am not sure it takes care of the problem. I think it is like Arkiras is saying: I am expecting a lot from the engine when doing something for VR. I get much better and more accurate results in raytraced projects because of the more accurate calculations.
Hello Arkiras, your assumptions are correct about how I am using this. I may look into playing around with some of the things you mentioned, but at the end of the day I think there are some things I will just need to live with.
Thank you BananableOffense. I was not aware of the terms you mentioned, so I am checking into them to see if there is anything I can tweak to help address this.
Here’s another example of specular occlusion using a bent normal map. Notice how without the occlusion many areas appear excessively specular which causes them to look lit when they shouldn’t be. This is extremely common in non-raytraced graphics.
There are probably other ways to achieve it besides those I mentioned, too.
Oh wow, that is crazy! I did not even know what bent normals were. I will need to check this out. There is just too much to learn but each day affords a new opportunity to learn something else.
To be fair, I don’t think bent normals are widely known or used, even though they can be effective. They do require a specialized texture and software to compute them.
But even if you go another route like distance field AO, it’s still a good illustration of what specular occlusion is and why it’s important - and perhaps another reason as to why raytracing is the future.