The cheaper “VR Friendly” version is just BumpOffset. The only other method is using POM with low iterations (<8). Once you drop POM to 1 or 2 offsets it basically turns into bump offset quality. You can limit the steps using a distance mask like “CameraDepthFade” and this can save a lot of perf.
Self-shadowing isn’t going to be feasible in VR unless you pre-bake the shadow response into the base color. I have taken the shadow output of the POM node and used it to re-bake the diffuse with minor shadow details, e.g. for pebbles. The only possible way to get that working in realtime is Contact Shadows, and I am not sure whether those work in VR.
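The baking step is just darkening the base color by the POM shadow term before it goes back into the texture. A minimal sketch, assuming the shadow output is 0 (fully shadowed) to 1 (unshadowed) and an intensity knob for how dark the bake gets — the struct and function names here are hypothetical:

```c
/* Hypothetical sketch of pre-baking a POM shadow term into base color.
   shadow: 0.0 = fully shadowed, 1.0 = unshadowed (POM node shadow output).
   intensity: 1.0 bakes the raw shadow, 0.0 leaves the color untouched. */
typedef struct { float r, g, b; } color3;

static color3 bake_shadow(color3 base, float shadow, float intensity)
{
    /* Lerp toward the shadowed color by the chosen intensity. */
    float s = 1.0f - intensity * (1.0f - shadow);
    color3 out = { base.r * s, base.g * s, base.b * s };
    return out;
}
```

Run this once per texel when authoring the texture and the shadow detail comes along for free at runtime.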
haha, it doesn’t have to be photorealistic, it just has to solve all the hardest edge cases of POM in the most performance-constrained environment
I have a curvature solution, but it requires baking out another unique texture for every mesh and doing a double texture lookup at every step, so it makes the effect 2x as expensive, which isn’t ideal for VR. Basically it needs to be able to read the transform at any point on the mesh… BUT this solution does not work with UV seams unless they actually connect, i.e. a pillar would work if the seam sits exactly on the 0→1 boundary.
The only solution that would work the way you want is some kind of prism-based rendering method, which is insanely slow, definitely too slow for VR. That is how Crytek does theirs, I believe. It basically means you render a little volume shaped like a triangular prism for every single poly on the mesh and overdraw them all. We don’t have any easy way of doing that in UE4 currently, and there are no plans to investigate a solution like that any time soon because it’s so expensive. The original paper on that method measured seconds per frame.
Automated curvature solutions are possible using quadratic approximations, but they assume constant curvature once the ray enters the surface and produce lots of artifacts, so I am not pursuing them. The only one likely to get any attention is the prebaked tangent map approach I mentioned above.