Hi all,
First time posting here, and I'm very new to Unreal, so my questions are a mix of general unfamiliarity with the engine and a fairly specific rendering need. Also not sure if this belongs in the forums or the AnswerHub…
I’m trying to create a walkthrough of an architectural project with a focus on an accurate representation of atmospheric daylighting. Indirect lighting is key, of course, and I’m more interested in approaching physically accurate rendering than in artistic control of the lighting. There might be some static artificial lighting in the building, but mostly I’m concerned with sunlight, which needs to be able to change direction. The geometry can almost certainly be completely static. I’d also like to use the Oculus Rift for experiencing the walkthrough, which has somewhat limited the tools I can use and is part of the reason for trying Unreal. It’s only going to run on one computer, so I’m not concerned about scalability. That said, I don’t have top-of-the-line hardware, so performance is unfortunately still part of the discussion.
I’ve had two thoughts about how to make this happen. Of course, the main goal is to get the above working; the questions below are only the ones I think I need to ask, and perhaps I’m headed in the wrong direction entirely.
The first option, I think, is to use LPVs (Light Propagation Volumes) for fully dynamic lighting. I’ve read the tutorials and gotten this working, more or less, but I have a few questions. Do LPVs use spherical harmonics? From what I could find, my basic understanding was that SH coefficients are something you can pre-compute and then quickly evaluate at run-time. However, in Unreal the lighting seems to be calculated as I move around; for example, corners are black when I turn to look at them and then gradually brighten over a few seconds. Is this because the geometry is assumed to be potentially dynamic? Is there a way to get better/more accurate lighting by somehow telling the engine that it doesn’t need to worry about the geometry changing? (Just shooting in the dark here.) Finally, are the default LPV settings the most “accurate”, or do the most physically accurate values for those controls depend on the scene?
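To make sure I’m asking about the right thing, here is my rough mental model of the “pre-compute, then evaluate cheaply at run-time” idea, as a toy C++ sketch. This is just my own illustration of 2-band spherical harmonics, not anything from Unreal’s LPV code, and it ignores the cosine convolution you’d want for actual irradiance:

```cpp
// Toy sketch (not Unreal's LPV code) of the "pre-compute once,
// evaluate cheaply at run-time" idea behind spherical harmonics lighting.
// Uses 2-band (4-coefficient) SH, the kind of compact representation
// LPV-style techniques store per grid cell.

#include <array>

struct Vec3 { float x, y, z; };

using SH4 = std::array<float, 4>;

// SH basis functions for bands 0 and 1, evaluated in unit direction d.
SH4 shBasis(const Vec3& d)
{
    return {
        0.282095f,          // Y_0^0
        0.488603f * d.y,    // Y_1^-1
        0.488603f * d.z,    // Y_1^0
        0.488603f * d.x     // Y_1^1
    };
}

// "Pre-compute" step: project a single directional light of given intensity
// into SH coefficients. A real bake would accumulate many samples/bounces.
SH4 projectDirectionalLight(const Vec3& lightDir, float intensity)
{
    SH4 basis = shBasis(lightDir);
    SH4 coeffs{};
    for (int i = 0; i < 4; ++i)
        coeffs[i] = basis[i] * intensity;
    return coeffs;
}

// "Run-time" step: reconstruct approximate radiance from direction n.
// Just a dot product of two tiny vectors, so it is very cheap per pixel.
float evaluateSH(const SH4& coeffs, const Vec3& n)
{
    SH4 basis = shBasis(n);
    float result = 0.0f;
    for (int i = 0; i < 4; ++i)
        result += coeffs[i] * basis[i];
    return result;
}
```

If that split is roughly right, then what confuses me is why the coefficients seem to be rebuilt gradually as I look around, rather than being fixed for static geometry.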
Another option I thought of was to bake the lighting in 3ds Max. Since the lighting changes but the path of the sun is predictable, I could create a video from a sequence of bakes throughout the day and then use this plugin to display that video as a texture on the building. Is that feasible? As far as accuracy goes, this would be equivalent to a Lightmass bake, I think? Would I then use environment probes (reflection captures) to create the reflections?
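To be concrete about what I mean by “a video from a sequence of bakes”, here is the kind of frame selection I imagine driving it (my own sketch with made-up numbers; the actual cross-fade would presumably happen in a material or in the plugin itself):

```cpp
// Toy sketch: given a sequence of lightmap bakes covering the day, map the
// current time of day to the two nearest bakes and a blend factor. The
// texture lookup / blend itself would live in the engine or plugin; this
// only shows the frame-selection math.

#include <algorithm>
#include <cstdio>

struct BakeSample {
    int   frameA;   // index of the earlier bake
    int   frameB;   // index of the later bake
    float alpha;    // 0 = fully frameA, 1 = fully frameB
};

BakeSample pickBakes(float hour, float sunriseHour, float sunsetHour, int numBakes)
{
    // Normalize the time of day into [0, 1] across the baked range.
    float t = (hour - sunriseHour) / (sunsetHour - sunriseHour);
    t = std::clamp(t, 0.0f, 1.0f);

    // Scale into "bake space" and split into two neighbouring frames.
    float f = t * (numBakes - 1);
    int   a = static_cast<int>(f);
    int   b = std::min(a + 1, numBakes - 1);
    return { a, b, f - static_cast<float>(a) };
}

int main()
{
    // e.g. 24 bakes from 6:00 to 20:00; which frames do we show at 13:30?
    BakeSample s = pickBakes(13.5f, 6.0f, 20.0f, 24);
    std::printf("blend bake %d and %d with alpha %.2f\n", s.frameA, s.frameB, s.alpha);
    return 0;
}
```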
Not sure if it’s OK to talk about other engines here, but a related question is whether Unreal is the best tool for this. It feels like there is a lot of Unreal that I’m not making use of (e.g. all interactivity except moving around), which makes me wonder whether Unreal’s focus is different from what I need. Maybe something specifically set up for static geometry, like Renderlights, would be better optimized? I haven’t had a chance to test that engine yet, though.