I’m working with photogrammetry scan data of a cave. I’ve set the scale in meters, and the FBXs drop into UE4 seamlessly, maintaining 70-100+ fps on a GTX 980 Ti with 17 million polys and some 36 4K texture maps. I’m managing placement and attenuation radius of the lights, though I’m not sure that matters yet: I can’t bake lightmaps because the photogrammetry UVs consist of many tiny UV islands, which is great for avoiding texture distortion but doesn’t work for lightmaps. So dynamic lighting it is (until I solve unwrapping clean UVs of these dense models into a second UV channel for lightmaps). I don’t see a problem with that as long as I’m maintaining frame rate, but thanks for a sanity check in case I’m missing something so far.
I’m trying to understand two things in UE4 about my meshes and how they relate to the lighting. I scale my cave scene to actual measurements before exporting the FBXs in meters. When I bring these into UE4 the model is stupid big; a point light cranked all the way up doesn’t touch it. I see how I can scale the model back down to a sweet spot where intensity and attenuation radius are workable for this many point lights. Since for the moment I’m forced to use stationary lights, and until my GPU can’t handle it, is there any reason not to ignore the red X’s (which I understand to indicate a limit reached in UE4’s ability to bake a lightmap if the light is set to Static)? If it looks good and the GPU is happy, can I stay this course?
My second question is about scale. Why would my FBXs import so huge? Perhaps my SfM app isn’t writing scale into the FBX the way UE4 expects? Does it even matter, given that I’m having to scale the meshes to bring my lights into range anyway?
My guess is that if you’re working in meters and the scale is too big, it’s probably coming in 100x too big, since UE4’s native unit is centimeters (1 Unreal unit = 1 cm).
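For what it’s worth, the arithmetic behind that guess is easy to sanity-check yourself. A minimal sketch (plain Python, with made-up measurements) of working out what Import Uniform Scale would restore true size, assuming you can measure one known feature of the cave both in the real world and in the unscaled import:

```python
# Sketch: derive the FBX Import Uniform Scale that restores true size.
# The numbers below are hypothetical; substitute your own measurements.

real_length_m = 12.5                      # surveyed length of a passage, in meters
real_length_uu = real_length_m * 100.0    # UE4 units are centimeters

imported_length_uu = 125000.0             # same passage measured in the Editor after import

import_uniform_scale = real_length_uu / imported_length_uu
print(import_uniform_scale)               # 0.01 here -> the classic meters-vs-centimeters 100x case
```

If the factor you get isn’t a clean power of ten, that would point at the SfM app writing its own unit scale into the FBX rather than a simple meters-vs-centimeters mismatch.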
As for lighting and performance, I say do whatever works well and looks good. The X on stationary lights means that more than four stationary lights are overlapping; the additional lights past that limit (smallest radius first) can’t be handled as stationary and won’t get the shadowing they’re supposed to have.
If you want things to perform well on systems that aren’t as good as the one you’re working on, it would be better to spend time on improving performance. Depending on the detail, you might be able to reduce the polygon count and texture resolution, and for lighting you would get better quality and performance if you can set up your lightmap UVs.
My content targets high-end system specs only, so I’m good there. I won’t walk away from optimization, but that’s still down the road; the main thing right now is for the client to see something that looks great as-is. I’m clear about the position and attenuation radius of that fifth light (the one with the smallest attenuation radius gets the first X), and it’s good to know I can ignore the X’s if I stay with dynamic lighting (again, in the short term, for demo purposes). Just to be clear: the X only concerns a static light’s ability to bake lightmaps; with stationary lights it’s dynamic lighting and whatever my GPU will support, and what I see in the Editor is what I get in packaging, yes?
As for scale: if I reduce to 0.1 on import, the range of point-light intensity and attenuation is marginally acceptable; 0.05 is too small (the lights become overly sensitive at the low end of their settings); 0.075 seems to be the sweet spot. Is there a setting I could use in 3ds Max to ensure my scale stays true? My client will value being able to make measurements in this cave somewhere down the road, and I’d greatly value true scale. A final thought on that: considering the behavior of the lights with respect to their range of settings, what might work best is if I could pick two known points in the set in UE4 and set the distance between them to establish scale. That way I could scale the meshes according to the lighting, but have the Location values remain congruent with the real world. I hope that makes sense. Thanks for your input.
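On the “pick two known points” idea, the arithmetic is the same as the import-scale check above: measure the two points in the Editor, divide the surveyed real-world distance by the in-Editor distance, and apply the result as a uniform scale on the cave mesh actor. Here is a rough sketch using UE4’s editor Python scripting; the actor labels and the distance are assumptions, and the unreal module calls come from the Editor Scripting Utilities plugin, so check them against your engine version before relying on this:

```python
import unreal

# Hypothetical: two marker actors placed on survey points in the level,
# plus the surveyed real-world distance between those points in the cave.
REAL_DISTANCE_M = 8.2                        # known real-world distance, meters
REAL_DISTANCE_UU = REAL_DISTANCE_M * 100.0   # UE4 units are centimeters

# Build a lookup of level actors by their Editor labels (labels are assumptions).
actors = {a.get_actor_label(): a for a in unreal.EditorLevelLibrary.get_all_level_actors()}
p1 = actors["SurveyMarker_A"].get_actor_location()
p2 = actors["SurveyMarker_B"].get_actor_location()
cave = actors["CaveMesh"]                    # the photogrammetry mesh actor

# Distance between the two markers as they currently sit in the Editor.
dx, dy, dz = p2.x - p1.x, p2.y - p1.y, p2.z - p1.z
editor_distance = (dx * dx + dy * dy + dz * dz) ** 0.5

# Uniform scale that makes the in-Editor distance match the surveyed distance.
scale = REAL_DISTANCE_UU / editor_distance
cave.set_actor_scale3d(unreal.Vector(scale, scale, scale))
print("Applied scale factor:", scale)
```

This only restores true scale on the mesh; whether the light intensity and attenuation ranges feel workable at that scale is a separate tuning question, as you noted.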