You mean the indirect lighting cache? Yes, it works great with characters, but it can’t be updated at runtime, and it only works with the Lightmass workflow.
I don’t know what it’s called in UE4. In UDK it was called Light Environments: while it only works with static lighting, it is a dynamic effect that updates constantly.
Yes, but then global illumination is my favorite area of research, and lighting is really the missing component in modern games. Sure, animation could be better, but simulation tools that just require more brute force are already out there; nothing to do about that. Materials are already as good as or better than offline CG had a decade ago; there are quite good approximations of everything except anisotropic materials running in realtime. Texture resolution and mesh quality are already beyond what was available a decade ago in CG: Gollum had 5k polygons in The Two Towers, while main characters in games can reach over 100k polys now.
Even image quality, e.g. anti-aliasing and so on, is getting pretty good. But lighting is awful, and remains so. For example, this was created years ago:
Half Life 2 models, and textures, and animations, but all composited correctly with nice offline lighting. Global illumination is hard, very hard. It’s easy to say “oh, you won’t notice this artefact” or shadow acne, or overly hard shadow edges, or that diffuse lighting is too low frequency to require a high frequency solution. But heck, just look at what path tracing can do for Minecraft of all things:
The truth is, the most noticeable difference between Hollywood CG and games today is lighting. They can use many lights, or path tracing, and so on, while games get to research, for example, a hybrid pathtracer/many-lights solution (DFGI) and hope it will eventually look good enough and run fast enough for a single short-range bounce, usable only in the right environment.
Unfortunately, for that reason, most R&D people I’m acquainted with don’t do much GI. Not that I can blame them: re-writing motion blur for the fifth time is straightforward and provides improved, usable results. Trying out a new advanced lighting/GI method takes a long time and usually ends up with a result that’s too slow, or shows too many ugly artefacts, or has too many restrictions on use, or some combination of those and more.
I know I’m late to the party but I wanted to add something here:
It is in fact a good solution for open world games. MGSV uses IBL probes to light the environments cheaply enough to be usable on last-gen consoles. They capture the cubemaps at different times of day and then interpolate between them as time passes in the game.
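The time-of-day interpolation could be sketched roughly like this. This is a hypothetical illustration, not MGSV’s actual code: `PickCubemaps` and the hour-keyed capture layout are assumptions, but the idea is just picking the two captures bracketing the current time and computing a blend factor, wrapping around midnight.

```cpp
#include <cassert>
#include <cmath>

// Which two prerendered cubemap captures to sample, and how to blend them.
struct CubemapBlend {
    int IndexA;   // capture at or before the current time
    int IndexB;   // next capture (wraps past midnight)
    float Alpha;  // 0 = fully IndexA, 1 = fully IndexB
};

// CaptureTimes must be sorted, in hours [0, 24); blending wraps at midnight.
CubemapBlend PickCubemaps(const float* CaptureTimes, int NumCaptures, float TimeOfDay)
{
    // find the last capture at or before TimeOfDay
    int A = NumCaptures - 1;  // default: wrap back from the last capture
    for (int i = 0; i < NumCaptures; ++i)
        if (CaptureTimes[i] <= TimeOfDay)
            A = i;
    int B = (A + 1) % NumCaptures;

    float Span = CaptureTimes[B] - CaptureTimes[A];
    if (Span <= 0.0f) Span += 24.0f;  // interval wraps past midnight
    float Elapsed = TimeOfDay - CaptureTimes[A];
    if (Elapsed < 0.0f) Elapsed += 24.0f;

    return { A, B, Elapsed / Span };
}
```

With captures at 06:00, 12:00 and 18:00, a game time of 09:00 blends the first two captures at 50%, and 21:00 blends the 18:00 capture 25% of the way toward the next morning’s.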
I tried to do this in CryEngine but you couldn’t blend between two cubemaps at runtime. It can be done in UE4 but you have to manually create all the texture assets which is a pain in the *** so obviously nobody does it.
ARK has something like that: http://www.twitch.tv/unrealengine/v/7165456 at around 46:00.
Epic really just needs to improve its workflow; not to mention it would take much less effort than a real dynamic GI solution.
So apparently it’s helpful when lighting open-world environments. UE4 should pick it up.
I second that.
In fact you can update one probe at a time, every frame or every X seconds (you can already do that with a custom C++ class, without modifying the engine code).
They could go even further and subdivide the probe capture function so that its work is spread over multiple frames: every frame the engine does a small bit of processing, recapturing the probes little by little but continuously. This would be a low-impact solution that guarantees everything is updated at least every X seconds.
To make it a bit more precise, probes could be updated more or less frequently based on a priority, which would be defined by how close the probe is to the player.
All this would allow dynamically changing lighting (i.e. time of day), even with a changing skylight capture, as well as lighting that changes based on whatever happens in the game (drastically changing weather, closing all the windows in a room, starting a fire in a city, generating new geometry, etc.).
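The time-sliced, priority-based refresh described above could be sketched like this. Everything here is hypothetical (not real UE4 API), and the priority heuristic balancing staleness against distance is an assumption; the point is only that each frame a small fixed number of probes gets recaptured, so the per-frame cost stays bounded while every probe is still refreshed eventually.

```cpp
#include <vector>
#include <algorithm>
#include <cmath>

// Placeholder probe record: position plus time since last recapture.
struct Probe {
    float X, Y, Z;       // probe position in world space
    float SecondsStale;  // time since this probe was last recaptured
};

// Returns the indices of the K probes to recapture this frame.
// Priority rises with staleness and falls with distance to the player,
// so nearby stale probes are refreshed first.
std::vector<int> SelectProbesToUpdate(const std::vector<Probe>& Probes,
                                      float PlayerX, float PlayerY, float PlayerZ,
                                      int K)
{
    std::vector<int> Order(Probes.size());
    for (int i = 0; i < (int)Order.size(); ++i) Order[i] = i;

    auto Priority = [&](int i) {
        const Probe& P = Probes[i];
        float Dist = std::sqrt((P.X - PlayerX) * (P.X - PlayerX) +
                               (P.Y - PlayerY) * (P.Y - PlayerY) +
                               (P.Z - PlayerZ) * (P.Z - PlayerZ));
        return P.SecondsStale / (1.0f + Dist);  // staleness up, distance down
    };

    int Take = std::min(K, (int)Order.size());
    // Only the top K entries need to be sorted; the rest can stay unordered.
    std::partial_sort(Order.begin(), Order.begin() + Take, Order.end(),
                      [&](int a, int b) { return Priority(a) > Priority(b); });
    Order.resize(Take);
    return Order;
}
```

Each frame the caller would recapture the returned probes and reset their `SecondsStale` to zero; K directly controls the per-frame cost.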
I think an extra option wouldn’t hurt, and then it’d be up to everyone to decide which system to use (just like right now you can decide between lightmass, LPV, DFGI or nvidia’s VXGI)
It wasn’t really a dynamic effect, because the lighting itself was static. What the Light Environment system did was this: when you baked with Lightmass, it created small probes of captured light scattered in a 3D grid (only inside the LightmassImportanceVolume). The dynamic part was only that a moving object would pick the nearest probe, so its lighting always matched the surrounding environment.
But moving a light or recapturing those probes was never supported.
Also, I’m pretty sure UE4 uses the same system when Lightmass is used.
I would really love to see this in UE4.
I think for UE4, Nvidia VXGI looks very promising.
I foresee that any precomputed GI, light probe, or reflection probe technique will remain useful, but only for the next five or so years, and certainly in the meantime for the Xbox One and PS4.
However, for PC-level hardware (Nvidia Maxwell, Pascal and beyond), especially with VR, game engines will have to move to full realtime GI over 2015-2025. I think they will, since lighting is the key factor in the feeling of realism.
You can pretty much get realtime raytracing with Octane and a huge number of GPUs: https://www.fxguide.com/featured/octane-render-realtime-ray-tracing/
Within 10 years the appropriate tweaks will bring that quality to realtime gaming on consumer-level hardware.
To sum up, I predict UE4 will be the last “non-fully-realtime-GI” Unreal Engine
UE4 Lightmass already produces (for a “gamer” like me) jaw-dropping GI (albeit not fully realtime GI etc): http://www.ronenbekerman.com/unreal-engine-4-and-archviz-by-koola/
UE4/5 with pure-fully-realtime-GI …Well, we can say bye-bye to the physical world. Just in the next 10 years. Very scary.
PS Don’t forget the advancements in CPU and GPU physics. As many have noted, it’s not so much the polygon count and so on but physics and lighting that “suspends reality”. Microsoft’s “cloud physics” indicates how this will be easily done on consumer-level CPUs and GPUs by 2025: Microsoft Cloud Gaming Prototype (Build 2014 Xbox One/PC) - YouTube
PPS Eventually “voxel point cloud data” game engines will be out too, and that will add to the hyper-reality of life in 2025.
Enjoy reality while you can, reality is going bye-bye
UE4 is growing massively; maybe in the near future we will get realtime GI.
Realtime GI, be it distance-field based, LPV, or any other solution, will be great, but meanwhile improving the workflow around reflection captures would help in many cases.
I think people sometimes underestimate what would pass for acceptable faked GI. Many games would probably even pick fake GI over the real thing for performance reasons.
Reflection captures are already there, and lighting the diffuse from captures seems to be partially working or a work in progress (if I remember correctly, there is a console variable to turn on diffuse from captures). Reflection captures are rather good, and probably easy to upgrade into a fake GI system.
It was already mentioned, but it would help if captures were extended with:
- diffuse lighting, with separate intensity sliders for specular and diffuse
- some automated system for updating the captures over a designer-defined time, propagating outward from the area around the player’s character (the time could range from seconds even to minutes)
Environment probes look very good.
Something is better than nothing.
I think IBL from capture spheres/boxes used for dynamic GI is a great alternative that we really need in UE4. Just look how good it looks in Snowdrop, for example, and it would be a lot easier to implement compared to many other techniques. Then we would have something to work with until the new cone tracing comes back, which could sadly take years.
I too am all in favor of IBL probes. The approach is very flexible and adjustable, and it’s also very good in terms of performance. As mentioned, interpolating between prerendered cubemaps can make a day/night cycle possible. Fully realtime faux-ray-traced solutions are cool and all, but I’ve yet to see one that doesn’t affect performance significantly. Lightmass is really, really awesome at what it does, but it certainly imposes limitations (rigid modularity, no option for real-time iteration on lighting even if the final result is to be baked, RAM issues with big open spaces, rendering times if you lack a render farm, etc.). There is a reason so many engines now favor dynamic approaches.
I absolutely love UE4 (I can’t think of a better engine), but one thing I miss is dynamic GI.
Yes, three months ago you hadn’t seen actual gameplay footage.
I’d love to see this as an option: things like VR games benefit a lot from improved lighting, but can’t handle the performance hit from VXGI or the like.
That’s only for Translucent surfaces; the others should blend between probes.
VXGI struggles to hit 60 fps on my 980, let alone in VR. It’s good but still needs work; it’ll be a long time before it’s really usable in games, I think.
Mind you, the Xbox One is capable of realtime GI as well in some cases. In the new Gears 4 trailer, there’s a particular bit where the protagonist’s flashlight shines on a red cloth and it illuminates the area around it. Looks great.
Is anyone working on a solution for this? I don’t think there are any existing methods in UE4 that work. I know the devs working on ARK: Survival Evolved have a working solution. I am also working with a team on an open world game that really needs this. Our biggest problem is cave lighting and other interiors.