Question about LPV

I take it no Mantle plans then? Guess I don’t blame you, a one-platform, one-GPU-only rewrite wouldn’t be the most time-efficient thing in the world; though you are using Metal for the newer iPhones, yes?

As for distance field AO, it looks surprisingly nice! Some directionality would be greatly appreciated, but I’m not sure how you’d do that.

It’s not like you need to rewrite everything. There is an abstraction layer between the API and the engine.
The rendering engine doesn’t talk to the API directly, but through an abstraction interface, which makes it easier to add support for new APIs as they come.
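
Roughly speaking (this is just a hypothetical sketch with made-up names, not UE4’s actual RHI code), the shape of such a layer looks like this:

```cpp
// Hypothetical sketch of an engine <-> API abstraction layer (made-up names, not UE4's actual RHI).
// The high-level renderer only ever talks to IRenderDevice; each graphics API gets its own backend.
#include <cstdint>

struct BufferDesc { uint32_t sizeBytes; bool dynamic; };

class IRenderDevice {
public:
    virtual ~IRenderDevice() = default;
    virtual void* CreateVertexBuffer(const BufferDesc& desc) = 0;
    virtual void  Draw(uint32_t vertexCount, uint32_t firstVertex) = 0;
    virtual void  Present() = 0;
};

// Supporting Metal (or DX12, Vulkan, ...) means writing another backend,
// not rewriting the renderer that sits on top of the interface.
class MetalDevice final : public IRenderDevice {
public:
    void* CreateVertexBuffer(const BufferDesc& desc) override { /* create an MTLBuffer here */ return nullptr; }
    void  Draw(uint32_t vertexCount, uint32_t firstVertex) override { /* encode the draw call */ }
    void  Present() override { /* present the drawable */ }
};
```

Every backend implements the same interface, so the high-level renderer never needs to know which API it’s actually running on.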

As for Metal: well, you need to compare how many AMD GPUs there are versus how many iPhones/iPads ;).

At this point I’m more curious how quickly we can get support for DirectX 12 (I mean after it’s released), and more advanced optimizations for OpenGL 4.4+.

Or OpenGL 5, when it eventually comes, and hopefully solves OpenGL’s problems (an ancient single-threaded API, zero standardization across drivers).

@Xordoc

The problem is not adding functionality to rendering, the problem is maintaining it.
You can add a super awesome dynamic GI solution today, but it might be broken in the next release of the engine, because some underlying rendering change and/or new functionality will either break your solution or make it not look as good.

I would love for the community to tackle integrating some of the technology we can read about in papers; the deep G-buffers solution would be nice to complement some larger-scale GI, just like SSAO complements DFAO.

Unfortunately I can’t be much help; I can read and find papers and even understand something here and there, but implementing it in the engine is far beyond what I can do :D.

Thing is, LPV is relatively cheap, and I’ve yet to hear people say Crytek releases ugly games with bad lighting. Lightmass, as it is right now, is simply too slow from a development perspective with masses of terrain. Not only that, outside of LMIVs the results aren’t great either. That’s not to say that lightmapping or precomputed GI (usually radiosity) using directional light ray tracing isn’t better, because it is (much better).

Any octree-based spherical GI is too expensive to run; radiosity has its pros and cons.

I’d be very interested in an additional complete IBL solution like Marmoset for Unity; it would just rock in UE4. It’ll be intriguing to see how well Enlighten does in Unity; I’ve still got a couple of subs active until Jan 2015.

There are still better options for dynamic GI than LPV, and for something like what Crytek does, it’s more about the high-quality assets than what the lighting is doing, because the lighting is pretty limited in capability compared to what people actually want to do with it.

I’m watching this solution to see what becomes of it: http://forum.unity3d.com/threads/coming-soon-spectragi-advanced-real-time-global-illumination-for-unity-from-livenda.247427/

And hopefully Nvidia GI will turn out to be good.

Actually I’d put CE’s graphical prowess down to its shaders and a boatload of tasty post plus extras like RLR, helped by its double-bounce LPV spherical lattice cascaded GI system and, as a cherry on top, IBL ;). But I agree LPV isn’t amazing, to be honest. I know radiosity works very well with ray tracing because I’ve put it in an engine.

As for artwork, the mobile demo (UE4) looks amazing and it’s nothing more than simple shapes (bar the statues), good (but basic) shaders, and great lighting from Lightmass. I might be old-school here but I still consider lighting and shaders tech as opposed to artwork…

That Livenda stuff is confusing; I thought it was fake at the beginning, but now it seems to be taking shape. Which is awesome, but there are still no real details on what it’s all about.

I thought the Nvidia GI work was DX11-dependent?

Which is why Epic should try again, but with 3D textures and a novel compression scheme to save memory and parallelise sample access at the same time. Not having to jump around all over to get samples is much faster. :cool:
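
To illustrate the “no jumping around” point (my own rough sketch, not anything Epic has committed to): with samples packed into a dense 3D grid, which is what a 3D texture stores, a GI lookup is just index arithmetic plus a trilinear blend of 8 neighbours, instead of a pointer chase through an octree.

```cpp
// Rough sketch: GI samples stored as a dense 3D grid, which is what a 3D texture gives you on the GPU.
// A lookup is just index arithmetic plus a trilinear blend of 8 neighbours -- no tree traversal.
#include <algorithm>
#include <vector>

struct Color { float r, g, b; };

struct ProbeGrid {
    int nx, ny, nz;
    std::vector<Color> cells;   // nx * ny * nz entries, contiguous in memory

    const Color& at(int x, int y, int z) const {
        return cells[(z * ny + y) * nx + x];
    }

    // Trilinear sample at normalized coordinates in [0,1]^3 --
    // the GPU's texture filtering does this part for free.
    Color sample(float u, float v, float w) const {
        float fx = u * (nx - 1), fy = v * (ny - 1), fz = w * (nz - 1);
        int x0 = (int)fx, y0 = (int)fy, z0 = (int)fz;
        int x1 = std::min(x0 + 1, nx - 1), y1 = std::min(y0 + 1, ny - 1), z1 = std::min(z0 + 1, nz - 1);
        float tx = fx - x0, ty = fy - y0, tz = fz - z0;
        auto lerp = [](const Color& a, const Color& b, float t) {
            return Color{ a.r + (b.r - a.r) * t,
                          a.g + (b.g - a.g) * t,
                          a.b + (b.b - a.b) * t };
        };
        Color c00 = lerp(at(x0, y0, z0), at(x1, y0, z0), tx);
        Color c10 = lerp(at(x0, y1, z0), at(x1, y1, z0), tx);
        Color c01 = lerp(at(x0, y0, z1), at(x1, y0, z1), tx);
        Color c11 = lerp(at(x0, y1, z1), at(x1, y1, z1), tx);
        return lerp(lerp(c00, c10, ty), lerp(c01, c11, ty), tz);
    }
};
```

Neighbouring samples sit next to each other in memory, so lots of lookups can run in parallel without divergent traversal, which is the whole appeal over an octree.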

Radiosity also has its… well, its cons are “light leaking unless you have an insanely expensive solution”. So I don’t see it.

A realtime relightable environment would be the zeitgeist of the moment. You can go and buy Enlighten if you can afford it, which can do something similar. Or Epic could go through and massively re-engineer its current offline setup. I’m imagining a hierarchy of cubemaps: if a color value of, say, (0,0,0) made that part of the cubemap default to the next “level up” in the hierarchy, then you could picture something like this.

Local cubemaps of, say, a 10 x 10 meter room, with the windows clearing out to a cubemap of a 30 x 30 meter area grabbing the general larger structures, which when cut off clears to a 1 km x 1 km area cubemap grabbing the tallest buildings/general area, then a 20 x 20 km area grabbing distant mountains, and finally the skybox cubemap. Asynchronously update the cubemaps, and you could have a slowly changing time of day/weather. Just a thought.
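
Just to make the fallback concrete, here’s a minimal sketch of how the lookup could read, assuming (0,0,0) is the “see through to the next cascade” sentinel described above (names and types are made up for illustration):

```cpp
// Sketch of the cascaded-cubemap fallback described above (all names are made up).
// Each cascade stores (0,0,0) wherever it "clears" through to the next level up.
#include <vector>

struct Color { float r, g, b; };

struct Cubemap {
    // Stand-in for a real cubemap lookup; here it just returns a flat color
    // so the fallback logic below is demonstrable.
    Color flat{0.0f, 0.0f, 0.0f};
    Color Sample(float /*dx*/, float /*dy*/, float /*dz*/) const { return flat; }
};

// Cascades ordered from most local (the 10 x 10 m room) out to the skybox.
Color SampleEnvironment(const std::vector<Cubemap>& cascades,
                        float dx, float dy, float dz)
{
    for (const Cubemap& cube : cascades) {
        Color c = cube.Sample(dx, dy, dz);
        // (0,0,0) is the "see through to the next cascade" sentinel.
        if (c.r > 0.0f || c.g > 0.0f || c.b > 0.0f)
            return c;
    }
    // Nothing closer claimed this direction: use the outermost cascade (the skybox).
    return cascades.back().Sample(dx, dy, dz);
}
```

Since each cascade would only refresh occasionally (and asynchronously), the per-frame cost stays low while time of day and weather drift slowly through the whole chain.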

Sounds cool. Last time I used Enlighten was 2012, so I’m reserving judgement on its latest iteration until I can see what’s going on in Unity’s house (the leaking mentioned above, etc.). The cubemap idea is neat, hmmm.

Anyway, I do think we need to come up with a solution for dynamic GI even if LPV isn’t the answer. And even if Epic isn’t the one doing the engineering, a community project would be cool. Plenty of tech demos out there to use as a base :)…