[Twitch] Subsurface Scattering and Ray Traced Soft Shadows Demos - Oct. 16, 2014

We recently had it on the task list, but something else popped up and moved it down a bit. We haven’t settled on a method yet. We know we want two shifted lobes of anisotropic specular shading, consistency with the other lighting models, energy-conserving properties, support for moving geometry, and some form of antialiasing. The right technique often depends on the hair type (blond -> translucency matters more, dark/thick -> masked is OK, flat -> opaque), the view distance (near -> tessellation, medium -> sheets, far -> masked), and the environment (fog or depth of field -> needs depth, many lights -> simplify lighting). Going for masked might mean we need a post process to blur the noise along the hair direction.
It’s likely we need multiple techniques combined to get the best results.
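For reference, the “two shifted lobes” description matches the widely used Kajiya-Kay/Scheuermann hair specular model: shift the hair tangent along the normal once per lobe, then raise the sine of the angle between the shifted tangent and the half vector to an exponent. Below is a minimal C++ sketch of that idea; it is not necessarily what UE4 would ship, and the shift values, exponents, and lobe weights are illustrative placeholders.

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

static float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3  add(const Vec3& a, const Vec3& b) { return {a.x+b.x, a.y+b.y, a.z+b.z}; }
static Vec3  mul(const Vec3& a, float s)       { return {a.x*s, a.y*s, a.z*s}; }
static Vec3  normalize(const Vec3& v)          { return mul(v, 1.0f / std::sqrt(dot(v, v))); }

// One lobe: shift the hair tangent T along the normal N, then evaluate
// an anisotropic specular term against the half vector H.
static float ShiftedLobe(Vec3 T, Vec3 N, Vec3 H, float shift, float exponent)
{
    Vec3  Ts    = normalize(add(T, mul(N, shift))); // shifted tangent
    float TdotH = dot(Ts, H);
    float sinTH = std::sqrt(std::max(0.0f, 1.0f - TdotH * TdotH));
    return std::pow(sinTH, exponent);
}

// Two shifted lobes: a sharp primary highlight and a broader secondary
// highlight shifted the other way (usually tinted by the hair color).
static float HairSpecular(Vec3 T, Vec3 N, Vec3 H)
{
    float primary   = ShiftedLobe(T, N, H, -0.10f, 120.0f);
    float secondary = ShiftedLobe(T, N, H,  0.15f,  40.0f);
    return primary + 0.5f * secondary; // lobe weights are illustrative
}
```

In practice the shifts and exponents are usually driven by per-strand textures so the highlights break up along the hair.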

I will look at it soon - it might be simple. Keep in mind that VR wants high resolution and a high, stable frame rate. Especially if you get near a face, you might have to pay a higher cost (texture lookups touch more memory, which costs more memory bandwidth).

The backscattering is not yet implemented, but that is on the near-term task list. We already have similar code in the Subsurface shading model, but instead of a large fixed filter kernel we could use a randomized kernel for better performance. The screen-space subsurface scattering can blur the noise a bit without any extra cost.
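A minimal CPU-side sketch of that randomized-kernel idea: jitter a handful of disc samples per pixel, weight them by a diffusion-style profile, and reject samples across depth discontinuities. The profile shape, sample count, and depth threshold here are assumptions for illustration, not UE4’s actual shader.

```cpp
#include <algorithm>
#include <cmath>

struct Pixel { float r, g, b, depth; };

// Gaussian-style falloff standing in for a measured diffusion profile.
static float Profile(float dist, float radius)
{
    return std::exp(-(dist * dist) / (radius * radius));
}

static Pixel SubsurfaceBlur(const Pixel* img, int w, int h,
                            int x, int y, float radiusPx, unsigned seed)
{
    const int   kSamples  = 8;     // few samples; the jitter hides the undersampling
    const float kDepthTol = 0.01f; // reject samples across depth discontinuities
    Pixel center = img[y * w + x];
    float sumR = 0, sumG = 0, sumB = 0, sumW = 0;

    unsigned state = seed ^ (unsigned)(y * w + x); // per-pixel jitter seed
    auto rnd = [&state]() {                        // tiny LCG in [0,1)
        state = state * 1664525u + 1013904223u;
        return (state >> 8) * (1.0f / 16777216.0f);
    };

    for (int i = 0; i < kSamples; ++i)
    {
        float angle = 6.2831853f * rnd();
        float dist  = radiusPx * std::sqrt(rnd()); // uniform over the disc
        int sx = std::clamp(x + (int)(std::cos(angle) * dist), 0, w - 1);
        int sy = std::clamp(y + (int)(std::sin(angle) * dist), 0, h - 1);
        Pixel s = img[sy * w + sx];
        if (std::fabs(s.depth - center.depth) > kDepthTol)
            continue; // don't blur across silhouettes
        float wgt = Profile(dist, radiusPx);
        sumR += s.r * wgt; sumG += s.g * wgt; sumB += s.b * wgt; sumW += wgt;
    }
    if (sumW <= 0.0f) return center;
    return { sumR / sumW, sumG / sumW, sumB / sumW, center.depth };
}
```

Because the jitter pattern varies per pixel (and could vary per frame), the residual noise averages out under TemporalAA, which is why a small randomized kernel can stand in for a large fixed one.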

I want to finish the forward rendering code for UE4.

I consider lighting mostly solved (quality, performance, large light counts), and texture streaming can also be considered solved (minimal streaming hitches, quality near and far, large texture sizes, sparse textures).
Still, we can improve further in both areas.

Antialiasing is much less of a topic because we have very good TemporalAA, which works even better at high frame rates. Still, there are some problems, and I feel that using hardware MSAA with some forward rendering might become a better option in some cases (VR).

I would like to get shadows solved (fast, scales well with geometry complexity and light count, penumbra, translucency). I did some experiments (http://kosmokleaner.wordpress.com)
but I haven’t tackled the penumbras yet (many rays is too costly).
Daniel’s recent work in UE4 does a very good job on that. If we get skeletal meshes working and make it a bit faster, we might have it.

I would like to get level of detail solved (fast, simple, works on many mesh types, skeletal meshes, transitions, unique texturing, memory efficient). I have some experiments, but a few more problems need to be solved.

, thank you :slight_smile:

I figure even if it’s spendy it will be extremely worthwhile for VR - I don’t mind cutting triangles, complex shaders, etc etc in order to get nice skin.

Thanks for the answer!

I have one more question, for anyone able to answer it.

Any plans on improving Atmospheric Fog? It’s all nice, but there are a few things that bother me, from most important to least:

  1. Sun. Let’s just be honest about it. It looks like **** (;. No matter how high you set the brightness, it still looks like a flat disk that is just brighter toward the outside. It looks like it is missing HDR.
  2. Night. I mean separate settings for night with a moon. I managed to exploit some settings to get a dark sky with a bright moon, but that just feels like an ugly hack.

Thanks for a great stream folks! Given my background as a lighting artist, it was especially interesting to hear what you’re doing with soft shadowing. Very cool.

However, I was upset by 's admission of stereo (thus VR) being a second-class citizen. I’m sure that’s a practical truth, as there are many fewer VR folks than 2D folks in the mix. But I can’t help but pipe up that the only reason I joined the UE4 subscription is for its support of VR. For the first time since April, I’ve cancelled my subscription. I don’t say that to be a petty child, though it sounds that way. I say it because it is the practical thing for me to do.

I was the first person to report that the new SSS wasn’t working in VR. I posted it in forums and in AnswerHub probably 2 days after the 4.5 Preview was released (2 weeks ago?), so it was annoying to hear that didn’t even know it to be a problem.

Given that VR isn’t going away, but will become a larger slice of the pie in upcoming years, I would ask that some testing time be applied to making sure new features work for it. As he said, it may be a simple oversight. So why not test and fix said oversight before shipping? Especially since someone (passive-aggressive-me;) took the time to report the problem 2 weeks ago.

Also, 's quote earlier in this thread: “I don’t think we will spend time on that, dynamic lighting methods are not a good fit for VR where you need super high resolution + 90fps.” This was regarding DFAO, but I’m not clear on whether it also applies to SSS in VR. I would ask that you let the users decide if a feature is too expensive for VR. If it is, we won’t use it. But if we are demoing on purpose-built hardware (say, a beefy machine for a conference), we may have very good reason to use expensive features in VR. Please don’t make such assumptions for us, unless it is completely untenable.

Sorry for the rant. I know you guys are very busy, and have a LOT to take care of. I just had to get that off my chest, and make it clear that this was quite disappointing for this particular user. And , I’m sorry for singling you out. I don’t want to come off as attacking you. That’s not my goal, I just have my own biased concerns.

Andrew Weidenhammer (aka - 3dLight)

Bloom in the directional light’s properties overcomes that problem, but then there is that weird flickering issue with bloom. We either need a better bloom solution or something else in the atmospheric fog itself.

Hi there.

I saw in the stream that there is the distance field debug view to visualize the distance fields. Is there a way we can write a shader to volumetrically trace through the distance field to render volumetric clouds in-game? The visualisation looked pretty fast, so I’m just wondering if it’s possible. I’m going to have a look and see if I can find the relevant code in the source and pick it apart.

Distance fields provide the signed distance to nearest occluder at any point in space and this allows us to efficiently solve occlusion cone traces from any point in the scene to any other point. This is super powerful and allows all kinds of dynamic lighting techniques previously impossible.
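A common way to turn that property into a cheap occlusion estimate is sphere tracing with a cone approximation (popularized by Iñigo Quilez): step along the ray by the distance field value, and track the minimum ratio of scene distance to distance traveled. A single trace then yields a soft visibility term instead of a binary hit. A hedged C++ sketch, where SceneSDF is an assumed placeholder for sampling the global distance field (this is not UE4’s exact code):

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };
static Vec3  sub(Vec3 a, Vec3 b)  { return {a.x-b.x, a.y-b.y, a.z-b.z}; }
static Vec3  add(Vec3 a, Vec3 b)  { return {a.x+b.x, a.y+b.y, a.z+b.z}; }
static Vec3  mul(Vec3 a, float s) { return {a.x*s, a.y*s, a.z*s}; }
static float length(Vec3 v)       { return std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z); }

// Assumed interface: signed distance from p to the nearest occluder,
// e.g. sampled from the global distance field volume.
float SceneSDF(Vec3 p);

// Approximate visibility of a cone from 'from' to 'to'.
// k relates to the cone angle: larger k = narrower cone = harder shadow.
float ConeTraceOcclusion(Vec3 from, Vec3 to, float k)
{
    Vec3  delta   = sub(to, from);
    float maxDist = length(delta);
    Vec3  dir     = mul(delta, 1.0f / maxDist);

    float visibility = 1.0f;
    float t = 0.02f; // start bias to avoid self-occlusion at the surface
    while (t < maxDist)
    {
        float d = SceneSDF(add(from, mul(dir, t)));
        if (d < 1e-4f)
            return 0.0f;                              // hit an occluder
        visibility = std::min(visibility, k * d / t); // cone coverage estimate
        t += d;                                       // sphere tracing: skip empty space
    }
    return visibility; // 0 = fully occluded, 1 = fully visible
}
```

The min(k * d / t) term estimates how much of the cone’s cross-section the nearest occluder covers, which is what produces penumbra-like falloff from a single trace rather than from many rays.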

I don’t think we have any plans to work on that atm, but thanks for the feedback and for letting us know what needs improving.

The reason it is second class is just due to time constraints and the relative importance vs other UE4 use cases. There are effectively infinite high priority tasks that we have to do. But even so, I think our VR support is pretty good and it is really only new features that lag in VR support.

It is a new feature, and just the work to get it out the door, even without VR testing, is immense.

A not-yet-implemented feature isn’t an oversight or even a bug; it’s just a not-yet-implemented feature. VR support for every rendering technique in the engine is itself a feature, and it requires special work to get them to cooperate.

Distance fields represent the distance to the nearest surface, so they can’t be used to represent volumetric primitives that have varying density throughout (although they could be used to represent the bounds of the volumetric data, to accelerate rays traveling toward the volume, just not rays inside it). For volumetric data, standard volume ray marching techniques are needed, and they cost a lot more than tracing through a distance field, where you can skip empty space.
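To make the contrast concrete, here is a hedged C++ sketch of the hybrid described above: sphere trace against an (assumed) distance field of the volume’s bounds to skip empty space, then switch to fixed-step ray marching with a density lookup once inside. BoundsSDF and Density are placeholders, and the step size and extinction coefficient are illustrative.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };
static Vec3 add(Vec3 a, Vec3 b)  { return {a.x+b.x, a.y+b.y, a.z+b.z}; }
static Vec3 mul(Vec3 a, float s) { return {a.x*s, a.y*s, a.z*s}; }

float BoundsSDF(Vec3 p); // assumed: distance to the cloud volume's boundary
float Density(Vec3 p);   // assumed: density sampled inside the volume

// Returns transmittance along the ray (1 = clear air, 0 = fully opaque).
float MarchVolume(Vec3 origin, Vec3 dir, float maxDist)
{
    const float stepSize = 0.1f; // fixed step once inside the volume
    const float sigma    = 2.0f; // extinction coefficient (illustrative)
    float t = 0.0f;
    float transmittance = 1.0f;
    while (t < maxDist && transmittance > 0.01f)
    {
        Vec3  p = add(origin, mul(dir, t));
        float d = BoundsSDF(p);
        if (d > stepSize)
        {
            t += d; // outside the volume: sphere trace, skip empty space
            continue;
        }
        // Inside (or near) the volume: uniform steps, accumulate extinction.
        transmittance *= std::exp(-sigma * Density(p) * stepSize);
        t += stepSize;
    }
    return transmittance;
}
```

The expensive part is the uniform stepping inside the volume; the distance field only helps get the ray to the volume quickly.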

It takes about 5 ms on a fast GPU at half resolution. It is barely optimized, so its execution time could easily be quartered. There could definitely be some interesting game visuals from leveraging these distance fields.

Search for VisualizeMeshDistanceFieldCS.

Thanks for answering. Fair enough about VR not being first on your list, but I suppose my point was that I feel any new rendering feature should include VR support before it is released. If that is not how your team wishes to treat VR support (which is fine; I understand there is great value in releasing to most users early), then you should properly warn customers about it. I think quite a few folks have jumped into UE4 because it DOES have great VR support. There’s no denying that, and I thank you for it! But if a new feature isn’t tested in VR, or is shown not to work there, I think folks would appreciate a warning so they aren’t subscribing for heartbreak;) Am I complaining that I wasted 20 bucks? Personally, no. I think the upgrade is huge and more than worth it, but I can’t help thinking that going forward it would be nice to have more communication about VR support for new features.

Andrew

Out of curiosity: are distance fields stored in GPU memory or in main (system) memory?

Page 153. Using SDF for diffuse GI. It’s a bit old, but might be worth reading.

Edit:

Paper about optimizing sphere tracing performance.