You mean the negative attitude that formed after being ignored for over a year, across n threads and 17 pages?
Anyway, I’m glad to know that we finally have confirmation from Epic saying “Can’t promise anything”. So now we know they’re well aware of it, and I can stop bumping this every now and then.
Kalle-H’s contributions are going towards his/her specific needs at the moment, so they are not a broad solution for every issue derived from the main problem. I do agree the contributions are most welcome (better than nothing), but what these contributions tell me is that things can be improved (if not fixed) when someone is engaged with this. Epic’s acknowledgement, even a late one, is also better than no acknowledgement at all after all the waiting we have given.
We all want to deliver state-of-the-art content, and this depends not only on the creativity, effort, and talent we have at our disposal; we also need a matching effort from Epic. I’m really not going to follow this thread anymore, because my day goes a lot more smoothly without being reminded every time how bad the situation is.
I don’t see a lot of overreaction. It would be overreacting if we went crazy after one month without a reply from Epic. But after over a year, I think it’s absolutely reasonable that we get sour, especially because quite a few of us are heavily affected by this problem, to the extent that we cannot showcase our work.
Still, I don’t think anyone really wants to burn Epic at the stake. All we want is a statement that acknowledges why they ignored us for this long and a promise that they will be working on fixing this in the near future. I think that’s only fair.
And think about it this way, you want to have a good relationship with your community, especially if you’re Epic and you rely heavily on it. I personally might not be an engine contributor, but I still want to help the community and offer free blueprints and assets. And, you know, encourage more people to use this engine over Unity or Lumberyard. However, stuff like this makes me want to not come back here anymore.
I don’t want to comment on the one-year wait or the community management on these forums anymore; I think everyone is getting tired of it by now. I just want to apologise to @Krzysztof.N for doubting the authenticity of his account. I hope you understand that without the badge there was no way of knowing.
Yeah, let’s stay professional here. I’m very happy with Krzysztof’s feedback on this issue, and I’m confident we’ll get an update on this. We really don’t need personal attacks.
Oh come on, I just praised Kalle-H, I didn’t say Krzysztof is a bad person, don’t be so sensitive. Didn’t you read my post where I welcomed him to the community?
But, if that was hurtful towards him, I apologize.
#if (!MODULATED_SHADOWS) || (FEATURE_LEVEL >= FEATURE_LEVEL_SM4 && !FORWARD_SHADING && !APPLY_TRANSLUCENCY_SHADOWS)
	FGBufferData GBufferData = GetGBufferData(ScreenUV);
#endif

#if !MODULATED_SHADOWS
	#if USE_PCSS
		#if SPOT_LIGHT_PCSS
			float Attenuation = GetLightInfluenceMask(WorldPosition) * saturate(dot(GBufferData.WorldNormal, DeferredLightUniforms.LightPosition - WorldPosition));
		#else
			float Attenuation = saturate(dot(GBufferData.WorldNormal, DeferredLightUniforms.NormalizedLightDirection));
		#endif
	#else
		// Both spot and directional lights use the same shadowing code. Select the proper direction. No need to normalize.
		half3 Dir = DeferredLightUniforms.LightInvRadius > 0 ? (DeferredLightUniforms.LightPosition - WorldPosition) : DeferredLightUniforms.NormalizedLightDirection;
		float Attenuation = GetLightInfluenceMask(WorldPosition) * saturate(dot(GBufferData.WorldNormal, Dir));
	#endif

	BRANCH
	if (Attenuation > 0)
	{
		// Shadow sampling code.
	}
#endif // !MODULATED_SHADOWS
Just tested some early-outs with shadow sampling. With 4 stationary spot lights these early-outs saved about 5.5 ms on my laptop. I can’t really make a pull request because it’s not compatible with PCSS_SHARE_PER_PIXEL_QUAD used in ShadowPercentageCloserFiltering: you can’t have divergent branching when using explicit derivatives inside.
The normal-pointing-towards-light test and the pixel-inside-spotlight-volume test are equally beneficial in my test scene.
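One way around the derivative restriction (a sketch of the general HLSL pattern, not what the engine does): compute the screen-space gradients unconditionally before entering the divergent branch, then sample with explicit gradients inside it. ShadowTexture, ShadowSampler, and ShadowUV are illustrative names here, not the engine’s.

```hlsl
// ddx/ddy results are undefined inside divergent control flow, so hoist
// them out: compute the UV gradients before the branch and pass them in.
float2 UVDDX = ddx(ShadowUV);
float2 UVDDY = ddy(ShadowUV);

BRANCH
if (Attenuation > 0)
{
	// SampleGrad takes the explicit gradients, which is legal even
	// under divergent branching, unlike plain Sample.
	Shadow = ShadowTexture.SampleGrad(ShadowSampler, ShadowUV, UVDDX, UVDDY).r;
}
```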
The normal-pointing-towards-light test also combines with the ShadingModel unlit test from this PR: https://github.com/EpicGames/UnrealEngine/pull/4441 (unlit pixels do not need a normal, and it’s defined as (0,0,0)).
But without that optimization it might be beneficial to explicitly test whether the pixel’s shading model is Unlit.
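A minimal sketch of such an explicit early-out, assuming the engine’s SHADINGMODELID_UNLIT constant and EncodeLightAttenuation helper; where exactly the check lands in the shadow projection shader, and what the unshadowed output should be there, are assumptions on my part:

```hlsl
// Hypothetical early-out: unlit pixels receive no dynamic shadows,
// so skip the expensive shadow sampling for them entirely.
FGBufferData GBufferData = GetGBufferData(ScreenUV);
BRANCH
if (GBufferData.ShadingModelID == SHADINGMODELID_UNLIT)
{
	// Write fully unshadowed attenuation and return before sampling.
	OutColor = EncodeLightAttenuation(half4(1, 1, 1, 1));
	return;
}
```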
Also, for some reason subsurface shadows are calculated for all subsurface shading models, but not all of them use subsurface shadows. I am not sure about the others, but I am sure that MATERIAL_SHADINGMODEL_SUBSURFACE_PROFILE is not using them. For cinematics these pixels might cover a large screen area.
GeForce GTX 960M. It’s no surprise that UNROLL is faster on that kind of loop. I have never encountered a simple, non-nested loop that would be slower with unrolling. Sometimes the benefits are not worth the additional code size, but in this case it’s a quite clear win. I have to test this with my desktop GPU as well when I get to the office.
My directional light has the default angle (1 degree), and I have tuned r.Shadow.MaxSoftKernelSize=18. When the kernel size gets too big, cache misses start to dominate the performance cost.
It is not the fact that unrolled loops are faster that surprises me; it is the halving of render time that is uncommon. I don’t recall seeing a measurable gain beyond 20% in recent years, even with fetch-heavy loops. I’m not sure why the loop was not unrolled by the compiler either, to be fair.
But yeah, considering that UE4 uses quality presets that define the loop iteration count at compile time, there is absolutely no reason not to unroll. As for code size, I think the inflation of shader size and compile time is negligible compared to the speed gains in shadow filtering, so it can hardly be regarded as a downside.
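Since the iteration count is a compile-time constant under the quality presets, forcing the unroll is a one-attribute change. A minimal sketch using the engine’s UNROLL macro; PCF_SAMPLE_COUNT, SampleOffsets, and PCFSample are illustrative names, not the engine’s:

```hlsl
// PCF_SAMPLE_COUNT is a compile-time constant chosen by the quality preset,
// so the compiler can fully unroll the loop and schedule fetches ahead.
#define PCF_SAMPLE_COUNT 32

float Shadow = 0;
UNROLL
for (int i = 0; i < PCF_SAMPLE_COUNT; ++i)
{
	// SampleOffsets[] and PCFSample() stand in for the real filter kernel.
	Shadow += PCFSample(ShadowDepthTexture, ShadowUV + SampleOffsets[i], SceneDepth);
}
Shadow /= PCF_SAMPLE_COUNT;
```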
There is diff between unroll and not. https://www.diffchecker.com/43fcZ35Z
I use 32 samples for both the search and PCF loops. The shader is quite big (1662 assembly lines), but performance is quite good: it’s just 2.2 ms slower than non-soft shadows.