Yes, for now you will have to rely on the shadows that are output from the function for that. You can plug in the light vector from your sun using a blueprint or a Material Parameter Collection. There are obvious limitations though, such as other shadows double-shadowing, and all lights will also be affected.
Tried it for a bit, seems pretty good. Too bad the terrain itself won't be able to cast shadows on stuff. But I guess it's better than nothing. Unfortunately, the whole shader seems a bit too expensive considering it is supposed to run on the PS4. Will have to see if it's worth the performance hit (and how big that hit is in the first place).
Does someone have an example of the nodes to plug into Render Shadows (Occlusion Mapping), Light Vector, Shadow Steps, and Shadow Penumbra? I am in doubt about what to put into these inputs.
There is one thing I have a question about, though: I know we switched to using shared texture samplers not too long ago, and I was wondering if the custom code in the parallax occlusion can have a shared option, or use a shared sampler by default. I have a massive material on the marketplace that requires a lot of textures for a variety of different purposes, and I really would like to take advantage of it.
Render Shadows - Bool. If true, shadows will be output. If false, I believe the shadow output will return a constant value of 1.
Light Vector - Constant3Vector. The direction of the light, normalized. 0,0,1 will have the light coming from above. 1,0,0 will have it coming from the side. -1,0,0 switches sides.
Shadow Steps - Scalar Constant. Number of steps to consider for shadow calculations. It doesn't need to be very high, but the higher the value, the cleaner the shadow, and the more expensive.
Shadow Penumbra - Scalar Constant. 1 is a sharp shadow. Higher is sharper. Lower is softer. 0 is completely soft.
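For reference, a minimal set of starting values for those inputs might look like this (illustrative values only; in the material you would drive them with Constant and Constant3Vector nodes or parameters, and tune per scene):

```hlsl
// Illustrative starting values for the POM shadow inputs described above.
static const bool   RenderShadows  = true;                          // enable the shadow output
static const float3 LightVector    = normalize(float3(0.3, 0, 1));  // light mostly from above, slightly to the side
static const float  ShadowSteps    = 8;                             // more steps = cleaner but more expensive shadows
static const float  ShadowPenumbra = 1;                             // 1 = sharp shadow; lower values soften it
```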
Good notes. I'd only add that in most cases you should specify the LightVector using a Material Parameter Collection. Then just make a blueprint that sets the MPC value based on the "forward vector" of the sun light. For point lights you instead need to specify the light's location and then do worldposition - lightposition in the material, normalize that, and plug it into the LightVector input. Obviously handling multiple lights will not work so well and is one of the limitations of this method. One day pixel depth offset will get you "free" dynamic shadows from POM. That is probably several months out, though.
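For the point-light case described above, the per-pixel LightVector could be built in a small custom node roughly like this (a sketch; LightPosition is assumed to come from the Material Parameter Collection and WorldPosition from the Absolute World Position node, both wired in as custom node inputs):

```hlsl
// Sketch: per-pixel light vector for a point light, following the description above.
// The custom node's output type would be float3.
float3 ToPixel = WorldPosition - LightPosition;  // worldposition - lightposition, as described
return normalize(ToPixel);                       // normalize and plug into the POM LightVector input
```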
Regarding shared texture samplers, I have not had time to look into that properly. I want to think that it should be using Shared: Wrap if the texture is set that way in the texture properties, but I would not be surprised to learn that the code as it is does not support that yet. I will try and dig deeper into that soon.
Is it possible to get this working with decals? I want to make a bullet hole that appears to have depth, but the pixel depth offset input is disabled when I set the material domain to Deferred Decal. Is there any way around this?
So is there going to be a content example or a sample project? As much as I love UE4 and its model,
I hate with a passion how things are done halfway.
We have threads and posts asking how to set up POM. I tried on my own and couldn't set it up.
You would think that a sample project would release with each of these new features (HFGI, DFGI, POM, Mesh Distance Field Materials).
As a guy who works 50 hours a week while being a full-time student, I don't have much time, just a few minutes to spend on Unreal when I get a free hour (till December).
That one hour (till I graduate in Dec) shouldn't be spent figuring out WTF is going on.
It should be plug and play at the very least.
I appreciate the Kite Demo and the other demos, but my goodness, you guys need to start creating sample projects for each main rendering feature. It would cut down on the 20% of posts that are simply asking how to set up a rendering feature. It would also help with troubleshooting.
I appreciate you, but a sample project with one or two materials would go a long way.
Right now I am unable to use it because of extreme deformation at glancing angles close to the camera (pressing your face against a wall and looking down it). Any way to mitigate this effect?
Also, did the manual texture size vs. 1 / manual texture size change make it into the final 4.9?
You can mitigate that effect by using fresnel to dampen the height of the parallax. That technique is called "horizon flattening" and it is a common way of reducing the artifacts, or reducing the number of steps required for things to look good at glancing angles.
You can simply use abs(dot(CameraVector, VertexNormal)) as the lerp alpha for the height value. Use a scalar parameter "Height" as B and a scalar "FlattenedHeight" as A. You can use a Power node on the dot product to tweak how much of a glancing angle is affected (see the sketch below).
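Roughly, the blend described above might be sketched like this (illustrative names only, not the actual material graph):

```hlsl
// Horizon flattening sketch: blend toward a flattened height at glancing angles.
// CameraVectorWS and VertexNormalWS stand in for the CameraVector and VertexNormalWS nodes;
// Height, FlattenedHeight, and FlattenExponent are illustrative scalar parameters.
float Facing     = abs(dot(CameraVectorWS, VertexNormalWS)); // ~1 head-on, ~0 at grazing angles
float Weight     = pow(Facing, FlattenExponent);             // Power node: how quickly flattening kicks in
float UsedHeight = lerp(FlattenedHeight, Height, Weight);    // A = FlattenedHeight, B = Height
return UsedHeight;                                           // feed into the POM height input
```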
Also, there is an alternate version of the trace vector calculation that gives better results at glancing angles, but when I was using it, it had another artifact where it didn't reach into deep corners at glancing angles. Basically that method was just using the raw tangent-space camera vector divided by the step count and scaled along Z (see the rough sketch below). I may try to make that another option at some point. The thing that method did better was that it never traced off to infinity at a completely glancing angle like the current version does (essentially the Z offset is the same for all angles).
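The difference between the two formulations might look roughly like this (a guess at the shapes involved, with illustrative names; not the function's actual code):

```hlsl
// Sketch contrasting the two per-step trace offsets described above (illustrative names).
float3 ComputeStepOffset(float3 CameraVectorTS, float HeightRatio, float NumSteps, bool bUseAlternate)
{
    if (!bUseAlternate)
    {
        // Current formulation: UV step proportional to xy/z, Z step the same for all angles.
        // At a fully glancing angle (z -> 0) the UV offset runs off toward infinity.
        float2 StepUV = CameraVectorTS.xy / max(CameraVectorTS.z, 1e-4) * HeightRatio / NumSteps;
        return float3(StepUV, HeightRatio / NumSteps);
    }
    // Alternate formulation: the raw tangent-space camera vector divided by the step
    // count and scaled along Z. The UV offset stays bounded at glancing angles, but
    // the ray can fail to reach into deep corners.
    float3 StepRay = CameraVectorTS / NumSteps;
    StepRay.z *= HeightRatio;
    return StepRay;
}
```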
And yes, the manual texture size problem was fixed in time, thanks to people in this thread pointing it out early enough.
So is it impossible to use this with decals? I seem to be able to achieve a similar effect using a plane, but it has to float away from the surface to accommodate the depth of the displacement.
It doesn't really make sense to use pixel depth offset with decals when you think about it. Pixel depth offset is designed to reveal accurate intersections with other geometry. So if you have a solid floor or wall, put a decal (or floating plane) on it, and then use pixel depth offset to push the pixels down, by definition you will reveal the underlying floor or wall, unless you float the decal or plane away from the surface, as you found.
If you want a decal that can appear to make a hole, I think that will require something completely different, like depth-only writes or something. I am not sure what it would be, honestly.
What do you actually need the pixel depth to change for if it's a decal? It will only be noticeable if other objects need to actually appear to go inside the hole for some reason. It should look just as good without pixel depth offset.
btw I just tried pom on decals and it works until you rotate the decal at all. That looks to be because decals do not have a valid tangent space to transform into. I will have to see if it is possible to derive the necessary tangent basis. To be clear, I am talking about getting basic parallax to work with decals, not pixel depth offset.
Good day! I have a question about the current implementation of POM in Unreal Engine 4.9.
Is it possible to use TextureSamplerParameter2D instead of TextureObject as a Heightmap Texture (which is currently T2D, so it is not compatible with float)?
Digging a bit into the HLSL code in ParallaxOcclusionMapping, I found that it is possible to use a float input (for example, only the Red channel, because that is easy to fix in any image using Photoshop or Substance Designer) instead of the HeightmapChannel (V4) mask.
Thank you.
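For illustration, the single-channel read described in the question above might look roughly like this inside the custom node (a sketch; HeightmapTexture, HeightmapSampler, and UV are placeholder names, not the function's actual variables):

```hlsl
// Sketch: read only the Red channel of the heightmap instead of dotting the
// sample against a HeightmapChannel (V4) mask.
float Height = HeightmapTexture.SampleLevel(HeightmapSampler, UV, 0).r;
```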
Thank you! I think I misunderstood exactly what pixel depth offset was for. I suppose I specifically don't want there to be intersection with the surrounding geometry, but it sounds like that's impossible, or at least complicated to achieve. What about decals that extend outward from the geometry they're applied to? I'm imagining muddy footprints and other debris on an otherwise clean floor, for instance.
Huh, I used a POM setup for decals and it worked fine as far as I remember (maybe my memory lies to me). It was quite a nice effect and I liked it, since it enhanced the look and feel of impacts and a few other decal-based effects.