I suspect Lumen will support its own screen-traced + ray-traced translucency implementation in the future. Tracing a refracted ray is not substantially different from tracing a reflected one, and now that Lumen supports multi-bounce reflections, I suspect we're a step closer to refraction.
But I also assume this is a fairly low priority, since raster translucency is fine for most cases.
If you want inverted refractions, you can write a custom shader that samples the scene at inverted and offset screen coordinates. This is how it's been done for ages. For example, The Last of Us had it 10 years ago, although it's probably been around as long as screen-UV-based refraction in general, even if most games don't bother (and most gamers wouldn't notice). You can look to the existing refraction code for guidance.
Or you can take the easy, less integrated way and just use a scene sample node with refracted UV inputs. Doing the latter will prevent cumulative refraction of layered translucency (because translucent objects don't show up in scene samples), but hopefully you're not layering translucency anyway, for reasons I've already mentioned. If you go this route, you'll still want to inspect the built-in refraction code, as there are adjustments being done beyond simply bending UVs by normal direction, to compensate for screen attributes (aspect ratio/scale/etc.) and more.
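If you go the Custom node route, here's a minimal sketch of one plausible reading of that setup; all input names are illustrative (you'd wire them in from ScreenPosition, ViewSize, and so on), and the engine's actual compensation is more involved than this:

```hlsl
// Custom node body: returns a refracted screen UV to feed a scene sample.
// Assumed inputs, wired in the material graph (names are illustrative):
//   ScreenUV : from a ScreenPosition node
//   CenterUV : the object's position projected to screen UV space
//   NormalVS : the surface normal transformed into view space
//   ViewSize : from a ViewSize node (viewport size in pixels)
//   Strength : scalar refraction strength

// "Inverted" refraction: mirror the UV around the object's screen-space
// center, like the flipped image you see through a thick lens.
float2 uv = 2.0 * CenterUV - ScreenUV;

// Bend along the projected normal; correct x by the aspect ratio so the
// offset looks uniform on non-square viewports.
float2 offset = NormalVS.xy * Strength;
offset.x *= ViewSize.y / ViewSize.x;

return uv + offset;
```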
Thanks for the great suggestions! I work in automotive, and the case where this issue pops up for me most often is headlights/tail lights/etc. Those areas almost always have layered glass objects, often 3 or 4 layers deep, and getting those to render well outside of path tracing is challenging. Do you think your second suggestion would do well in this scenario? I will have to deep dive into those topics a bit to understand them.
Getting inverted refraction working with layered translucency would best be achieved by altering the existing refraction code, because the existing code can accumulate all of the translucent layers and combine their offsets.
If you use the simple method of just sampling the scene color, other translucent objects will be invisible in that sample, making multi-layered refraction impossible.
I know the Path Tracer is not supported, but it still works, somewhat. Every object is rendered with a metallic blue reflective material. It's still useful for doing ground-truth comparisons of the shadows and lighting. Does anyone know the name of this default material Substrate is using to render all objects? Is that material modifiable?
Hello everyone, how do you make a volume material with Substrate? I use the special node for fog, but nothing works.
Works for me. The preview has never been able to show volumetric materials; that's nothing new.
You need an Exponential Height Fog actor in the scene with volumetric fog enabled (again, not a new requirement; it's always been the case). If that fails, you should make a new project without Substrate enabled and try to reproduce it, so that you can verify whether the problem is specific to Substrate volumetric materials.
Video
If not, probably best to make a new thread
Refraction doesn’t seem to work when using Unlit_BSDF in 5.2. Unlit translucency supports refraction in the legacy material system… Is this a bug or am I misunderstanding how the Unlit_BSDF node is supposed to be used?
(This is using IOR but none of the refraction methods produce any result)
Hello.
I wanted to know if it's possible to give an opacity mask to a thin translucent material while using Substrate. I can't seem to find any way or node to make this work.
Any help is appreciated.
Thanks.
@Arkiras Thanks! This looks like a bug, as unlit translucency should support refraction. I'll add that to the list of things to fix.
@Amin.castellan OpacityMask is only available for Masked materials, with Substrate ON or OFF. So I'm not sure what your use case is (unless I'm forgetting something).
Thanks for your reply.
On the subject of Refraction, is there any chance we could get some sort of control over chromatic dispersion through translucent materials? I’m not sure if that’s the correct term, but there’s an example of it (courtesy of ArtOfCode) here: Shader - Shadertoy BETA
Basically when you look through the medium, the refracted scene has a sort of rainbowish fringe around the edges. I’m not sure how physically realistic this is, but it looks cool :o
As far as I know, the standard way of dealing with this in Unreal is to just use emissive "opaque" translucency, where you sample the SceneColor 3 times (R, G, B), offset the coordinates slightly for each color channel, then merge them back together. This works well enough, but since translucency is not rendered into SceneColor, any translucent surfaces behind the object will vanish when looking through it.
Sampling the SceneColor is also one way that we used to do blurred/frosted glass in Legacy materials. Now with Substrate, we finally have a better solution for that. It would be cool to have a better solution for chromatic dispersion as well, if possible.
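For reference, the three-sample trick described above boils down to something like this sketch. SampleSceneColor() here is just a stand-in for however you actually fetch scene color (a SceneColor node per offset UV, or a scene texture lookup in a Custom node); it is not a real engine function:

```hlsl
// RGB-fringe dispersion: sample the scene three times with per-channel
// offsets, then keep one channel from each sample.
float3 ChromaticRefraction(float2 ScreenUV, float2 RefractDir, float Dispersion)
{
    // Red bends least, blue bends most (IoR rises toward shorter wavelengths).
    float r = SampleSceneColor(ScreenUV + RefractDir * (1.0 - Dispersion)).r;
    float g = SampleSceneColor(ScreenUV + RefractDir).g;
    float b = SampleSceneColor(ScreenUV + RefractDir * (1.0 + Dispersion)).b;
    return float3(r, g, b);
}
```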
The fake method you describe is actually pretty close to physically accurate, with a few catches. Specifically, the wavelength of light affects the IoR of the medium.
IoR is higher when transmitting blue light than red.
One pitfall is that, at least for physically accurate renderings, the amount that wavelength impacts dispersion is not constant across mediums. So an accurate dispersion shader would need an IoR and a coefficient.
One example of how different types of glass or transparent mediums are rated for dispersion is Abbe numbers. An Abbe number, much like the RGB fringe method, looks at the IoR at 3 wavelengths to approximate the dispersion, then just kind of connects the dots in between. The fringe method doesn't have a way to fill in those in-between values without doing loads of extra samples, so the illusion breaks down with high dispersion.
The lower the Abbe number, the greater the chromatic aberration, and generally the lower the optical quality of the medium, which is a very important factor when constructing things like lenses for cameras and eyeglasses. Conversely, when making chandeliers or decorative leaded glass (like for doors and windows), very high dispersion is often desired.
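To make the "IoR plus a coefficient" idea concrete: with Cauchy's two-term equation n(λ) = A + B/λ², a base IoR and an Abbe number are enough to solve for A and B and then evaluate a per-channel IoR. A small sketch of that derivation (my own, not from any engine code; wavelengths in micrometers):

```hlsl
// Per-channel IoR from a base IoR (n_d, measured at 587.6 nm) and an Abbe
// number V_d, via Cauchy's equation n(lambda) = A + B / lambda^2.
float3 IorPerChannel(float n_d, float V_d)
{
    const float lamD = 0.5876, lamF = 0.4861, lamC = 0.6563; // Fraunhofer d, F, C lines
    // V_d = (n_d - 1) / (n_F - n_C), and n_F - n_C = B * (1/lamF^2 - 1/lamC^2).
    float B = (n_d - 1.0) / (V_d * (1.0 / (lamF * lamF) - 1.0 / (lamC * lamC)));
    float A = n_d - B / (lamD * lamD);
    // Evaluate at 650 / 550 / 450 nm for R / G / B.
    const float3 lamRGB = float3(0.650, 0.550, 0.450);
    return A + B / (lamRGB * lamRGB);
}
```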
Since some effects are not rendered into SceneColor, is there any way to get the complete image from the previous frame (without any slow rendertarget or capture involved)?
Like just storing it in a buffer and then accessing it here to get a complete image that contains all effects, for better reflections and refractions.
I'm not sure, actually. I think you'd need to render the translucency without lighting, otherwise you would end up with ghosting/accumulation as the previous frame's lighting gets added to the new one… but I don't know, I'm not a rendering engineer or even a tech artist, just a regular artist :0
I get the feeling that this is something that's not that easily achievable ^.^ But yesterday, after seeing your attempt at chromatic aberration, I remembered that I once "created" an experimental material function (more like copy-pasted it from a Blender forum ^.^) that uses correct IOR values to create correct Fresnel effects. And not only the simple ones for plastic, but also the complex IOR for metals, with separate n and k values for red (650 nm), green (550 nm) and blue (450 nm) → 3 wavelengths = 3 different sets of values.
Those here: Refractive index of METALS - aluminium
And it turned out that this also works with Substrate, since, with some foresight, I had also added an IOR-to-specular output, which I can now connect to the F0 input.
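For anyone who wants to rebuild that function: the usual conversion from a complex IOR (n, k) to F0 is the normal-incidence conductor Fresnel formula, applied per channel. A sketch of the standard formula (not the exact function from that Blender forum):

```hlsl
// Normal-incidence reflectance (F0) from a complex IOR, per channel.
// n = real part, k = extinction coefficient, each sampled at the three
// wavelengths above (650 / 550 / 450 nm).
float3 F0FromComplexIor(float3 n, float3 k)
{
    float3 num = (n - 1.0) * (n - 1.0) + k * k;
    float3 den = (n + 1.0) * (n + 1.0) + k * k;
    return num / den;
}
```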
But that got me thinking that the real reason why we don't get chromatic aberration might simply be that UE still accepts only ONE IOR input for the whole material, where we would actually need 3 separate inputs: one for red, one for green and one for blue, for a simple material.
A complex one would probably need 7, one for each wavelength of the rainbow colors.
And a blend mode for mixing them all together that does not simply add them, but only uses the higher value from the inputs. Like if you get Color A with R 200, G 120, B 50 and Color B with R 150, G 180, B 80, then the resulting Color C should be R 200, G 180, B 80.
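In shader terms that blend mode is just a per-channel max; a tiny sketch that reproduces the numbers above:

```hlsl
// Per-channel max blend: each output channel keeps the brighter input.
// With A = (200, 120, 50) and B = (150, 180, 80) this yields (200, 180, 80).
float3 MaxBlend(float3 ColorA, float3 ColorB)
{
    return max(ColorA, ColorB);
}
```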
I guess I will experiment with that and with layered materials at the weekend, if they work with Substrate, because those are currently the only ones I can think of that could mix everything together for a regular material.
A second option would be a post process material that somehow mixes those 3-7 layers together for the transparent glass areas.
OK, I utterly failed to create a material in UE that can combine 3 separate transparent materials, each with its own IOR. ^.^
So at least here is the effect I hoped to achieve, rendered with Cinema4D: one gem rendered 3 times with 3 different materials (one for each color, each color with a slightly different IOR), then the layers got combined.
And as expected, you get your chromatic aberration, and it is slightly better, because how much the 3 colors get separated now also depends on the material thickness and shape.
Here are the 3 layers, and the gem, if anyone wants to test it with UE. Maybe one of you can create such a material.
OR Epic could just decide to give us an IOR input that can take a complete Vector3, and not just the first channel of a Vector3.
Gem.fbx (332.4 KB)
Is Substrate compatible with the City Sample? I turned it on and my UE simply crashed and is unable to open.