I have stayed out of the discussions about real-time ray tracing, mainly because time is precious, and I was only forced to look into it because people who use my assets ask whether they will work with it. Reading this thread makes me even more worried, but there are some observations and concerns I would like to state:
Facts about my usage of the engine:
- I have an asset on the Marketplace, Cloudscape Seasons, a sky system that raymarches clouds in a double-sided Alpha Composite material applied to a cube mesh that covers the entire scene, so the clouds are rendered inside it;
- I have an ocean under development using FFT, with realism in mind: a translucent material with tessellation, applied to a quad-tree set of planes;
- I have another sky system using voxels, also aiming at realism and performance.
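For context, the cloud asset above relies on the usual front-to-back raymarch accumulation, which is exactly what Alpha Composite blending is meant to receive. A minimal sketch of that scheme (all names and the toy density field are illustrative, not Cloudscape Seasons code):

```python
# Front-to-back raymarching sketch (illustrative only).
# Marches a ray through a toy density field and accumulates
# premultiplied color + coverage, as a cloud material would.

def density(p):
    """Toy density field: a soft 'cloud' sphere of radius 1 at the origin."""
    x, y, z = p
    return max(0.0, 1.0 - (x * x + y * y + z * z))

def raymarch(origin, direction, steps=64, step_size=0.05):
    color, alpha = 0.0, 0.0          # accumulated grayscale color and coverage
    for i in range(steps):
        if alpha >= 0.99:            # early out: ray is effectively opaque
            break
        p = tuple(o + d * step_size * i for o, d in zip(origin, direction))
        a = min(1.0, density(p) * step_size * 4.0)  # sample opacity this step
        # front-to-back "over" compositing (premultiplied):
        color += (1.0 - alpha) * a * 1.0            # white cloud sample
        alpha += (1.0 - alpha) * a
    return color, alpha

c, a = raymarch((0.0, 0.0, -2.0), (0.0, 0.0, 1.0))
```

The output pair (color, alpha) is premultiplied, which is why the material depends on the renderer honoring the Alpha Composite blend mode rather than re-interpreting it.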
Facts about the current implementation of real-time ray tracing, as of 4.23 Preview 8:
- it does not render reflections of translucent materials: if you apply a translucent material to a sphere and place it in front of a mirror, you won't see the sphere reflected;
- it changes the appearance of Alpha Composite materials (as shown in the video) and somehow disregards the blending mode, showing hard edges (this might be acceptable for Translucent, but not for Alpha Composite); it also appears to apply refraction (I am not sure, but check the video) and generates spherical artifacts;
- an Exponential Height Fog placed in the scene and set to volumetric fog does show up in the mirror reflection, but color changes are not reflected in the mirror image, and I can't see any difference when volumetric fog is disabled and re-enabled, which is odd. The fog disappears from the reflection if I make it invisible or remove it from the scene, so at least part of the reflection is there, which gives me hope;
- translucent materials like water: what is necessary for them to work?
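On the blend-mode point above: as I understand it (this is my reading of the blend modes, not engine code), Translucent blends with straight alpha while Alpha Composite expects the material to output premultiplied color. A small sketch of the difference, with hypothetical function names:

```python
# The two blend modes discussed above, sketched (my understanding,
# not Unreal's actual shader code).

def blend_translucent(src_rgb, src_a, dst_rgb):
    # Straight alpha: dst = src.rgb * src.a + dst * (1 - src.a)
    return tuple(s * src_a + d * (1.0 - src_a) for s, d in zip(src_rgb, dst_rgb))

def blend_alpha_composite(src_rgb_premul, src_a, dst_rgb):
    # Premultiplied: dst = src.rgb + dst * (1 - src.a)
    # (src.rgb has already been multiplied by src.a in the material)
    return tuple(s + d * (1.0 - src_a) for s, d in zip(src_rgb_premul, dst_rgb))

# The two only agree when the premultiplication was done in the material,
# which is why a renderer treating one mode as the other changes the look:
dst = (0.2, 0.2, 0.2)
straight = blend_translucent((1.0, 1.0, 1.0), 0.5, dst)
premul = blend_alpha_composite((0.5, 0.5, 0.5), 0.5, dst)  # 1.0 * 0.5 premultiplied
```

If the ray tracer falls back to straight-alpha blending for an Alpha Composite material whose outputs were authored premultiplied, the double multiplication by alpha would darken and harden the result, which could explain the edge artifacts I am seeing.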
Considerations:
- while the feature is in Beta, I understand it can be incomplete in several areas;
- it is hard to use as it is, and while I am pleased that all the demos shown so far could present it working without depending on the problematic features, real-world scenarios will be full of dependencies on features that are not ready yet, which will be hugely frustrating for everyone putting their hands on it right now.
I have a film for which real-time ray tracing would be great. The real-time part does not even concern me, since I will render shots frame by frame, but the missing features are a major concern. It would be great to have a switch at the material-instance level to let a material instance behave as it does in raster mode, but I know that is a pain. I really hope that, once out of Beta, the feature will work properly in every scenario.
Here is the video where I show what I am describing:
watch?v=BtYWxyPI-iY