I’m jumping into Unreal after many years working with several C++ frameworks and Unity, basically because of RayTracing and Virtual Production.
One thing I've dreamed about for a long time is working with a true RayTracing camera, for use in Fulldome.
I have made a Fulldome mapping software, a Cinema 4D camera for rendering, some plugins for live performances, and a crappy Unity plugin that needs to render a cubemap and stitch it into the final frame.
True interactive realtime action is only possible with a true Raytracing camera, and I feel we're almost there!
Looking at the Engine RayTracing shaders, I quickly found CreatePrimaryRay(float2 UV) inside RayTracingCommon.ush; that's the key. Changing it to convert the viewport UVs to Azimuthal Equidistant (fisheye/fulldome) or Equirectangular (360) rays was straightforward, but it didn't seem to affect the game's camera, either when playing in the viewport or when launching the game.
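For reference, the UV-to-ray conversion I mean looks roughly like this. It's a C++ sketch of just the math (the real change lives in HLSL inside RayTracingCommon.ush); the function and parameter names here are my own, not the engine's:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Azimuthal equidistant (fisheye/fulldome): UV in [0,1]^2 -> unit ray.
// ApertureDeg = 180 covers a full hemisphere dome. Points with r > 1
// fall outside the dome circle and would normally be discarded/black.
Vec3 FisheyeRay(float u, float v, float ApertureDeg = 180.0f)
{
    const float PI = 3.14159265358979f;
    float px = 2.0f * u - 1.0f;              // recenter to [-1, 1]
    float py = 2.0f * v - 1.0f;
    float r = std::sqrt(px * px + py * py);  // distance from image center
    float theta = r * (ApertureDeg * 0.5f) * PI / 180.0f; // angle off the view axis
    float phi = std::atan2(py, px);          // angle around the view axis
    return { std::sin(theta) * std::cos(phi),
             std::sin(theta) * std::sin(phi),
             std::cos(theta) };              // +Z is camera forward here
}

// Equirectangular (360): UV -> unit ray via longitude/latitude.
Vec3 EquirectRay(float u, float v)
{
    const float PI = 3.14159265358979f;
    float lon = (u - 0.5f) * 2.0f * PI;      // longitude, -PI..PI
    float lat = (0.5f - v) * PI;             // latitude, -PI/2..PI/2
    return { std::cos(lat) * std::sin(lon),
             std::sin(lat),
             std::cos(lat) * std::cos(lon) };
}
```

In both projections the center of the image (UV 0.5, 0.5) maps to the camera's forward axis, which is an easy sanity check when porting this into the shader.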
But I just found that my rays do work, only when I change the view mode to Path Tracing or RayTracing Debug, as you can see below.
So my question… doesn't the gameplay camera use the raytracing shaders? How can they be different?
Since I'm an Unreal noob, I also don't understand why the third person camera that is spawned in the scene when I hit Play does not show the same content as what's displayed on the screen.
There’s some camera level/abstraction/trickery in here that I’m not grasping…
What I want is for the actual game camera to display the same projection as the Path Tracing/RayTracing views.
Maybe there’s another shader function somewhere that converts the Frustum to Rays?
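To be clear about what I mean by converting the frustum to rays: a raster camera implicitly defines one ray per pixel by unprojecting the pixel's NDC position through the inverse view-projection matrix. A hedged C++ sketch of that idea (my own names, not Unreal's actual code, and assuming OpenGL-style NDC with z in [-1, 1]):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Multiply column vector (x, y, z, w) by a row-major 4x4 matrix.
static void MulPoint(const float M[16], float x, float y, float z, float w,
                     float out[4])
{
    for (int r = 0; r < 4; ++r)
        out[r] = M[r*4+0]*x + M[r*4+1]*y + M[r*4+2]*z + M[r*4+3]*w;
}

// Per-pixel ray from a raster camera's frustum: unproject the pixel's
// NDC position on the near and far planes through the inverse
// view-projection matrix; the ray direction runs between the two points.
Vec3 FrustumRay(const float InvViewProj[16], float u, float v)
{
    float ndcX = 2.0f * u - 1.0f;
    float ndcY = 1.0f - 2.0f * v;  // flip: UV origin is top-left
    float nearP[4], farP[4];
    MulPoint(InvViewProj, ndcX, ndcY, -1.0f, 1.0f, nearP); // near plane
    MulPoint(InvViewProj, ndcX, ndcY,  1.0f, 1.0f, farP);  // far plane
    for (int i = 0; i < 3; ++i) {                          // perspective divide
        nearP[i] /= nearP[3];
        farP[i]  /= farP[3];
    }
    float dx = farP[0] - nearP[0];
    float dy = farP[1] - nearP[1];
    float dz = farP[2] - nearP[2];
    float len = std::sqrt(dx*dx + dy*dy + dz*dz);
    return { dx / len, dy / len, dz / len };
}
```

If the gameplay path builds rays this way from the projection matrix rather than calling CreatePrimaryRay, that would explain why my UV remapping only shows up in the Path Tracing and debug views.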
Before anyone asks: my project is properly configured for RayTracing, following the docs, and I have an RTX card.