XR (LED Wall) Stage Setup

Hi, I am working with an LED wall (currently flat, but soon to be curved) and am having difficulty understanding the role of nDisplay in the workflow. Our goal is to use camera tracking so that the virtual camera follows the real camera’s movements and the environment is projected on the LED wall behind the actor. We want the camera to capture what it actually sees, rather than compositing the view in real time. I know there needs to be some sort of projection correction to compensate for the fact that, as the camera moves, it is looking at a flat surface (the LED wall) from an off angle, which would make the captured image look slightly distorted. My main question: does nDisplay, with its inner and outer frustums, correct this distortion in any way? Or is that what software like Disguise takes care of? We are looking for someone who could consult with our team on how to configure Unreal and nDisplay for our purposes. We have a Vive Mars CamTrack system and could also get access to Mo-Sys if needed. I look forward to any advice!

Have you applied Camera Calibration in nDisplay?
It might be the reason your image is distorted; you can find more details in this documentation.
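To make the off-axis correction concrete, here is a minimal sketch in the legacy text-based nDisplay config format (UE 4.2x; newer versions use an nDisplay Config asset edited in the editor instead). All IDs, the address, and the dimensions are illustrative placeholders, not a drop-in config:

```ini
[info] version="23"

; One render node driving one window on the machine feeding the wall
[cluster_node] id="node_wall" addr="192.168.0.101" window="wnd_wall" master="true"
[window] id="wnd_wall" fullscreen="true" viewports="vp_wall"

; The viewport fills the wall's output and uses the "simple" projection policy
[viewport] id="vp_wall" x="0" y="0" width="3840" height="2160" projection="proj_wall"
[projection] id="proj_wall" type="simple" screen="scr_wall"

; The screen's position and physical size in tracking space (meters) are what
; let nDisplay compute an off-axis frustum as the tracked camera moves
[screen] id="scr_wall" loc="X=2,Y=0,Z=1" rot="P=0,Y=0,R=0" size="X=4,Y=2.25"

; Camera whose transform is driven by the tracking system
[camera] id="cam_tracked" loc="X=0,Y=0,Z=1.7"
```

The relevant idea for the question above: given the tracked camera position and the physical screen geometry, nDisplay's `simple` projection policy renders an off-axis (asymmetric) frustum onto the wall, which is exactly the correction for viewing a flat surface at an angle; the ICVFX inner frustum then renders the region behind the real camera's view on top of that outer projection.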