Mapping a real-life camera perspective

Hey guys,

we are a team of Interaction Design students, and we want to use Unreal Engine as a 3D game engine in combination with a Pepper's Ghost installation ([here is an example][1]). The idea is to use it as an augmented reality layer over a real, 3D-printed room with walls, into which the engine projects information and particles. To make that work, we want to match the perspective of the 3D print as exactly as possible.

As our project progressed, we ran into some problems, so we tried a simple calibration using a DIN A3 page printed with a chessboard pattern [(video)][4]. We realized that with increasing perspective angle, the engine distorts the proportions of the individual chess squares. We have changed the FOV and tried different angles. Is there any way to use a physically correct camera, in terms of an eye-like field of view and focal length? We want the in-engine perspective to match reality. How can we achieve the most correct perspective? I hope I have described our problem clearly.
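On the "physically correct camera" point: for an ideal pinhole camera, field of view follows directly from focal length and sensor width. A small sketch of that relationship (the function name and the full-frame 36 mm default are my own choices for illustration, not anything from Unreal):

```python
import math

def fov_from_focal_length(focal_length_mm, sensor_width_mm=36.0):
    """Horizontal FOV (degrees) of an ideal pinhole camera,
    given focal length and sensor width in the same units."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# A 50 mm lens on a full-frame (36 mm wide) sensor:
print(round(fov_from_focal_length(50.0), 1))  # -> 39.6
```

So if you know which lens/eye geometry you want to emulate, you can derive the FOV to enter into the engine's camera rather than tuning it by hand.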

Thank you for your help

As we progressed, we found out that Unreal Engine uses horizontal FOV, which explains the vertical distortion of the chess squares that should keep their 1:1 aspect ratio. We tried changing the aspect ratio of the camera, but that turned out to be a dead end. Our setup is now a little different: we use the Third Person Blueprint preset, but instead of the skeletal mesh we use an altered cube with a chessboard texture, rotate it, and vary the FOV while adjusting the Spring Arm length, which adapts according to this formula. So our current question is:
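The post doesn't quote the spring-arm formula itself, but a common relationship for this kind of dolly-zoom compensation (keeping an object the same apparent size on screen while the FOV changes) is distance = size / (2·tan(FOV/2)). A hedged sketch of what such a formula might look like:

```python
import math

def spring_arm_length(object_width, hfov_deg):
    """Camera distance at which an object of the given width
    exactly spans the horizontal field of view."""
    return object_width / (2 * math.tan(math.radians(hfov_deg) / 2))

# A 100 cm wide cube filling a 90 degree horizontal FOV:
print(round(spring_arm_length(100.0, 90.0), 1))  # -> 50.0
```

Evaluating this per frame as the FOV changes keeps the cube's on-screen width constant, which matches the described behavior of a spring arm "adapting" to the FOV.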

Is there any way to compensate for the distortion caused by the horizontal FOV, or to use a vertical FOV instead, so that the chess pattern keeps its square proportions?
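One way to compensate without engine changes: pick the vertical FOV you actually want, then convert it to the horizontal FOV the engine expects, using the viewport aspect ratio. The conversion is standard trigonometry (the function name is mine; the 16:9 aspect is just an example):

```python
import math

def hfov_for_vfov(vfov_deg, aspect_ratio):
    """Horizontal FOV (degrees) that produces the given vertical FOV
    at the given aspect ratio (width / height)."""
    vfov = math.radians(vfov_deg)
    return math.degrees(2 * math.atan(math.tan(vfov / 2) * aspect_ratio))

# Keep a 60 degree vertical FOV on a 16:9 viewport:
print(round(hfov_for_vfov(60.0, 16 / 9), 1))  # -> 91.5
```

Feeding the result into the camera's FOV each time the viewport aspect changes keeps the vertical FOV, and therefore the squares' vertical proportions, stable.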

Hi guys,
we’re having the same problem, have you come to any kind of solution?