Camera perspective mapping in real life

Hey guys,

We are a team of interaction design students, and we want to use Unreal Engine as the 3D game engine behind a Pepper’s Ghost installation. The idea is an augmented-reality layer for our 3D room: the engine is supposed to project information and particles into a real, 3D-printed room with walls.

For that to work, we need to map the perspective of the 3D print as exactly as possible. As our project progressed we ran into problems, so we tried a rough calibration with a simple DIN A3 page carrying a chessboard pattern (I will post a video soon). We realized that with an increasing perspective angle the engine distorts the proportions of the individual chess squares. We have since changed the FOV and tried different angles. Is there any way to use a physically correct camera in terms of an eye-like field of view and focal length? We want to preserve the real-world perspective. How can we achieve the most correct perspective? I hope I have described our problem clearly. How would you approach it?

Thank you for your help

The easiest way to do it is to place the virtual camera where the user’s head will be in real space, and then mask out the area surrounding the virtual “window” into the cage, so your projector shows only the region that is the window into the cage.
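A minimal sketch of that first approach in UE4 C++, assuming a tracked head position is available from somewhere. GetTrackedHeadLocation(), RoomOriginInWorld and WindowCenterInWorld are hypothetical placeholders for your own tracker and room calibration, and this won't compile outside a UE project:

```cpp
// Sketch only. Assumes an actor class with a UCameraComponent* member
// called Camera, plus the hypothetical members noted above.
#include "GameFramework/Actor.h"
#include "Camera/CameraComponent.h"

void APeppersGhostCamera::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    // Tracked head position in room coordinates (cm). With 1 uu == 1 cm,
    // the virtual room can be scaled 1:1 to the 3D print.
    const FVector HeadInRoom = GetTrackedHeadLocation(); // hypothetical

    Camera->SetWorldLocation(RoomOriginInWorld + HeadInRoom);

    // Keep the camera aimed at the physical "window" into the cage.
    const FVector ToWindow = WindowCenterInWorld - Camera->GetComponentLocation();
    Camera->SetWorldRotation(ToWindow.Rotation());
}
```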

The harder (but more correct) way is to change the projection matrix of the camera in real time to match the window into your cage. Here’s an example in Unity code (but the math should carry over):
http://en.wikibooks.org/wiki/Cg_Programming/Unity/Projection_for_Virtual_Reality
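For reference, the core of the math on that page (Robert Kooima’s “generalized perspective projection”) looks like this as a plain C++ sketch. It uses OpenGL-style column-major conventions; mapping the result onto UE4’s FMatrix (row-major, reversed Z) is an extra step not shown here. pa, pb and pc are the lower-left, lower-right and upper-left corners of the physical window, pe is the eye position, all in the same coordinate frame:

```cpp
// Off-axis ("generalized") perspective projection after Kooima -- the same
// math as the Unity example linked above. Plain C++, OpenGL column-major.
#include <cmath>

struct Vec3 { float x, y, z; };
static Vec3  Sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3  Cross(Vec3 a, Vec3 b) { return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x}; }
static float Dot(Vec3 a, Vec3 b)   { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3  Norm(Vec3 v)          { float l = std::sqrt(Dot(v, v)); return {v.x/l, v.y/l, v.z/l}; }

// n, f: near/far clip distances. out: column-major 4x4 view-projection.
void OffAxisProjection(Vec3 pa, Vec3 pb, Vec3 pc, Vec3 pe,
                       float n, float f, float out[16])
{
    Vec3 vr = Norm(Sub(pb, pa));     // screen right axis
    Vec3 vu = Norm(Sub(pc, pa));     // screen up axis
    Vec3 vn = Norm(Cross(vr, vu));   // screen normal, pointing at the eye

    Vec3 va = Sub(pa, pe), vb = Sub(pb, pe), vc = Sub(pc, pe);
    float d = -Dot(va, vn);          // eye-to-screen-plane distance

    // Frustum extents on the near plane -- this is what makes it off-axis.
    float l = Dot(vr, va) * n / d;
    float r = Dot(vr, vb) * n / d;
    float b = Dot(vu, va) * n / d;
    float t = Dot(vu, vc) * n / d;

    // P: equivalent of glFrustum(l, r, b, t, n, f).
    float P[16] = {0};
    P[0]  = 2*n / (r - l);
    P[5]  = 2*n / (t - b);
    P[8]  = (r + l) / (r - l);
    P[9]  = (t + b) / (t - b);
    P[10] = -(f + n) / (f - n);
    P[11] = -1;
    P[14] = -2*f*n / (f - n);

    // MT: rotate world into screen space and move the eye to the origin.
    float MT[16] = {0};
    MT[0] = vr.x; MT[4] = vr.y; MT[8]  = vr.z; MT[12] = -Dot(vr, pe);
    MT[1] = vu.x; MT[5] = vu.y; MT[9]  = vu.z; MT[13] = -Dot(vu, pe);
    MT[2] = vn.x; MT[6] = vn.y; MT[10] = vn.z; MT[14] = -Dot(vn, pe);
    MT[15] = 1;

    // out = P * MT (column-major matrix multiply).
    for (int c = 0; c < 4; ++c)
        for (int r2 = 0; r2 < 4; ++r2) {
            out[c*4 + r2] = 0;
            for (int k = 0; k < 4; ++k)
                out[c*4 + r2] += P[k*4 + r2] * MT[c*4 + k];
        }
}
```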

Either way will require head tracking (or a relatively fixed head position) to look correct.

If you want it to work without head tracking, just lay the reflector glass flat on top of the model and make the UE camera top-down orthographic. Unfortunately, it will look a little… 2D.
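If you go that route, the switch itself is only a couple of calls on the camera component. A sketch, assuming an actor class with a UCameraComponent* member called Camera (the class name and OrthoWidth value are illustrative):

```cpp
#include "Camera/CameraComponent.h"

void AModelViewer::SetupTopDownCamera()
{
    Camera->SetProjectionMode(ECameraProjectionMode::Orthographic);
    Camera->SetOrthoWidth(512.f); // match the physical model's width in uu
    Camera->SetWorldRotation(FRotator(-90.f, 0.f, 0.f)); // look straight down
}
```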

Head tracking with the 45-degree reflector is what will bring it to life; the video you linked gets away without it because 1) it was poorly aligned to begin with, and 2) the camera doesn’t move, so you can’t see that everything is confined to a plane.

Thank you for your helpful answer. We also posted on the AnswerHub, which left us quite desperate because nobody could relate to our problem.
Yes, the next step would be head tracking, but first we wanted to get the overall perspective right. While trying to fix it, we realized that UE works with a horizontal FOV, which is why it distorts the chess pattern and does not maintain its aspect ratio.

Since the engine uses a horizontal FOV, the vertical distortion of the chess squares (which should keep their 1:1 aspect ratio) makes sense. We tried changing the camera’s aspect ratio to compensate, but that turned out to be a dead end. Our setup is now slightly different: we start from the Third Person Blueprint preset, but instead of a skeletal mesh we rotate an altered cube with a chess texture and vary the FOV while the Spring Arm length adapts with this formula. So our current question is:

Is there any way to compensate for the distortion caused by the horizontal FOV, or to use a vertical FOV instead, so that the chess pattern keeps its square proportions?
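For reference, as far as we understand it, horizontal and vertical FOV are related through the aspect ratio by tan(vFOV/2) = tan(hFOV/2) / (w/h), so in principle one could compute the horizontal FOV that yields a desired vertical FOV. A small sketch of that conversion (plain C++, nothing engine-specific):

```cpp
#include <cmath>

// If the engine interprets FieldOfView as horizontal, hit a desired
// *vertical* FOV by converting via tan(vFOV/2) = tan(hFOV/2) / aspect.
float HorizontalFromVerticalFOV(float VerticalFOVDeg, float AspectWOverH)
{
    const float Pi = 3.14159265f;
    const float VHalfRad = VerticalFOVDeg * 0.5f * Pi / 180.f;
    const float HHalfRad = std::atan(std::tan(VHalfRad) * AspectWOverH);
    return HHalfRad * 2.f * 180.f / Pi;
}

// Example: a 60 degree vertical FOV on a 16:9 viewport needs
// HorizontalFromVerticalFOV(60.f, 16.f / 9.f), roughly 91.5 degrees.
```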

This topic is way over my head, but since you’re talking about FOV distortion, maybe this helps: Engine Features Preview 11/14 - Announcements and Releases - Unreal Engine Forums
You can activate it with the r.Upscale.Cylinder 1 console command (engine version 4.7 and above, AFAIK).

Thank you for your answer. Unfortunately, the r.Upscale.Cylinder command does not help.

Maybe by manipulating the camera matrix and the viewing frustum we could remove the horizontal/vertical distortion and simulate a nearly eye-like camera.

Is it possible to manipulate the frustum at all?

Hey zalo,

How would you approach this problem in UE4, applying that script? Sorry, we are beginners with very little C++ experience and don’t know where to begin (we are using Blueprints, not C++ at all).

I’ve been doing a bit of this for a TV show. I’m using an Oculus as a tracking device, but in the studio I’ll be taking data from a bespoke tracking system. I’ve got a plugin that can modify the projection matrix, and you get some pretty amazing results (amazing through a camera, anyway); the effect is lessened a little with just head tracking, but it’s still there.
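For anyone who wants to roll their own in UE4: the usual place to hook in is a custom ULocalPlayer. A rough sketch follows; the exact virtual signature of GetProjectionData differs between engine versions, so check LocalPlayer.h in yours, and BuildOffAxisMatrix() is a hypothetical stand-in for the off-axis math posted earlier in the thread. The custom class is selected under Project Settings > General Settings > Local Player Class.

```cpp
#include "Engine/LocalPlayer.h"
#include "OffAxisLocalPlayer.generated.h"

// Hypothetical helper that builds the projection matrix from the tracked
// head position and the physical window corners (see earlier in the thread).
FMatrix BuildOffAxisMatrix();

UCLASS()
class UOffAxisLocalPlayer : public ULocalPlayer
{
    GENERATED_BODY()

public:
    // Signature as in 4.x-era engines -- verify against your LocalPlayer.h.
    virtual bool GetProjectionData(FViewport* Viewport,
                                   EStereoscopicPass StereoPass,
                                   FSceneViewProjectionData& ProjectionData) const override
    {
        const bool bOk = Super::GetProjectionData(Viewport, StereoPass, ProjectionData);
        if (bOk)
        {
            // Swap the stock matrix for the off-axis one.
            ProjectionData.ProjectionMatrix = BuildOffAxisMatrix();
        }
        return bOk;
    }
};
```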

We have a similar problem where we need to adjust the frustum. Is such a plugin freely available? If so, where can I find it?