I am trying to understand the FFakeStereoRenderingDevice class in order to render a side-by-side (SBS) game that can then be played on smart TVs supporting SBS playback. I am using polarized glasses to see the stereo output. After building with the -emulateStereo flag, I noticed that the stereo separation is far greater than expected. It also seems that this separation increases with depth in the 3D scene: the farther an object is from the camera, the greater the separation. Is this expected? From what I understand, to view the scene cleanly on such TVs I would need only slight separation; in fact, I expected separation to decrease with depth, with close objects having more separation, giving the perception of 3D when viewed through the glasses.
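For reference, here is how I have been reasoning about parallax versus depth. This is my own standalone sketch of textbook stereo geometry, not engine code, and the names are mine:

```cpp
#include <cmath>

// My own sketch, not engine code: screen-space parallax for an offset stereo
// pair with full interaxial separation `eyeSep` and the zero-parallax
// (convergence) plane at distance `conv`. Objects on the convergence plane
// have zero parallax; parallax grows toward +eyeSep as depth -> infinity and
// goes negative (crossed, "pop-out") in front of the plane.
double ScreenParallax(double depth, double eyeSep, double conv)
{
    return eyeSep * (1.0 - conv / depth);
}
```

If this model is right, separation increasing with depth is normal; the important part is that it saturates at the eye separation rather than growing without bound, and that objects in front of the convergence plane get the negative parallax I was expecting.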
I made some basic changes to the code in the FFakeStereoRenderingDevice class: the input FOV passed to GetStereoProjectionMatrix() is used to calculate HalfFov, and the width/height are set to maintain a 16:9 aspect ratio (1920x1080). With these changes I am seeing the issue described above. Please refer to the attached image.
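To make my setup concrete, here is a standalone sketch of the scaling terms I believe the device builds its projection matrix from (names and layout recalled from the engine source, so please correct me if I have them wrong):

```cpp
#include <cmath>

// Standalone sketch of the scaling terms the fake stereo device appears to
// feed into its projection matrix (recalled from memory of the engine source;
// verify against your engine version). XS/YS are the x/y focal scales; YS
// bakes in the aspect ratio, which is why I pin width/height to 1920x1080
// to keep 16:9.
struct StereoProjTerms { float XS; float YS; float CenterOffset; };

StereoProjTerms MakeStereoProjTerms(float HalfFovRad, float Width, float Height,
                                    float ProjectionCenterOffset, bool bLeftEye)
{
    StereoProjTerms T;
    T.XS = 1.0f / std::tan(HalfFovRad);
    T.YS = Width / std::tan(HalfFovRad) / Height; // == XS * (Width / Height)
    // The per-eye offset is applied after projection as a translation,
    // shifting the projection center left/right for each eye.
    T.CenterOffset = bLeftEye ? ProjectionCenterOffset : -ProjectionCenterOffset;
    return T;
}
```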
As you can see, the difference is evident if you use the crosshair as a reference. I am also not sure about the variable ProjectionCenterOffset = 0.151976421f. It is used as a translation value during the projection matrix calculation, but I am not sure how this particular value was chosen. Can someone shed some light on this? Am I right in understanding that the separation should not vary this much with depth, since large disparities make the stereo hard to resolve when viewed through polarized glasses?
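My best guess at where a constant like this could come from, purely an assumption on my part and not taken from the engine source:

```cpp
#include <cmath>

// Assumption, not from the engine source: in off-axis stereo, the NDC shift
// of the projection center that places the zero-parallax plane at distance
// `Conv` is HalfEyeSep / (Conv * tan(HalfFovRad)), since the frustum's
// half-width at distance d is d * tan(HalfFovRad).
float ProjectionCenterOffsetFor(float HalfEyeSep, float Conv, float HalfFovRad)
{
    return HalfEyeSep / (Conv * std::tan(HalfFovRad));
}
```

If this is the relationship the engine authors used, then with the EyeOffset of 3.2 the quoted 0.151976421f would correspond to a convergence plane roughly 10-11 world units out, but that is speculation on my part and I would appreciate confirmation.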
EDIT: Now that I am reading CalculateStereoViewOffset(), it looks like EyeOffset is applied in the reverse manner from what I would expect:
```cpp
float EyeOffset = 3.2000005f;
const float PassOffset = (StereoPassType == eSSP_LEFT_EYE) ? EyeOffset : -EyeOffset;
```
Shouldn't this be the other way around?
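To sanity-check the sign, I wrote a tiny standalone repro of where I think the offset ends up in world space. This is my simplified version of what CalculateStereoViewOffset() seems to do, not engine code, so please verify against your engine version:

```cpp
#include <cmath>

struct Vec3 { float X; float Y; float Z; };

// My simplified, standalone model (an assumption): the local offset
// (0, PassOffset, 0) is rotated by the view rotation and added to the view
// location. In UE's Z-up frame, local +Y is the viewer's right, so a
// positive PassOffset moves that eye's camera to the right.
Vec3 ApplyEyeOffset(Vec3 ViewLocation, float YawDeg, bool bLeftEye, float EyeOffset)
{
    const float PassOffset = bLeftEye ? EyeOffset : -EyeOffset; // the convention as quoted above
    const float Yaw = YawDeg * 3.14159265f / 180.0f;
    // Right vector of a yaw-only rotation: (-sin(yaw), cos(yaw), 0).
    ViewLocation.X += -std::sin(Yaw) * PassOffset;
    ViewLocation.Y +=  std::cos(Yaw) * PassOffset;
    return ViewLocation;
}
```

With this convention the left-eye pass camera ends up on the viewer's right, which is what struck me as backwards. But since the projection-center offset is also mirrored per eye, the two signs may be compensating for each other, so when experimenting I plan to flip both together rather than just one.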