We would like to have two computers each render one channel of a stereoscopic image. The two images will then be combined into a stereo image by the projector. Is there a way to have each computer render only one channel using nDisplay? The first computer would render only the images for the left eye, and the second computer only the images for the right eye.
We have done this with our own rendering software, and it works great. Of course, the projector needs to be able to accept the two channels (left and right) on two inputs and combine them accordingly, which our projector can do. This way we can effectively double rendering performance by using two computers.
However, it seems that this hardware setup is not directly supported by nDisplay, or am I wrong? (I hope I'm wrong…)
Any ideas would be greatly appreciated!
If you are still working on this issue:
There is no way to get this behavior out of the box in the nDisplay Launcher; however, you can achieve what you want via the config file. I used this method to build a full four-screen surround CAVE with 4 viewports × 2, one per eye.
Just define two nodes, each with one screen and one viewport. When you get to the VR section, define your projection screen once (X, Y, Z position, distance from the center, etc.) and link it to one of the viewports. Then do the same for the other eye: define a new scene node with the previous one as its parent and shift the corresponding coordinate by the standard or a custom eye distance.
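A rough sketch of what this could look like in the legacy text-based nDisplay .cfg format (the exact field names vary between engine versions, and all addresses, IDs, sizes, and coordinates below are placeholders, not values from this thread):

```
# Two cluster nodes, one per eye; each drives one viewport/screen.
[cluster_node] id=node_left  addr=192.168.0.101 screen=screen_left  viewport=vp_left  master=true
[cluster_node] id=node_right addr=192.168.0.102 screen=screen_right viewport=vp_right

# Scene nodes: eye_right is parented to eye_left and shifted by the eye distance.
[scene_node] id=eye_left  loc="X=0,Y=0,Z=0"
[scene_node] id=eye_right loc="X=0,Y=0.064,Z=0" parent=eye_left

# The same physical projection screen, defined once per eye and linked to a viewport.
[screen] id=screen_left  loc="X=1.5,Y=0,Z=0" size="X=3,Y=1.68" parent=eye_left
[screen] id=screen_right loc="X=1.5,Y=0,Z=0" size="X=3,Y=1.68" parent=eye_right

[viewport] id=vp_left  x=0 y=0 width=1920 height=1080
[viewport] id=vp_right x=0 y=0 width=1920 height=1080
```

Parenting the right-eye screen to a scene node offset by the eye distance is what produces the per-node view offset described above.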
Actually, I just noticed a mistake in my setup: nDisplay only supports one camera, looking from different positions. While this approach might look correct at first, it does not actually give you two cameras looking in the same direction, spaced eye_distance centimeters apart, but rather the same camera pointing in a slightly different direction for each eye.
As far as I know, the stereoscopic effect can only be achieved through the nDisplay Launcher, which is a problem for both your situation and mine. So if anyone has a solution to this problem, please let us know.
I am currently working on the Unreal source code. So far I have gotten a stereoscopic view by adjusting X and SizeX in DisplayClusterSideBySide.cpp. However, this is not really a fix, because each node still renders both eyes, so there is no performance gain from using two cluster nodes per screen. I also set eye_swap=true on one of the nodes.
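For reference, in the legacy .cfg format the eye swap and eye separation are set on the camera entry; a sketch, assuming the field names used by older nDisplay versions (they may differ in yours):

```
# eye_swap flips the left/right eye output on this node;
# eye_dist is the interocular distance in meters.
[camera] id=camera_static loc="X=0,Y=0,Z=0" eye_swap=true eye_dist=0.064
```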
It might serve as a hotfix for the time being, though.
If someone is already working on this and knows exactly what to change to fix it, I am all ears.