In Unreal Engine, is there a way to render only one specific color channel (red, green, or blue) and reduce the color space/palette to 2 bits, i.e. only 4 possible shades of red, without resorting to post-process, since the whole point is to reduce rendering computation? To further explain what I intend to do: I will have 4 completely separate machines, 3 clients and 1 dedicated server. Each of the 3 clients will render the single channel assigned to it, for example machine 1: red at 2 bits, machine 2: green at 2 bits, machine 3: blue at 2 bits, with the dedicated server making sure all 3 machines stay in sync.
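To make it concrete, here is the kind of per-machine masking and quantization I mean, as a rough standalone sketch (plain C++, all names made up by me), which I assume would have to happen somewhere in the render path rather than in post:

```cpp
#include <algorithm>
#include <cmath>

// Rough sketch only: keep one colour channel per machine and quantize it
// to 2 bits (4 levels). Names and structure are just for illustration.
struct RGB { float r, g, b; };

// Map a 0..1 value onto one of 4 evenly spaced levels (0, 1/3, 2/3, 1).
float QuantizeTo2Bits(float v)
{
    v = std::clamp(v, 0.0f, 1.0f);
    return std::round(v * 3.0f) / 3.0f;
}

// Zero out every channel except the one this machine is assigned to
// (0 = red, 1 = green, 2 = blue), and quantize the remaining one.
RGB MaskAndQuantize(RGB in, int channel)
{
    RGB out{0.0f, 0.0f, 0.0f};
    if (channel == 0) out.r = QuantizeTo2Bits(in.r);
    if (channel == 1) out.g = QuantizeTo2Bits(in.g);
    if (channel == 2) out.b = QuantizeTo2Bits(in.b);
    return out;
}
```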
And yes, I know you will ask: why just 2 bits, what's the point if you end up with only 64 colors? There is a good reason: I need very high FPS, in the range of 2000+, and my display (a DMD) has this limit, so that's the reason for this strange question!
Maybe I should clarify that this is intended for a volumetric display, hence the high frame rate.
The only option I can find is the nDisplay plugin, which would be great for handling the cluster of machines, but there is no option to control color space in the nDisplay config file. Am I on the right track here, or can someone suggest a better option? I have to reduce the output to 1 or 2 bits on each of the machines and make sure each one renders its specific color channel; since all 3 projectors project onto the same moving surface, the channels will combine into proper colors, just like old RGB projectors.
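For illustration, this is roughly how I picture each cluster node picking up its channel assignment, with a made-up -channel= launch flag (it is not an nDisplay setting, which is exactly my problem):

```cpp
#include <cstring>
#include <string>

// Hypothetical: each node is launched as e.g.  MyProject.exe -channel=0
// (0 = red, 1 = green, 2 = blue). The flag name is invented; whatever does
// the masking/quantization would need to read a per-node setting like this.
int ParseChannelArg(int argc, char** argv)
{
    const char* prefix = "-channel=";
    for (int i = 1; i < argc; ++i)
        if (std::strncmp(argv[i], prefix, std::strlen(prefix)) == 0)
            return std::stoi(argv[i] + std::strlen(prefix));
    return -1; // not specified
}
```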
The fourth machine would then become the master, I guess, with the VRPN server.
So actually, to be more realistic, we need to go even further down in resolution, somewhere in the 240p range, and that should be fine for this use case anyway since it is all going to be voxel based (a blocky-looking hologram).
Just wanted to clarify something new. I was also confused about the frame rate and how to approach it, but a friend of mine working on a similar concept explained that Unreal will actually be rendering at a very low frame rate, more like 24-30 FPS max. In his own project, however, they used OpenGL (Mesa) directly to slice those frames into 8-bit pattern images (think CT scan) and then display them rapidly at rates of 4000 FPS on 3 or more projectors, one for every channel. In their case they used a rotating screen; I am planning to go with a reciprocating display that moves up and down, and my output might be the real bottleneck, since I am dealing with slicing each frame into approximately 200 slices.
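To show what I mean by slicing, here is a rough standalone sketch (plain C++, all names and sizes are my own assumptions) of taking a voxelised single-channel 2-bit volume and unpacking one slice into the packed 1-bit pattern images a DMD sequence wants:

```cpp
#include <cstdint>
#include <vector>

// Sketch only: treat one rendered/voxelised frame as a w*h*depth volume of
// 2-bit intensities (one colour channel per machine).
struct Volume
{
    int w, h, depth;               // e.g. 320 x 240 x ~200 slices
    std::vector<uint8_t> voxels;   // one 2-bit intensity stored per byte
    uint8_t At(int x, int y, int z) const
    {
        return voxels[(z * h + y) * w + x] & 0x3;
    }
};

// Extract bit-plane `plane` (0 = LSB, 1 = MSB) of slice `z` as a packed
// 1-bit-per-pixel image, which is the kind of binary pattern a DMD flashes.
std::vector<uint8_t> SliceBitPlane(const Volume& v, int z, int plane)
{
    std::vector<uint8_t> bits((v.w * v.h + 7) / 8, 0);
    for (int y = 0; y < v.h; ++y)
        for (int x = 0; x < v.w; ++x)
        {
            const int idx = y * v.w + x;
            if ((v.At(x, y, z) >> plane) & 1)
                bits[idx / 8] |= static_cast<uint8_t>(1u << (idx % 8));
        }
    return bits;
}
```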
So, since I don't have the luxury of using ASICs or FPGAs to handle that part, is it possible to do this through the renderer when using nDisplay? I guess even post-processing is an option now, then.
Wouldn't the low frame rate required now even out the performance hit of slicing every one of the 24 or 30 frames?
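Rough numbers, using my own figures above as assumptions (24-30 rendered FPS, roughly 200 slices per frame):

```cpp
#include <cstdio>

// Back-of-envelope slice throughput from the figures above.
int main()
{
    const int renderFpsLow = 24, renderFpsHigh = 30;
    const int slicesPerFrame = 200;

    // 24 * 200 = 4800 and 30 * 200 = 6000 slice images per second,
    // before counting any extra bit-planes per slice.
    std::printf("slice images per second: %d - %d\n",
                renderFpsLow * slicesPerFrame,
                renderFpsHigh * slicesPerFrame);
    return 0;
}
```

So even at 24-30 rendered FPS the output side still has to push several thousand slice images a second, which is why I suspect the slicing/output stage rather than the Unreal render itself is where the real cost sits.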