Is nDisplay suitable for creating a LAN multiplayer game?

Hi,

I am looking over the docs to see whether nDisplay is suitable for a LAN game.

Concept:
It's a third-person game where we hook all the input up to the nDisplay master node, and each nDisplay client node tracks a different player's POV. We are trying to stay away from server authority and multiplayer programming, so instead we are building a single-player game and using nDisplay to get multiple players' views while retaining the visual fidelity.

Problem:
The demo videos are mostly CAVE/dome-like productions. I am wondering whether it is possible for each client node to render a separate third-person player's POV, rather than being locked to one uniform transform.

Also, based on https://docs.unrealengine.com/en-US/Engine/Rendering/nDisplay/QuickStart/index.html, it seems like the transform somehow needs to be networked. That made me wonder whether it is still networking the game logic rather than streaming only the image. In that case, doesn't the whole game still need to be networked?

Again, my goal is simply to build a LAN game with multiple third-person character cameras and distribute each camera's workload onto a different GPU, instead of going the multiplayer route.

It's a simple concept. I hope someone can shed light on whether nDisplay makes this possible, or whether it is only useful for syncing cameras locked to one uniform transform in the world (like a 360-degree camera system).

Thanks!

nDisplay is not what you want. nDisplay lets you distribute the rendering of one image across multiple computers over a network.

What you would probably need to do is run multiple instances of the game assigned to different GPUs, and then have those instances still communicate via networking, but connecting locally on the same machine.
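
Untested, but something like this little launcher is what I mean: one listen server plus a few local clients, each pinned to a GPU with Unreal's `-graphicsadapter` command-line switch. The exe name, map name, resolutions, and adapter indices below are just placeholders; swap in your own.

```cpp
// Untested sketch (Windows): launches one listen server and three local
// clients, pinning each instance to a GPU via Unreal's -graphicsadapter switch.
// "MyGame.exe", "ThirdPersonExampleMap", the resolutions, and the adapter
// indices are placeholders.
#include <cstdlib>
#include <string>

int main()
{
    const std::string exe = "MyGame.exe";            // packaged build (placeholder)
    const std::string map = "ThirdPersonExampleMap"; // placeholder map name

    // Listen server on the primary GPU at full quality.
    std::system(("start \"\" " + exe + " " + map +
                 "?listen -windowed -ResX=1920 -ResY=1080 -graphicsadapter=0").c_str());

    // Local clients open 127.0.0.1, so the traffic never leaves the machine.
    for (int i = 1; i <= 3; ++i)
    {
        const std::string cmd =
            "start \"\" " + exe + " 127.0.0.1 -windowed -ResX=1280 -ResY=720"
            " -WinX=" + std::to_string((i - 1) * 1280) +
            " -graphicsadapter=" + std::to_string(1 + (i - 1) / 2); // adapter layout is a guess
        std::system(cmd.c_str());
    }
    return 0;
}
```

Since the clients just connect to 127.0.0.1, the "networking" is all loopback on the one machine.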

@darthviper107 thanks so much! This really helps!

However, how would you assign different displays to multiple GPUs? Is there a link or guide I could get started with?
In this case, do I need an SDI desktop? I am thinking of using a 2080 Ti as the main graphics card to render a live view, and a few other cards to display up to eight player cameras within the world at much lower graphics quality (similar to AAA mobile-game quality, so a GTX 970 could probably run two screens). So is a traditional PC recommended, or is there a different route to building this kind of machine?

This is a commercial project, but I am still trying to stay within budget while staying away from multiplayer code.

Also, a few additional questions to clear up my nDisplay confusion:

1. What is the reason for saying that nDisplay is not the optimal solution? Is it because the "one image" you are referring to has to stick with one transform within the level viewport? So nDisplay is only used for displaying a 360-degree view?

2. When I looked into nDisplay, I stumbled upon Cluster Input with VRPN. How is that different from synchronizing input with multiplayer code or plugins such as Photon networking? Should I use Cluster Input to handle LAN multiplayer inputs?

3. Based on the Cluster Events and Cluster Input pages, it seems like the nDisplay system still broadcasts certain messages. Does that mean that if I build a game running on nDisplay, I have to change my logic and build a cluster event wrapper around all my single-player events (see my sketch below)?
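
To make question 3 concrete, here is roughly what I imagine such a wrapper would look like, based on the UE4-era nDisplay C++ API as far as I can tell from the docs (IDisplayCluster, IDisplayClusterClusterManager::EmitClusterEvent, FDisplayClusterClusterEvent). The header paths, the "door opened" example, and the exact signatures are my guesses and may differ between engine versions.

```cpp
// Rough guess at the "wrap every gameplay action in a cluster event" pattern.
// API names and headers are assumptions taken from the UE4 nDisplay docs and
// may not match your engine version exactly.
#include "IDisplayCluster.h"
#include "Cluster/IDisplayClusterClusterManager.h"
#include "Cluster/DisplayClusterClusterEvent.h"

// Hypothetical gameplay action: instead of just opening the door locally,
// the master node broadcasts an event so every node replays it.
void EmitDoorOpened(const FString& DoorId)
{
    if (IDisplayClusterClusterManager* ClusterMgr = IDisplayCluster::Get().GetClusterMgr())
    {
        FDisplayClusterClusterEvent Event;
        Event.Category = TEXT("Gameplay");
        Event.Type     = TEXT("Door");
        Event.Name     = TEXT("Opened");
        Event.Parameters.Add(TEXT("DoorId"), DoorId);

        // MasterOnly = true: only the master actually emits; clients ignore the call.
        ClusterMgr->EmitClusterEvent(Event, /*MasterOnly=*/true);
    }
}
```

Every node would then also need some kind of listener for these events and react to them locally, which is exactly why this starts to feel like writing a basic multiplayer game again.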

Sorry for the confusion. There is only a documentation page to look at, and from what I can gather it feels like building a basic multiplayer game again.

Thank you so much, sir!

Clearing up the confusion… I get what you mean: so basically it's still deploying a multiplayer game across different machines…

Isn't this solvable with nDisplay, though?

Shouldn't the Scene Configuration and Camera Configuration be able to pull this off?

Still hoping someone with experience in scene + camera config can help.