Multicamera with nDisplay and PFM files

Hello there!
I have been struggling to find resources on how to match the camera frustum to the PFM files of my LED walls.
It seems the SIGGRAPH talk only touches on it briefly.

I got it working with a simple projection setup, but not with EasyBlend.

Is there some kind of documentation I'm missing?

Best,
Antoine

Hey prsynth! I’m sorry to hear you’re having trouble. I’ve reached out to some colleagues and I’m just waiting on them to get back to me so we can address your question. Thanks for hanging in there and I hope to get back to you soon.

I believe the typical LED wall configuration uses either the “mesh” or “picp_mesh” projection policy in the nDisplay config. You basically create a mesh with the size and dimensions of the LED wall, and add that to your level. Then, you use a function from the nDisplay Blueprint API, called Assign Warp Mesh To Viewport, to tell nDisplay what geometry it should use for each viewport in the config file.
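For context, a stripped-down text config using that policy looks something like this (typed from memory rather than copied from a working project, so treat the exact sections and fields as approximate, and the IDs as placeholders):

```
[cluster_node] id=node_1 addr=127.0.0.1 window=wnd_1 master=true
[window]       id=wnd_1 viewports=vp_1 fullscreen=true
[viewport]     id=vp_1 x=0 y=0 width=1920 height=1080 projection=proj_mesh
[projection]   id=proj_mesh type=mesh
```

At runtime you would then call Assign Warp Mesh To Viewport with the viewport ID (vp_1 here) and the Static Mesh Component that represents the wall.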
However, LED wall setup is a hugely complex topic! We’re working on a complete walkthrough that will come with a sample project and detailed step-by-step instructions to get it running. Right now it’s 102 pages and still incomplete! I apologize for the inconvenience and delays, but we are working on it and something should be released in the coming weeks.

Hey,

Firstly, I'm glad to read that Epic is still taking this seriously and working on improvements. :slight_smile: (gogogo)

A good starting point for me was this video by lastpixelstudios: https://www.youtube.com/watch?v=gec9idRXpxo
But you need the basics from Ben Kidd’s introduction here: https://www.youtube.com/channel/UCh_…1D3WJeFS1FPuhw
And then also download an example scene by Vitalii Boiko here: https://drive.google.com/file/d/1QZQ…RhlzgAVN9/view

With the three above I was able to get everything working for my needs. I did change some things around and implemented a few more features, like adjusting the tracker offset at runtime.
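To give an idea of what that runtime tracker offset boils down to, here is a rough C++ sketch of the concept. My actual implementation is a Blueprint, and the class and function names below are invented for illustration, not an official nDisplay API:

```cpp
// Rough sketch of a runtime tracker offset, not the actual Blueprint
// from this post; the class and function names are invented.
#pragma once

#include "CoreMinimal.h"
#include "Components/SceneComponent.h"
#include "TrackerOffsetComponent.generated.h"

UCLASS(ClassGroup=(Custom), meta=(BlueprintSpawnableComponent))
class UTrackerOffsetComponent : public USceneComponent
{
    GENERATED_BODY()

public:
    // Call at runtime (from a UMG panel, key binding, etc.) to nudge the
    // virtual camera until it lines up with the physical one.
    UFUNCTION(BlueprintCallable, Category = "Tracking")
    void SetTrackerOffset(FVector LocationOffset, FRotator RotationOffset)
    {
        SetRelativeLocation(LocationOffset);
        SetRelativeRotation(RotationOffset);
    }
};
```

Parent this component under the raw tracked component and attach the cine camera to it; the offset then becomes a calibration knob you can tweak while shooting.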

I think Epic really needs to look into some of the implementation, as it is quite a time-consuming headache to get some things working.
The need for that nDisplay cfg file alone is an absolute pain and should be handled by a simple built-in feature. Easier access to location data from sensors would also be nice (Vive without Steam, Ncam, Intel, and any other, all through one standard).
Also, the need for the external nDisplay launcher program is a weird one. For a server it's OK, but for an on-site machine it's painful. A "play in nDisplay" mode with all the settings configured in the editor, that would be a dream.

Hope the above helps. If I can assist with a Blueprint, let me know.

Assuming you have gone through the above links and projects, I'm attaching my Blueprints here; maybe that helps a bit more in the process and saves someone else some grey hairs.

In the graphs above, Start Pos and End Pos drive movement on a key press, to simulate simple camera-move scenes.
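For anyone who can't make out the screenshots, the key-press move corresponds roughly to the C++ below. I'm approximating whatever interpolation the graph uses with a plain lerp in Tick, and apart from StartPos and EndPos every name is made up:

```cpp
// Rough C++ equivalent of the "move between Start and End Pos on key
// press" Blueprint; illustrative only.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "SimpleDollyActor.generated.h"

UCLASS()
class ASimpleDollyActor : public AActor
{
    GENERATED_BODY()

public:
    ASimpleDollyActor() { PrimaryActorTick.bCanEverTick = true; }

    UPROPERTY(EditAnywhere, Category = "Dolly") FVector StartPos = FVector::ZeroVector;
    UPROPERTY(EditAnywhere, Category = "Dolly") FVector EndPos = FVector::ZeroVector;
    UPROPERTY(EditAnywhere, Category = "Dolly") float MoveDuration = 5.f;

    // Bind this to a key press (e.g. in the level Blueprint) to trigger
    // the move from StartPos to EndPos.
    UFUNCTION(BlueprintCallable, Category = "Dolly")
    void StartMove() { Elapsed = 0.f; bMoving = true; }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);
        if (!bMoving) return;

        Elapsed += DeltaSeconds;
        const float Alpha = FMath::Clamp(Elapsed / MoveDuration, 0.f, 1.f);
        SetActorLocation(FMath::Lerp(StartPos, EndPos, Alpha));
        if (Alpha >= 1.f) bMoving = false; // arrived at EndPos
    }

private:
    float Elapsed = 0.f;
    bool bMoving = false;
};
```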

…hope this helps.

We finally have a sample project and a getting-started doc to help with setting up In-Camera VFX!
Blog post here, with links to everything: https://www.unrealengine.com/en-US/b…ual-production

Hey,

Thanks so much for the heads-up. The new doc and demo scene are fantastic.

The only thing I got a bit stuck on was the web remote in the demo scene, which has no in-depth guide, but that is off topic for this thread.

Hello, thank you for sharing your Blueprints. For three days I have been trying, in vain, to set up nDisplay for three screens, using two projectors and one monitor. I created one static mesh and copied it for each viewport. The preview (rtt_inner) is displayed curved and warped onto one of the viewports, but unfortunately the images come out mirrored.

I am using an HTC Vive controller, so apparently I have something set up wrong: after creating a new room setup, everything flipped around, and the rtt_inner now sits on the other side of the controller. I am using the scenes from Vitalii Boiko. I apologize for the text; I am writing through Google Translate. I would like to make a sci-fi short film, and I am doing everything at home.