This is an idea that just popped into my head.
If I were to run a particle system projected onto the stage, say snow or a waterfall, could it be affected by a trigger (a light sensor or a motion sensor, for example) activated by an actor on stage, in real time? Think of it as MIDI triggers/events. The black background in Cascade/Niagara would be handled by some alpha/green-screen pipeline, I guess.
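For what it's worth, the routing half of the idea (sensor or MIDI event in, effect cue out) is easy to sketch outside the engine. Below is a minimal, hypothetical Python example; the note numbers, cue names, and handlers are all made up for illustration, and inside UE4 the cue would instead activate a Niagara component (e.g. via a Blueprint fed by a MIDI or OSC input):

```python
# Minimal sketch of a trigger-to-cue router (hypothetical; not a UE4 API).
# Incoming events (from a MIDI pedal, light sensor, motion sensor, etc.)
# are mapped to named effect cues that the projection engine would fire.

from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class CueRouter:
    # Map from trigger id (e.g. a MIDI note number) to cue handlers.
    bindings: Dict[int, List[Callable[[], str]]] = field(default_factory=dict)

    def bind(self, trigger_id: int, handler: Callable[[], str]) -> None:
        """Register a cue handler for a given trigger id."""
        self.bindings.setdefault(trigger_id, []).append(handler)

    def on_event(self, trigger_id: int) -> List[str]:
        """Fire all cues bound to this trigger; return the cues fired."""
        return [handler() for handler in self.bindings.get(trigger_id, [])]

# Hypothetical cues: in practice each handler would tell the engine to
# activate a particle system instead of returning a name.
router = CueRouter()
router.bind(60, lambda: "snowfall_on")       # foot switch -> snow
router.bind(61, lambda: "volcano_eruption")  # drummer's pedal -> eruption

print(router.on_event(60))  # ['snowfall_on']
print(router.on_event(99))  # unknown trigger -> []
```

The engine-side half, actually activating the effect with low latency, is where the real work would be.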
I used to run sound in theatre, and cues can really ruin the entire thing if they aren't in natural sync.
This could also be explosions, fire, entire environments, natural disasters, or heaven, at the flick of a switch. Alice in Wonderland could really fall down that rabbit hole on stage, with the effect rotated on its side as if seen from above.
And imagine the earning possibilities for us hobbyists, e.g. rigging up a haunted house at your local town festival and getting paid to use UE4 to make something cool. We did that using darkness only; the kids scared each other, and some got traumatized. So there is definitely potential for projecting things in everything from theatre to a circus or a concert. It would be great for tiny venues to layer fake scenery on top of their actual structure, in the right proportions. The drummer could trigger a volcanic eruption with his foot.
Perhaps this might be great when shooting film too, with everything rendered in sync from day one?
It looks like I will be doing assistant work at our local film/TV/VFX hub, and there is at least one company there that could use this in its huge outdoor projections.
Therefore, I would love to know whether this is currently possible, or what would have to be done to make it so.
Kind regards and thank you for reading.