Is there a way to use the sequencer output as a texture in realtime?

Use a Scene Capture Component instead of the camera you are using in the sequencer. It works just like a standard camera, but it outputs to a Render Target, which can be used like any standard texture and is updated every tick to show the camera's view.
So it can be used both in UMG and inside a material applied to something like a TV.
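Here is a minimal C++ sketch of that setup (you can do the same thing in Blueprints). The actor class `ACaptureRig`, the property names, and the render target size are all illustrative, not an existing API:

```cpp
// CaptureRig.h - a hypothetical actor that captures the scene into a
// render target every frame.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/SceneCaptureComponent2D.h"
#include "Engine/TextureRenderTarget2D.h"
#include "CaptureRig.generated.h"

UCLASS()
class ACaptureRig : public AActor
{
    GENERATED_BODY()

public:
    ACaptureRig()
    {
        PrimaryActorTick.bCanEverTick = true;

        // The scene capture renders the world from its own transform.
        Capture = CreateDefaultSubobject<USceneCaptureComponent2D>(TEXT("SceneCapture"));
        RootComponent = Capture;
        Capture->bCaptureEveryFrame = true;         // refresh the texture every tick
        Capture->CaptureSource = SCS_FinalColorLDR; // capture post-processed color
    }

    virtual void BeginPlay() override
    {
        Super::BeginPlay();

        // Create a render target at runtime; assigning one authored in the
        // Content Browser works just as well.
        UTextureRenderTarget2D* RT = NewObject<UTextureRenderTarget2D>(this);
        RT->InitAutoFormat(1280, 720);
        Capture->TextureTarget = RT;
        // RT can now be plugged into a material's texture parameter or a
        // UMG Image brush like any other texture.
    }

    virtual void Tick(float DeltaSeconds) override;

    // The sequencer camera this rig should shadow (set it in the editor).
    UPROPERTY(EditAnywhere)
    AActor* CameraToFollow = nullptr;

    UPROPERTY(VisibleAnywhere)
    USceneCaptureComponent2D* Capture = nullptr;
};
```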


I don’t think Sequencer will treat it as a camera though, so you might have to attach it to the camera used in the sequence, or build something in Blueprints that makes the scene capture component follow your camera :slight_smile:
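If you’d rather do the follow logic in C++, here is a sketch continuing the hypothetical `ACaptureRig` actor above. `CameraToFollow` is the illustrative property you would point at the Cine Camera Actor driven by your Level Sequence:

```cpp
void ACaptureRig::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    // Copy the sequencer camera's transform onto the capture every frame,
    // so the render target shows exactly what the sequence camera sees.
    // Attaching once in BeginPlay with AttachToComponent() would also work.
    if (CameraToFollow)
    {
        Capture->SetWorldTransform(CameraToFollow->GetActorTransform());
    }
}
```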