Long-time reader; first-time poster. I’ve built a Blueprint that streams a third-person view of a level into [TouchDesigner][1] using the [Spout][2] framework. [The plugin I’m using][3] can send several textures via Spout.
I need more data from the level than can fit in a single render target, so the Blueprint contains two Scene Capture 2D components and creates two render targets at runtime to render into and send through the Spout plugin.
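My actual setup is pure Blueprint, but here’s roughly what it does, sketched in C++ (the class, component, and member names are hypothetical, and the resolution/format are placeholders):

```cpp
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/SceneCaptureComponent2D.h"
#include "Engine/TextureRenderTarget2D.h"
#include "Kismet/KismetRenderingLibrary.h"
#include "CaptureRig.generated.h"

UCLASS()
class ACaptureRig : public AActor
{
    GENERATED_BODY()

public:
    ACaptureRig()
    {
        PrimaryActorTick.bCanEverTick = true;

        RootComponent = CreateDefaultSubobject<USceneComponent>(TEXT("Root"));
        BeautyCaptureComponent = CreateDefaultSubobject<USceneCaptureComponent2D>(TEXT("BeautyCapture"));
        DataCaptureComponent = CreateDefaultSubobject<USceneCaptureComponent2D>(TEXT("DataCapture"));
        BeautyCaptureComponent->SetupAttachment(RootComponent);
        DataCaptureComponent->SetupAttachment(RootComponent);
    }

    virtual void BeginPlay() override
    {
        Super::BeginPlay();

        // Create the two render targets at runtime.
        BeautyTarget = UKismetRenderingLibrary::CreateRenderTarget2D(this, 1920, 1080, RTF_RGBA8);
        DataTarget = UKismetRenderingLibrary::CreateRenderTarget2D(this, 1920, 1080, RTF_RGBA8);

        // Wire each capture component to its own target.
        BeautyCaptureComponent->TextureTarget = BeautyTarget;
        DataCaptureComponent->TextureTarget = DataTarget;

        // On-demand capture: Tick() controls when both components render.
        BeautyCaptureComponent->bCaptureEveryFrame = false;
        DataCaptureComponent->bCaptureEveryFrame = false;
    }

    virtual void Tick(float DeltaSeconds) override;

    UPROPERTY(VisibleAnywhere)
    USceneCaptureComponent2D* BeautyCaptureComponent;

    UPROPERTY(VisibleAnywhere)
    USceneCaptureComponent2D* DataCaptureComponent;

    UPROPERTY()
    UTextureRenderTarget2D* BeautyTarget;

    UPROPERTY()
    UTextureRenderTarget2D* DataTarget;
};
```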
As you can see from the images above, the two capture components are updated on every tick (the event node is pasted in from much earlier in the chain for reference) and their render targets are piped into two Spout Senders.
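Continuing the hypothetical `ACaptureRig` sketch, the per-tick update would look something like this. I’m assuming explicit capture calls with `bCaptureEveryFrame` disabled, which is one way to request both captures within the same game tick:

```cpp
// Mirrors the Blueprint's Event Tick chain: both captures are requested
// back-to-back from the same game-thread frame.
void ACaptureRig::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    BeautyCaptureComponent->CaptureScene();
    DataCaptureComponent->CaptureScene();

    // The two Spout senders then publish BeautyTarget and DataTarget.
}
```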
The “DataCaptureComponent” has a blendable post-process material that renders two mattes from the scene, which will be used for compositing in TouchDesigner.
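In C++ terms, that matte setup would look roughly like this (`SetupMattes` is a hypothetical helper called from `BeginPlay`, and the material path is a placeholder; in my actual project the blendable is assigned in the Blueprint):

```cpp
#include "Materials/MaterialInterface.h"

void ACaptureRig::SetupMattes()
{
    // Post-process blendables only apply when the capture renders final
    // color, so I'm assuming a final-color capture source here.
    DataCaptureComponent->CaptureSource = ESceneCaptureSource::SCS_FinalColorLDR;

    UMaterialInterface* MatteMaterial = LoadObject<UMaterialInterface>(
        nullptr, TEXT("/Game/Materials/M_SceneMattes.M_SceneMattes"));
    if (MatteMaterial)
    {
        // Weight 1.0 applies the blendable at full strength.
        DataCaptureComponent->PostProcessSettings.AddBlendable(MatteMaterial, 1.0f);
    }
}
```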
That’s the backstory; now here’s the problem: the textures are slightly out of sync in TouchDesigner. Somewhere, latency is being introduced, and an offset appears between the beauty render target and the data render target. This doesn’t entirely surprise me, but I’m hoping to narrow down the list of suspects. Should two render targets, each updated from a different scene capture component on every tick, show the exact same moment in time, or is an offset expected? The offset appears to be less than a frame and is only noticeable on fast-moving objects; here’s [a sample video][7]. Am I going about this wrong? Is there a better way?!
Thanks!