Hi everyone,
I’ve got an issue with render targets, and at this point I’m not sure whether the problem is in my logic or something deeper in the order the engine renders texture targets. Let me explain:
I am making a utility to generate spritesheets at runtime, from a camera capturing some sort of chroma boxes. Here is how it is supposed to work:
- There are two render targets: one used to render the scene in the chroma box (RT_CaptureCam), and one used to build the spritesheet (RT_SaveTextureToBuffer).
- The material that holds the spritesheet logic scales RT_CaptureCam, masks the correct sprite index, and then combines it with the RT_SaveTextureToBuffer RT. Here is the last bit (the sprite index logic works fine):
- The Blueprint running this setup also periodically calls (at an interval determined by capture resolution and duration) a function that updates the material, then updates the RT_SaveTextureToBuffer RT with a draw-material-to-render-target pass of that material:
- My expectation is that if the material combines RT_CaptureCam with the buffer, and the buffer is then updated with the result (B = A + B), the sprites should accumulate over time.
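For reference, here is a minimal CPU-side sketch of the accumulation I am expecting. This is plain NumPy, not engine code, and all sizes and names (cell size, grid size, `write_sprite`) are made up for illustration; it just shows the B = A + B behavior I described:

```python
import numpy as np

# Hypothetical 4x4 sheet of 8x8-pixel cells (32x32 spritesheet).
CELL, GRID = 8, 4
SHEET = CELL * GRID

def write_sprite(buffer, capture, index):
    """Emulates one material pass: mask the capture into cell `index`
    of the sheet and combine it with the existing buffer (B = A + B)."""
    row, col = divmod(index, GRID)
    out = buffer.copy()  # work on a copy so the source stays intact
    out[row*CELL:(row+1)*CELL, col*CELL:(col+1)*CELL] += capture
    return out

buffer = np.zeros((SHEET, SHEET), dtype=np.float32)
for i in range(GRID * GRID):
    # Stand-in for one RT_CaptureCam frame; each frame gets a unique value.
    capture = np.full((CELL, CELL), float(i + 1), dtype=np.float32)
    buffer = write_sprite(buffer, capture, i)

# Every cell keeps its own frame: earlier writes are never lost.
assert buffer[0, 0] == 1.0 and buffer[-1, -1] == 16.0
```

That is the behavior I want from the Blueprint loop: each periodic update writes one new cell while leaving every previously written cell untouched.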
However, when I run it, this is what I see:
(The plane on the left has the material shown in the shot above; the one on the right only displays the updated buffer.)
Any ideas? Like I said at the beginning of the post, I think my logic is solid, but I might have been looking at this for too long.
Much appreciated!