So I am trying to use a UCanvasRenderTarget2D to draw some images into a texture. Unfortunately, every time I draw something into it via the FOnCanvasRenderTargetUpdate delegate, the stuff I’ve previously drawn gets cleared. A bit of poking through the source code showed that the UpdateResource() override inside of UCanvasRenderTarget2D calls UpdateResourceImmediate(), which at some point got a bClearRenderTarget parameter that defaults to true.
Alright, no big deal then: I added a bClearOnUpdate bool to UCanvasRenderTarget2D, fed it into UpdateResourceImmediate(), and compiled, only to find that my data still gets cleared. I traced the execution chain all the way to FTextureRenderTarget2DResource::UpdateDeferredResource, where bClearRenderTarget is correctly set to false and the “clear the render target surface to green” block is skipped. But somewhere past the CopyToResolveTarget call, the data still gets cleared.
Hi Damir,
Sorry for the delay.
CopyToResolveTarget doesn’t really have the capability to clear anything. It could overwrite the target texture if the source and target were different, but from the looks of FTextureRenderTarget2DResource::InitDynamicRHI they should be the same.
Can you confirm that in the CopyToResolveTarget call in question the source and dest are the same? If they are, then the problem is elsewhere: some other rogue clear or draw coming in before or after.
Are you specifically seeing it cleared to green? A breakpoint in RHIClear or SetRenderTargetsAndClear around the problem area should help narrow it down.
Hey Marcus,
The behavior is a bit weird. I’ve set it up in such a way that I assign an event to the OnResourceUpdate delegate (or whatever it is called, I am not in the office right now) and update the resource every 1 second. Via the passed-in canvas, I do a DrawTexture using a small square at a random point on the render target. The result should be those small textures accumulating, but I only ever see the last one drawn; the previous ones get cleared. It only gets cleared to green if I set the blend mode to additive, and grey otherwise.
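For reference, the repro setup described above looks roughly like this. This is a hypothetical sketch (the actor class, member names, and texture are assumed, not from the original post); the delegate in question is UCanvasRenderTarget2D::OnCanvasRenderTargetUpdate:

```cpp
// Hypothetical repro sketch (assumed AMyActor class with RenderTarget,
// RepaintTimer, and SquareTexture members declared in the header).
void AMyActor::BeginPlay()
{
	Super::BeginPlay();

	RenderTarget = UCanvasRenderTarget2D::CreateCanvasRenderTarget2D(
		this, UCanvasRenderTarget2D::StaticClass(), 512, 512);
	RenderTarget->OnCanvasRenderTargetUpdate.AddDynamic(this, &AMyActor::DrawSquare);

	// Repaint the render target once per second.
	GetWorldTimerManager().SetTimer(RepaintTimer, this, &AMyActor::Repaint, 1.0f, true);
}

void AMyActor::Repaint()
{
	RenderTarget->UpdateResource(); // fires OnCanvasRenderTargetUpdate
}

void AMyActor::DrawSquare(UCanvas* Canvas, int32 Width, int32 Height)
{
	const FVector2D Pos(FMath::FRandRange(0.f, Width - 16.f),
	                    FMath::FRandRange(0.f, Height - 16.f));
	// Expectation: squares accumulate across updates.
	// Observed: only the last square survives each update.
	Canvas->K2_DrawTexture(SquareTexture, Pos, FVector2D(16.f, 16.f),
		FVector2D::ZeroVector, FVector2D::UnitVector,
		FLinearColor::White, BLEND_Translucent, 0.f, FVector2D::ZeroVector);
}
```

This is engine-dependent code and not verbatim from the thread; it only illustrates the accumulate-then-cleared behavior being reported.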
However, if I put a breakpoint at any point in the drawing process and just keep stepping through, I DO see the textures accumulate; when I remove the breakpoint and resume at normal speed, it gets cleared on the next call.
Hope that helps.
Best regards,
Damir H.
Hi Damir,
We are marking this report as Resolved for tracking purposes. If you would like to continue investigating the issue, just post a comment to reopen the report.
Thanks,
TJ
I think the problem you were running into is that UTexture::UpdateResource() automatically “renames” the underlying resource for you.
Since the CPU and GPU are on different timelines, you can think of “renaming” as a new surface being allocated for you on each UpdateResource(), since the previous updates may still be in flight on the GPU.
So in this case “bClearOnUpdate” really means, “will you write all the pixels? If so, you can skip the initial clear.” On update, the whole surface is expected to be refreshed.
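Under that model, the usual pattern is to keep the accumulated draw state on the CPU and repaint the entire surface on every update. A hypothetical sketch (assumed AMyActor class and members; not from the thread):

```cpp
// Hypothetical sketch: since the surface may be renamed (and cleared) on each
// UpdateResource(), keep the accumulated draws in a CPU-side array and replay
// them all every time the repaint delegate fires.
void AMyActor::AddSquare()
{
	Squares.Add(FVector2D(FMath::FRandRange(0.f, 512.f - 16.f),
	                      FMath::FRandRange(0.f, 512.f - 16.f)));
	RenderTarget->UpdateResource(); // triggers a full repaint below
}

void AMyActor::Repaint(UCanvas* Canvas, int32 Width, int32 Height)
{
	// Replay the full draw history so the refreshed surface contains
	// everything drawn so far, not just the newest square.
	for (const FVector2D& Pos : Squares)
	{
		Canvas->K2_DrawTexture(SquareTexture, Pos, FVector2D(16.f, 16.f),
			FVector2D::ZeroVector, FVector2D::UnitVector,
			FLinearColor::White, BLEND_Translucent, 0.f, FVector2D::ZeroVector);
	}
}
```

The trade-off is that redraw cost grows with the history length, but it is robust against the clear-on-update behavior described above.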
If you can describe your end goals we can discuss some alternative approaches.
Wow, that is really odd. Can you try the console commands ‘r.rhicmduseparallelalgorithms 0’ and ‘r.rhicmdusebypass 1’ and see if that changes the behavior?
Hey,
Terribly sorry for the late reply; I’ve been busy left and right with the Reboot Develop conference and haven’t had time to get back to this.
I understand the explanation and that is most likely what is happening. What I am trying to achieve is to incrementally write stuff to a render target, i.e. draw something, then 5 seconds later, draw something to the same render target, retaining the results of the previous draw. I have found a workaround for the situation I opened the post for, but it would still be a good thing to know about.
Best regards,
Damir H.
Hi Damir,
Glad you found a workaround. Our plan for the future is to phase out the Canvas system. We intend to write a more generic system that lets you drive render-target rendering, better integrated into Blueprint and the material graph.
We’re hoping to have a prototype in the next month or two. I can keep you up to date on its progress.
Can you explain your workaround? Do you copy the results into another surface? Thanks.
That would be awesome, thanks a lot. Looking forward to the new system.
What I do now is spawn particles at set locations high up above my game and have a scene capture, set to capture only particles, grab those and put them into a render target. I am running into a few issues though, mostly the fact that scene captures are very slow even with only particles (and translucency) turned on and a very small max render distance. Curiously enough, the performance is much better in the editor than in a packaged build…
Will the new system allow for things like that? I use the setup above to e.g. spawn wave particles that slowly grow and fade out, creating small waves in bodies of water as the player walks through them.
Is there any news on this prototype?
The replacement system can be driven via BP as well as C++; for an overview, see the BP node DrawMaterialToRenderTarget. This is live in Dev-Rendering and will be released in 4.13.
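For C++ users, the same functionality is exposed through UKismetRenderingLibrary. A hypothetical usage sketch (the actor and member names are assumed; the library call itself ships with 4.13):

```cpp
#include "Kismet/KismetRenderingLibrary.h"

// Hypothetical sketch: draw a material into a render target from C++.
// Unlike the Canvas update path discussed above, this call does not implicitly
// clear the target first, so repeated calls can build up results over time
// (depending on the material's blend mode).
void AMyActor::StampMaterial()
{
	// RenderTarget is a UTextureRenderTarget2D* and StampMaterial a
	// UMaterialInterface*, both set up elsewhere (assumed members).
	UKismetRenderingLibrary::DrawMaterialToRenderTarget(this, RenderTarget, StampMID);
}
```

This renders the material across the full target, so incremental placement (e.g. a small square at a random point) is done inside the material itself, via parameters.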