If you need it at runtime, that’s not the way to go. Instead, read the pixels from the render target using GameThread_GetRenderTargetResource and ReadPixels. Once you have a TArray<FColor>, you can create a UTexture2DDynamic, which is the correct way to create textures at runtime.
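For reference, the read-back step looks roughly like this (a sketch; the function and variable names are illustrative, and it assumes a valid UTextureRenderTarget2D*):

```cpp
#include "Engine/TextureRenderTarget2D.h"
#include "TextureResource.h"

// Sketch: read a render target's pixels back on the game thread.
bool ReadRenderTargetPixels(UTextureRenderTarget2D* RenderTarget, TArray<FColor>& OutPixels)
{
    FTextureRenderTargetResource* Resource = RenderTarget->GameThread_GetRenderTargetResource();
    if (!Resource)
    {
        return false;
    }

    // ReadPixels blocks the game thread until the GPU has finished rendering,
    // so avoid calling it every frame.
    return Resource->ReadPixels(OutPixels);
}
```

Note that the read-back is a GPU sync point, which is why it is better suited to one-off captures than per-frame use.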
Sorry for reviving an old thread, but I’ve been seeing this same explanation all day, and they all leave one important bit out: how to actually copy the pixels to the UTexture2D/UTexture2DDynamic. The constructors for them do not take a TArray<FColor>, nor do any of their member functions. Below is the function I’m trying to implement, but I don’t know what to fill in in place of the comment.
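For later readers: the usual way to fill that gap is to create a transient texture and copy the color data into its first mip. A sketch, assuming the pixel array matches the given width and height (the function name is illustrative; on UE4, `PlatformData` is a direct member rather than accessed via `GetPlatformData()`):

```cpp
#include "Engine/Texture2D.h"

// Sketch: copy a TArray<FColor> into a new transient UTexture2D.
// PF_B8G8R8A8 matches FColor's in-memory channel order.
UTexture2D* CreateTextureFromPixels(const TArray<FColor>& Pixels, int32 Width, int32 Height)
{
    UTexture2D* Texture = UTexture2D::CreateTransient(Width, Height, PF_B8G8R8A8);
    if (!Texture)
    {
        return nullptr;
    }

    // Lock the top mip, memcpy the pixels in, then unlock and update.
    void* MipData = Texture->GetPlatformData()->Mips[0].BulkData.Lock(LOCK_READ_WRITE);
    FMemory::Memcpy(MipData, Pixels.GetData(), Pixels.Num() * sizeof(FColor));
    Texture->GetPlatformData()->Mips[0].BulkData.Unlock();
    Texture->UpdateResource();

    return Texture;
}
```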
Unfortunately, that does not seem to work for me, perhaps because the RenderTarget is not square; it matches the screen resolution. The Photo texture remains a blank white texture.
EDIT: It doesn’t work with the screen resolution set to 1080x1080 either, so I’m not sure what the issue is.
For now, I’ve decided to create a new TextureRenderTarget2D for each image capture and display them to the player via a dynamic material instance on the image widget. I’m not sure what the difference in overhead is between Texture2D and TextureRenderTarget2D, but if I ever end up needing to convert again, I’ll keep your solution in mind!
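The workaround described above can be sketched like this (assuming a UMG Image widget and a base material with a texture parameter; the parameter name "Tex" and the function name are illustrative):

```cpp
#include "Components/Image.h"
#include "Materials/MaterialInstanceDynamic.h"
#include "Engine/TextureRenderTarget2D.h"

// Sketch: show a captured render target on a UMG Image widget
// through a dynamic material instance.
void ShowCapture(UImage* PhotoImage, UMaterialInterface* BaseMaterial, UTextureRenderTarget2D* Capture)
{
    UMaterialInstanceDynamic* MID = UMaterialInstanceDynamic::Create(BaseMaterial, nullptr);
    MID->SetTextureParameterValue(TEXT("Tex"), Capture);
    PhotoImage->SetBrushFromMaterial(MID);
}
```

One trade-off with this approach: each capture keeps a full render target alive in GPU memory, whereas converting to a UTexture2D would let the render target be reused.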