Pretty simple question here: the TextureSample node takes a texture and outputs the associated RGBA channels. Is there a node that does the reverse, taking RGBA inputs and outputting a texture?
I’m sure you can imagine the utility of this — it would let me, for instance, take three images I’ve processed to my liking and turn them into a normal map.
It seems odd that there isn’t a node for this, since it would essentially give you the ability to decompose channels, alter them individually, and then recompose them, which would surely be very useful.
In Blueprints, you can use the DrawMaterialToRenderTarget node. Just keep in mind that some inputs (e.g. AbsoluteWorldPosition, VertexNormalWS, etc.) don’t exist when drawing to a render target, and UV unwrapping is a much more complicated subject.

If you draw to a RenderTarget asset, you can sample it in a material just as you would any texture asset. If you draw to a RenderTarget2D or CanvasRenderTarget2D created at runtime, you can use a TextureSampleParameter2D or a TextureObjectParameter in your final material. Then, in Blueprints, call CreateDynamicMaterialInstance with your final material as the parent, and use SetTextureParameterValue with the RenderTarget you drew to as the value.
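If you’d rather do the runtime path in C++ instead of Blueprints, the same steps map onto UKismetRenderingLibrary and UMaterialInstanceDynamic. This is only a rough sketch: the function names are the real engine APIs, but the material inputs, the "PackedTexture" parameter name, and the render target size/format are placeholders for whatever your own setup uses.

```cpp
#include "Kismet/KismetRenderingLibrary.h"
#include "Engine/TextureRenderTarget2D.h"
#include "Materials/MaterialInstanceDynamic.h"
#include "Components/PrimitiveComponent.h"

void BuildPackedTexture(UObject* WorldContextObject,
                        UMaterialInterface* PackingMaterial,  // material that outputs your recomposed RGBA
                        UMaterialInterface* FinalMaterial,    // material containing a TextureSampleParameter2D
                        UPrimitiveComponent* TargetComponent)
{
	// 1. Create a runtime render target (size/format are up to you).
	UTextureRenderTarget2D* RenderTarget =
		UKismetRenderingLibrary::CreateRenderTarget2D(
			WorldContextObject, 1024, 1024, RTF_RGBA16f);

	// 2. Draw the packing material into it. Remember that world-space inputs
	//    (AbsoluteWorldPosition, VertexNormalWS, ...) are not available here.
	UKismetRenderingLibrary::DrawMaterialToRenderTarget(
		WorldContextObject, RenderTarget, PackingMaterial);

	// 3. Create a dynamic instance of the final material and feed it the
	//    render target through the texture parameter ("PackedTexture" is a placeholder name).
	UMaterialInstanceDynamic* MID =
		UMaterialInstanceDynamic::Create(FinalMaterial, WorldContextObject);
	MID->SetTextureParameterValue(TEXT("PackedTexture"), RenderTarget);

	// 4. Apply the dynamic instance wherever you would normally assign the material.
	if (TargetComponent)
	{
		TargetComponent->SetMaterial(0, MID);
	}
}
```

The Blueprint nodes mentioned above (DrawMaterialToRenderTarget, CreateDynamicMaterialInstance, SetTextureParameterValue) are thin wrappers around these same calls, so the flow is identical either way.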