Drawing to a texture.

TL;DR: I want to be able to draw (by draw I mean write text) to a texture and turn it into a material for a mesh.

The long version:
I’ve done a lot of searching for how to do this, and it occurs to me that, for something that appears to be such a basic task, there is a massive lack of information on the subject – as if I’ve found a limitation of Unreal Engine. Though I’m at odds with that idea: it seems implausible that such a technically advanced engine as Unreal Engine 5.3.1 wouldn’t implement something so basic in concept.

I’m trying to create a box of disks that you can flip through. My options are as follows:

A) Build all the disk label textures ahead of time, import thousands of textures, and convert them to materials. Very likely there isn’t a GPU on the planet that can hold that massive number of textures.

[the path I’ve somewhat followed currently]
B) Build an external application that generates the disk label textures, so only 3 disk label textures need to be rendered at any one time.

C) Find some way to get Unreal Engine to draw text to a texture. This would be the ideal way to do it because it doesn’t require an external application (B), and it doesn’t require disk space to buffer the 3 images (also B). It would just reference a data table and update the textures depending on which disk you’ve flipped to.
[[ however I’ve not found a way to draw text to a texture ]]

– for those that don’t know what a disk box is because you’re far too young:

here is a picture of one.

Hopefully you can use your imagination to figure out how you’d flip through each disk and why only 3 ever need to be visible at any one time. In the image above, the red disk always needs to be visible because it’s the frontmost disk; as you flip through, it will always be the frontmost. The disk you’re on becomes visible as the disks in front of it are pushed forward. The third isn’t visible, as it’s behind the one that is; it exists only so that when you flip to it, it doesn’t need to be loaded, because it’s already buffered.

I know this is a necrothread, but I also note you haven’t had any responses yet, so in case you’re still working on this – perhaps with an interim workaround of swapping premade textures – I have some ideas to consider.

Method 1: World space UMG

UMG already has the capability to draw multiline, rich text fields with arbitrary fonts and text effects, colors, etc., and to place those in world space. Could you make a simple planar quad that is a transform child of the front or back face of the disk, then render the text that way, letting Slate do the heavy lifting?
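Roughly, in C++ the widget setup might look like this – a minimal sketch, assuming an ADiskActor class with a FrontFaceComponent, and a WBP label widget class of your own (all placeholder names):

```cpp
// Sketch: attach a world-space UMG widget as a transform child of a disk face.
#include "Components/WidgetComponent.h"

void ADiskActor::AttachLabelWidget(TSubclassOf<UUserWidget> LabelWidgetClass)
{
    UWidgetComponent* Label = NewObject<UWidgetComponent>(this);
    Label->SetupAttachment(FrontFaceComponent);   // transform child of the disk's front face
    Label->SetWidgetSpace(EWidgetSpace::World);   // render in world space, not on screen
    Label->SetWidgetClass(LabelWidgetClass);      // e.g. a WBP containing a RichTextBlock
    Label->SetDrawSize(FVector2D(512.f, 512.f));  // pixel size of the widget's render surface
    Label->RegisterComponent();
}
```

The same setup can be done entirely in Blueprint by adding a Widget component and setting its Space to World.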

If having the widget follow the transform is a problem, an alternate implementation would be to render the UMG widget “offstage” on some arbitrary static mesh, capture it with a Scene Capture Component, then use the SCC’s render target as an input texture to a regular opaque lit Material Instance on the disk. (This could also be a decal MI if you prefer to have the paper label independent of the text rendering.)
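The render-target-to-material step might look roughly like this in C++ – a sketch, assuming your base material exposes a texture parameter (here named “LabelTexture”, which is an assumption) and that DiskMesh is the disk’s static mesh component:

```cpp
// Sketch: feed a Scene Capture Component's render target into a
// Material Instance Dynamic on the disk mesh.
#include "Components/SceneCaptureComponent2D.h"
#include "Materials/MaterialInstanceDynamic.h"

void ADiskActor::ApplyCapturedLabel(USceneCaptureComponent2D* Capture,
                                    UMaterialInterface* BaseMaterial)
{
    UMaterialInstanceDynamic* MID = UMaterialInstanceDynamic::Create(BaseMaterial, this);
    // The SCC writes into its TextureTarget render target each capture;
    // binding it as a texture parameter makes the material update live.
    MID->SetTextureParameterValue(TEXT("LabelTexture"), Capture->TextureTarget);
    DiskMesh->SetMaterial(0, MID);
}
```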

Method 2: Modular premade textures

Render a moderate number of premade paper labels – without the disk identifying text, but with any simulated vendor logos and brand names that would have been pre-printed – to a pair of TextureArray2D assets (one for disk front, one for back). Make a Material and Material Instance(s) that accept an integer parameter to select the array element. Now you have the ability to simulate the purchased empty media from however many manufacturers/brands you prefer.
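Selecting the array element at runtime is then just a couple of parameter writes – a sketch, assuming the base material samples a Texture2DArray parameter (“LabelArray”) indexed by a scalar parameter (“LabelIndex”); both names are assumptions:

```cpp
// Sketch: pick a label out of a Texture2DArray by index via an MID.
#include "Engine/Texture2DArray.h"
#include "Materials/MaterialInstanceDynamic.h"

void ADiskActor::SelectLabel(UMaterialInstanceDynamic* MID,
                             UTexture2DArray* Labels, int32 DiskIndex)
{
    MID->SetTextureParameterValue(TEXT("LabelArray"), Labels);
    // The material's TextureSampleParameter2DArray node reads this as its index.
    MID->SetScalarParameterValue(TEXT("LabelIndex"), static_cast<float>(DiskIndex));
}
```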

Render your text similarly, to separate TextureArray2D assets, but make these one-bit-per-pixel masks so they compress extremely well. If you carefully choose a slightly lossy compression algorithm, you might even be able to turn artifacts into a feature simulating slight variation in the written or printed text from label to label. Even if you use lossless compression, this simple mask should compress so small that hundreds of them can realistically be part of the project.

I don’t know how many textures can be in a T2DArray, so you might have to break them into categories, and have a Blueprint or C++ ActorComponent on the disk Actor that creates a Material Instance Dynamic and then selects which T2DArray to assign to the parameter, depending on which disk that Actor is currently representing. The lookup could be done with a data table, or by naming the T2DArray assets with a sequential suffix and letting the ActorComponent generate and then resolve a soft object reference path at runtime. (This could also support asset streaming if needed.)
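The sequential-suffix lookup might be sketched like this – the “/Game/DiskLabels/TA_Labels_###” path is an assumed naming convention for illustration, not an existing asset:

```cpp
// Sketch: resolve a Texture2DArray category asset from a naming convention.
#include "Engine/Texture2DArray.h"
#include "UObject/SoftObjectPtr.h"

UTexture2DArray* UDiskLabelComponent::LoadLabelCategory(int32 CategoryIndex)
{
    // Full object path: /Path/PackageName.AssetName
    const FString Path = FString::Printf(
        TEXT("/Game/DiskLabels/TA_Labels_%03d.TA_Labels_%03d"),
        CategoryIndex, CategoryIndex);
    TSoftObjectPtr<UTexture2DArray> SoftRef{ FSoftObjectPath(Path) };
    return SoftRef.LoadSynchronous(); // or hand the path to FStreamableManager for async streaming
}
```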

Method 3: 3D Text Actors positioned like decals

There is a plugin from Epic to do world-space 3D text rendering. It would not be the most efficient method, but for prototyping you might save some time by rendering your text that way, keeping a reusable pool of these for the maximum number of visible disk labels, and making one a transform child of the specific disk. I would expect that Text 3D might break if you set its thickness to zero, but it might work with a thickness of 0.001 centimeters. The disks are about 9 cm square, so the dynamic range of sizes should be within what a 32-bit float can handle, yet at reasonable viewing distances 10 micrometers is going to be sub-pixel.

I’ve used world space UMG in a project, and although getting the position to line up where I wanted it was tricky because of the orientation of axes, it worked out okay once I got the math correct. I’ve used Texture2DArray multiple times, though mostly with Landscape materials. The last method, with the 3D text, is not something I’ve needed to do.

If you’re still working on this, or if someone else stumbles upon this thread, I hope one of these suggestions is helpful.

One other possibility, for the future: As I write this (July 2024), UE 5.4 has a new experimental plugin called TextureGraph that allows edit-time or runtime procedural texture generation using a node graph. I haven’t used it yet, but I’ve used similar capabilities in Substance Designer and Blender, so this may be an option in future projects.