How can I get dynamic text in a material expression?

I’m looking for something similar to the Text Render Component, but within a material, so I can send a string to my material and get a text decal as the end product. I’ll look into creating my own custom material node and reply here if I discover anything.

Any assistance in this process would be greatly appreciated.

Cheers!

~Josh

Looking for answer to this also

Hi - I’m also looking for this, to display an updating number on the character’s jacket… Josh - did you have any luck?

I think this is simply not possible, as can be seen in other questions. I guess you are out of luck and should work around it with the Text Render Component. If the texts are known in advance, you could have a texture for each text and use a Texture2D parameter in your material, which you switch whenever you need.

Maybe you can combine this with dynamic textures, written in C++, as seen on the Wiki.

Hi,

Since this is the latest instance of someone asking this oft-occurring question, and I have found a solution, I’ll post it here despite the thread’s age.

Unfortunately, as you may have expected, this solution is not very nice, but it’s the cleanest I could find.

Basically, what you need is a tool called msdfgen (currently located at https://github.com/Chlumsky/msdfgen) or something similar, though this one will more than suffice. Windows binaries are available with the tool, but it is cross-platform (I’ve used it on Linux).

This will be used to generate a signed distance font texture, as used in Team Fortress 2’s font system and in GPU Gems (I forget which volume). Basically, it’s the same thing as the distance fields used by the engine for occlusion and shadows, but in two dimensions, with 3 color channels (in this case), and without the fancy raycasting.

The first thing you need to do is go into the command line and run msdfgen once per character, e.g. ./msdfgen.exe -o 1.png Helvetica.ttf ‘1’, for each character you want to be able to use in a string (for the sake of conserving video memory, I recommend excluding anything you won’t use rather than blindly listing every printable ASCII character here).
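The per-character generation can be scripted rather than typed by hand. Here is a minimal Python sketch of that loop, assuming msdfgen is on disk; the character set, file names, and the invocation form are illustrative (the command shape follows the example given in this thread):

```python
import subprocess

# Illustrative character set -- trim this to what you actually need.
CHARSET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789-"

def msdfgen_cmd(font_path, char, out_path):
    """Build one msdfgen invocation of the form
    ./msdfgen.exe -o <out>.png <font> <char>."""
    return ["./msdfgen.exe", "-o", out_path, font_path, char]

def generate_all(font_path):
    """Run msdfgen once per character; output files are numbered by
    charset index so the atlas stitching order is deterministic."""
    for i, ch in enumerate(CHARSET):
        subprocess.run(msdfgen_cmd(font_path, ch, f"{i}.png"), check=True)
```
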

Now you’ll have a lot of little images; use a tool like ImageMagick to stitch them together into an atlas (convert +append works well for this, or if you really hate yourself you can use Photoshop to tile them manually). Import that as a texture, and set up a TextureSample node in the material editor. Then create a material function.

(NOTE: I am going to list any material node expressions as equivalent HLSL code, it should be pretty straightforward to figure out how to convert it to nodes; alternatively, use an HLSL custom node)

The function cannot take a string as a parameter, because that’s not a parameter type, and Epic has unfortunately chosen not to support array uniforms in their convoluted material editor. So you’ll need to figure out yourself how to send the string to the material. There are two obvious ways of doing this:

  1. If you need a particular string, use a very low resolution texture with mipmapping and filtering disabled. The n’th pixel horizontally will be the ASCII code (remapped to your list of used characters) of the n’th character in the string.
  2. If you just want a random string, use a noise function with min value 0, max value equal to the number of used characters, and index it the same way. Note that this is normally tricky if you want to control how the string is seeded, but it can be useful if for example you don’t care exactly what the text says but want some easy variation, e.g. serial numbers on weapons or runic texts/symbols/hieroglyphs nobody will be able to read.
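As a CPU-side sketch of option 1, here is a hypothetical encoder that turns a string into the per-pixel values you would write into the lookup texture. The charset and max length are illustrative and must match your atlas; in the engine these values would go into a 1 × StringMaxLength texture with mips and filtering disabled:

```python
# Space maps to a blank tile at the end of the charset.
CHARSET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789- "
STRING_MAX_LENGTH = 8  # illustrative upper bound on string length

def encode_string(s):
    """Return one pixel value (charset index) per character slot,
    padding with spaces up to STRING_MAX_LENGTH and truncating
    anything longer."""
    padded = s.upper().ljust(STRING_MAX_LENGTH)[:STRING_MAX_LENGTH]
    return [CHARSET.index(c) for c in padded]
```
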

Either way you decide, you’ll need the following expression to compute the index into the string at the given pixel:

float2(floor(texcoord0.x*StringMaxLength)/StringMaxLength, 0)

StringMaxLength is some upper bound on the length of the string (in characters). If you want dynamic length strings, just add spaces at the end or mask the result out where you don’t want text…I haven’t found a better way to do that. texcoord0 is an input: it’s whatever uv coordinate you want to use, scaled and offset so the text spans from (0,0) to (1,1) in uv coordinates. To get the text to tile vertically, use texcoord0.y instead. Likewise, if your string lookup texture is vertical for some reason, transpose the arguments to float2 so 0 is first.
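Here is a CPU equivalent of that expression (with an illustrative StringMaxLength) to show what the floor/divide is doing: it snaps u to the left edge of its character cell, so that point sampling the lookup texture returns exactly one index per cell:

```python
import math

STRING_MAX_LENGTH = 8  # illustrative upper bound

def string_lookup_uv(u):
    """CPU equivalent of float2(floor(u*L)/L, 0): quantize u so every
    pixel inside one character cell samples the same lookup texel."""
    return (math.floor(u * STRING_MAX_LENGTH) / STRING_MAX_LENGTH, 0.0)
```
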

And now, to compute the offset into the atlas:

(i*TileWidth + frac(texcoord0.x/TileWidth)*TileWidth) / (AtlasWidthInTiles*TileWidth)

Here, i is the character index you computed and looked up previously (from the TextureSample or Noise node, using the first expression as its input), and TileWidth is the width of one character cell.
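A CPU sketch of the same computation, with the tile width expressed in normalized string UV space (constants are illustrative). Note that TileWidth cancels between numerator and denominator, leaving (i + fractional position in the cell) / AtlasWidthInTiles:

```python
ATLAS_WIDTH_IN_TILES = 37      # illustrative: A-Z, 0-9, hyphen
STRING_MAX_LENGTH = 8          # illustrative string length bound
TILE_WIDTH = 1.0 / STRING_MAX_LENGTH  # one character cell in string UV space

def atlas_u(i, u):
    """Map character index i and string-space coordinate u to a
    horizontal coordinate inside tile i of the atlas strip."""
    cell_frac = (u / TILE_WIDTH) % 1.0   # frac(texcoord0.x / TileWidth)
    return (i + cell_frac) / ATLAS_WIDTH_IN_TILES
```
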

Now all we need to do is sample the atlas texture we created using the output of the second expression, and feed that as input “v” into the following expression:

v.x+v.y+v.z - (min(min(v.x,v.y),v.z) + max(max(v.x,v.y),v.z)) - 0.5

This will compute the median of the three channels of the image by adding the components and subtracting the minimum and maximum, then subtract 0.5 to decode the signed distance value.
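A quick CPU check of that median decode, in pure Python (names are illustrative):

```python
def msdf_signed_distance(v):
    """Median of three channels minus 0.5: sum minus min and max
    leaves the middle value, then 0.5 is subtracted because the
    texture encodes zero distance as 0.5."""
    x, y, z = v
    return x + y + z - (min(min(x, y), z) + max(max(x, y), z)) - 0.5
```
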

Finally, pass that last expression’s result as the “w” input into the following expression:

clamp(w/fwidth(w)+0.5,0,1)
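On the CPU there is no fwidth intrinsic, so this sketch takes the screen-space derivative as an explicit parameter (in the shader, fwidth(w) is the per-pixel change in w):

```python
def coverage(w, fwidth_w):
    """CPU equivalent of clamp(w/fwidth(w) + 0.5, 0, 1): remap the
    signed distance to pixel coverage over roughly one pixel of
    screen-space falloff, which is what anti-aliases the edge."""
    return max(0.0, min(1.0, w / fwidth_w + 0.5))
```
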

And there you have it. Free, dynamic, randomizable, anti-aliased text with a premade or custom font, all in the material editor so you can scratch it and lerp it and blend it and distort it to your heart’s content.


I appreciate you taking your time to explain what you’ve found here. I’ll dig into this later this week during my time off, and I’ll post back if I run into any hitches. You’ve explained it very well though, so I imagine it’ll be a-ok 🙂
Thank you!

Here is an example with Helvetica numbers.


Ok cool. Two points:

First, the msdfgen.exe command was truncated by markdown; it should be, for example, ./msdfgen.exe -o 1.png Helvetica.ttf ‘1’

Second, I am working on a plugin for this so hopefully by then I’ll have a github link for you.


https://github.com/vpostman/procedural-ue4-material-tools here is a version I got working with randomly seeded text. The atlas texture is A-Z, 0-9, and hyphen in Helvetica (each tile is, I believe, 128×128), and the Python script simply generates the small images in a loop so you don’t have to hand-type each character. You can just pass it font files as the first argument, and the images will be saved to the same folder as the input file. You can also change the atlas characters in the hardcoded string.

To adapt this to constant, prewritten strings, just make a lookup texture as described above and replace the Noise node with a TextureSample node. Really, you only need one image channel.
