Float to texture?

As far as I can tell, there’s no way to turn floats into a texture. As an example of how this would be useful, I’d be interested in using floats as the input to a HeightToNormalMap node (which only takes Texture2D), though I’d have other uses for floats -> textures as well.

It seems particularly odd that this can’t be done, because the TextureSample node TAKES a Texture and outputs floats.

If I could find the code that TextureSample works off of, would I be able to reverse that process and custom-make a node that does what I’m after? I’ve never made a node via code before. A nodecode!

(BTW I’m aware there are other ways to achieve the heightmap-creation-from-floats situation I described. I’m not particularly interested in solutions for that problem specifically; it’s just an example).

Bonus question: Would I be right in saying that texture channels are represented as vectors which the engine then turns into a square representation? i.e. the channel data doesn’t actually hold coordinate info, just a series of numbers that are later assigned to appropriate coordinates?

Well, there is nothing odd about it.

A scalar is just a value.
Sampling a texture, however, gets you data that depends on the UVs you used to sample it.

The HeightToNormalMap node works by comparing the height at several nearby texels and deciding which way the normal should point. The height it reads depends on the location where the texture was sampled.
Does a scalar depend on anything? No. It is constant.
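Roughly, the idea looks like this (a minimal C++ sketch of the general technique, not the engine’s actual node code; the struct and function names are made up for illustration):

```cpp
#include <cmath>

// Sketch only: build a normal by comparing the heights of neighbouring texels.
// A lone scalar has no neighbours to compare, which is why it can't feed this.
struct Vec3 { float X, Y, Z; };

Vec3 NormalFromHeights(const float* Heights, int Width, int Height,
                       int X, int Y, float Strength)
{
    // Clamp to the edges so we never read outside the height array.
    const float Left  = Heights[Y * Width + (X > 0 ? X - 1 : X)];
    const float Right = Heights[Y * Width + (X < Width - 1 ? X + 1 : X)];
    const float Up    = Heights[(Y > 0 ? Y - 1 : Y) * Width + X];
    const float Down  = Heights[(Y < Height - 1 ? Y + 1 : Y) * Width + X];

    // The slopes in X and Y become the normal's X/Y components
    // (sign conventions vary between engines).
    Vec3 N{ (Left - Right) * Strength, (Up - Down) * Strength, 1.0f };

    const float Len = std::sqrt(N.X * N.X + N.Y * N.Y + N.Z * N.Z);
    return { N.X / Len, N.Y / Len, N.Z / Len };
}
```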

What you should be looking at is the NormalFromFunction node. It allows you to input three versions of a function with offsets to get the normal.
As for your bonus question, I don’t think I understand it.

I’m not talking about any oddness of the node turning a texture value into floats - what I’m calling odd is that this process can’t be reversed.

Nope. Textures are read-only data as far as the material editor is concerned. Just think of a texture as a bitmap or image: a 2D array of colors. You don’t create those dynamically per pixel.
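As for the bonus question, as I understand it: yes, the pixel data is just a flat series of values, and the coordinates are implied by the index rather than stored alongside it. A rough, non-Unreal-specific sketch of the usual layout:

```cpp
#include <vector>

// Sketch of a typical bitmap layout: a flat, row-major array of colors.
// The (X, Y) position of a pixel is implied by its index, not stored with it.
struct Color { unsigned char R, G, B, A; };

Color GetPixel(const std::vector<Color>& Pixels, int Width, int X, int Y)
{
    return Pixels[Y * Width + X]; // row Y starts at offset Y * Width
}
```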

Fair enough. TBH most of the functionality I’d get out of this could be achieved with UV map manipulation.

I am curious as to whether there’s a hard technical limitation why this can’t be done, though. For instance, if you gave me a short list of float3s (whose length is a square number), I could open up Paint or whatever and make you an image with those color values myself, so algorithm-wise it’s not exactly complex. Is it to do with how bitmap files are represented?

You can draw to render targets, and then access those as TextureObjects or through TextureSamplers. CanvasRenderTarget2D gives you the ability to draw lines onto it, so you can draw single pixel lines to represent a single vector. Here’s a post of mine which shows how to write the locations of several objects to a CanvasRenderTarget2D and loop over those locations in a material, rather than trying to pass many parameters separately.
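In C++ the setup looks roughly like this (a minimal sketch, assuming an actor class of your own; AMyActor, RenderTarget and DrawToTarget are placeholder names, not engine API):

```cpp
#include "Engine/CanvasRenderTarget2D.h"
#include "Engine/Canvas.h"

// Sketch only: create a render target, bind a draw callback, and draw into it.
// RenderTarget is assumed to be a UPROPERTY() UCanvasRenderTarget2D* member,
// and DrawToTarget a UFUNCTION() so AddDynamic can bind to it.
void AMyActor::BeginPlay()
{
    Super::BeginPlay();

    RenderTarget = UCanvasRenderTarget2D::CreateCanvasRenderTarget2D(
        this, UCanvasRenderTarget2D::StaticClass(), 256, 256);
    RenderTarget->OnCanvasRenderTargetUpdate.AddDynamic(this, &AMyActor::DrawToTarget);
    RenderTarget->UpdateResource(); // triggers the DrawToTarget callback
}

void AMyActor::DrawToTarget(UCanvas* Canvas, int32 Width, int32 Height)
{
    // A one-pixel-thick line whose color can encode whatever data you like;
    // the material then reads it back with a TextureSample at the right UVs.
    Canvas->K2_DrawLine(FVector2D(0.f, 0.f), FVector2D(Width, 0.f),
                        1.f, FLinearColor(0.25f, 0.5f, 0.75f));
}
```

The render target itself can then be assigned to a texture parameter on the material and sampled like any other texture.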

It’s just a matter of where you want to create the texture. Material logic runs per pixel, and that is not the place where you could do any kind of memory allocation.

I’m having a similar issue. I’m trying to take an array of floats that I generated procedurally and make a heightmap texture from them, which could be used for debugging, displayed on the HUD (as a map, for example), or saved to disk so the game wouldn’t have to generate the map from scratch every single time.

I’ve got a semi-working solution here, but it creates wonky psychedelic results:

I know the heightmap algorithm itself works, since a 3D display of it using the Draw Debug Helpers shows up correctly. Is there really no better way to feed a bunch of floats to a function and have it spit out a grayscale Texture representation of those floats?
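For reference, the rough shape of what I’m trying to do in C++ is something like this (a minimal sketch with placeholder names, assuming the floats are already normalized to the 0–1 range):

```cpp
#include "Engine/Texture2D.h"

// Sketch only: pack normalized floats into FColor and copy them into a
// transient texture. "Psychedelic" output is often a pixel-format or
// byte-order mismatch, so the FColor layout here matches PF_B8G8R8A8.
UTexture2D* MakeGrayscaleTexture(const TArray<float>& HeightData, int32 Width, int32 Height)
{
    check(HeightData.Num() == Width * Height);

    TArray<FColor> Pixels;
    Pixels.Reserve(HeightData.Num());
    for (float Value : HeightData)
    {
        const uint8 Gray = (uint8)FMath::Clamp(Value * 255.f, 0.f, 255.f);
        Pixels.Add(FColor(Gray, Gray, Gray, 255));
    }

    UTexture2D* Texture = UTexture2D::CreateTransient(Width, Height, PF_B8G8R8A8);
    Texture->SRGB = false; // the data is linear height, not a color image

    // Copy the pixels into the top mip and push the result to the GPU.
    FTexture2DMipMap& Mip = Texture->PlatformData->Mips[0];
    void* Data = Mip.BulkData.Lock(LOCK_READ_WRITE);
    FMemory::Memcpy(Data, Pixels.GetData(), Pixels.Num() * sizeof(FColor));
    Mip.BulkData.Unlock();
    Texture->UpdateResource();

    return Texture;
}
```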