Possible to Use Texture as Texture Coordinates?

Hello.

I’ve managed to bake a UV channel into a texture, shown below. The idea is that instead of using another UV channel, I use this texture within a material that will later be baked down.
So I’ve imported this texture into Unreal and hooked it up to the UV input, but the results are really pixelated. I’m assuming this is a limitation of having only 256 values per channel to choose from, which is why it ends up looking this bad.
I’m wondering if it’s possible to make Unreal interpret these values more smoothly, or how I could better store these UV coordinates in an image, if such a thing is even possible?

It is definitely possible, but shouldn’t be done for this kind of thing. A typical use case would be distorting another tiling texture based on an input texture.
For your purpose, an additional UV channel should be the choice: it arguably consumes less memory and has considerably more precision.

Yes, I know, but having no extra UV channel in the mesh is better than having one. This baked UV texture is intended to be used in a really big, complex material that will be baked down, so the performance of the material isn’t important; it’ll all end up as just your standard PBR maps. And ideally I’d want everything contained within this material.

Have you tried different compression settings on the texture itself?

In Unreal I’ve tried most of them, but HDR seems to work best. The file itself is made with the HighResShot console command: it renders an unwrapped mesh and saves it as an .EXR file. As far as I know, I have no other option for the export format. Changing much in Photoshop seems to have only downsides: since it’s a 32-bits-per-channel file, I’m very limited in what I can do in Photoshop without lowering the bit depth. I was never clear on what the bits really do, but I imagine lowering them can only be a downgrade?
I tried exporting a separate .EXR into my content folder with uncompressed selected in the export settings, and I tried just moving it from the screenshots folder to the content folder, as well as saving it as a plain Photoshop file, which I’d imagine has the least compression.
I’ll try a few others, but I think this is just a limitation of how many colour values can be stored in an image.

Do you have any reason to save it out? If you draw it to a render target in HDR, it won’t lose any precision when you sample it. You can use an orthographic SceneCapture2D, align your unwrapped model beneath it, and activate it for one tick. Then, if you’re drawing to a render target asset, you can sample it like you would any texture. Or you can draw to render targets created at runtime and pass them as texture parameters in dynamic material instances. Set your texture sampler to Linear Color for HDR.
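If it helps, here’s a rough C++ sketch of that setup; the Blueprint nodes map onto the same calls, and the function name and the “BakedUV” parameter are just placeholders for whatever you use:

#include "Components/SceneCaptureComponent2D.h"
#include "Engine/TextureRenderTarget2D.h"
#include "Kismet/KismetRenderingLibrary.h"
#include "Materials/MaterialInstanceDynamic.h"
#include "Materials/MaterialInterface.h"

// One-shot capture of the unwrapped mesh into an HDR render target, then
// handing the result to a material as a texture parameter.
void CaptureUnwrappedUVs(USceneCaptureComponent2D* SceneCapture,
                         UMaterialInterface* BaseMaterial,
                         UObject* WorldContext)
{
    // 16-bit float render target so the baked UVs keep their precision.
    UTextureRenderTarget2D* RT = UKismetRenderingLibrary::CreateRenderTarget2D(
        WorldContext, 2048, 2048, RTF_RGBA16f);

    // Orthographic capture looking at the unwrapped model.
    SceneCapture->ProjectionType = ECameraProjectionMode::Orthographic;
    SceneCapture->OrthoWidth = 100.f;
    SceneCapture->TextureTarget = RT;
    SceneCapture->CaptureScene(); // the "activate it for 1 tick" part

    // Pass the render target as a texture parameter in a dynamic instance.
    UMaterialInstanceDynamic* MID =
        UMaterialInstanceDynamic::Create(BaseMaterial, WorldContext);
    MID->SetTextureParameterValue(TEXT("BakedUV"), RT);
}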

The edges of the baked UV will create seams unless you sample your second texture with (r,g)/(1-a) from your render target, instead of just (r,g).
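To make that concrete, here’s that lookup written out as a hypothetical C++ helper (in the material you’d build the same math out of nodes). Presumably the antialiased island edges blend the baked UVs with the black background and the capture’s alpha stores inverted coverage, so the division undoes that blend:

#include "Math/Color.h"
#include "Math/UnrealMathUtility.h"
#include "Math/Vector2D.h"

// Coverage-corrected lookup: divide the stored (r,g) by (1 - a) so the
// half-blended edge texels recover their original UV values.
FVector2D DecodeBakedUV(const FLinearColor& Texel)
{
    const float Coverage = 1.f - Texel.A;
    if (Coverage <= KINDA_SMALL_NUMBER)
    {
        return FVector2D(Texel.R, Texel.G); // empty texel, nothing to recover
    }
    return FVector2D(Texel.R, Texel.G) / Coverage;
}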


[Two comparison screenshots]

Left is using the baked UV, right is using mesh UV.

[Screenshot]

Oh, one thing I realized: Final Color is LDR, so even if you export it in HDR it won’t do any good. It looks like the high-res screenshot tool can export buffer visualization targets. Have you tried exporting those and grabbing the SceneColor one? I’m not at my workstation anymore.

Thank you for replying! Yes, ideally I’d like to get quite a few of these baked UVs out, so that I can share them with more people and use them across many materials. Looking at the SceneColor buffer, it just appears fully black; I got the texture from the scene BaseColor. I followed the “baking materials to textures” live training, so apart from a bit of experimenting, that’s the extent of my knowledge.
Regarding drawing the texture to a render target, how could I achieve this? I don’t have much experience with this, and I can’t seem to get anything to draw to my render target. I’ve set up a custom event that I call in the editor, but it doesn’t seem to react. Which node(s) exactly should I use?

In the above, in my scene capture component, I just set the texture target to the render target. But another method uses the DrawMaterialToRenderTarget node. For example, to copy the render target a scene capture component captures to, make a material that samples that render target and outputs it as emissive. Drawing that material to a different render target of the same resolution will copy it.
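For reference, in C++ that second method boils down to one call (assuming “CopyMaterial” is the emissive material that samples the source render target):

#include "Engine/TextureRenderTarget2D.h"
#include "Kismet/KismetRenderingLibrary.h"
#include "Materials/MaterialInterface.h"

// Draws CopyMaterial's emissive output across the whole destination render
// target; with a destination of the same resolution this is a 1:1 copy.
void CopyRenderTarget(UObject* WorldContext,
                      UMaterialInterface* CopyMaterial,
                      UTextureRenderTarget2D* Destination)
{
    UKismetRenderingLibrary::DrawMaterialToRenderTarget(
        WorldContext, Destination, CopyMaterial);
}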

If you want to make sure it’s working, just wire BeginPlay -> GetPlayerController(0) -> EnableInput in whatever event graph you’re working in, then use a key press node of your choice to trigger it.

I found a way to do it, though it took a while. It shouldn’t be this hard.

Set up the unwrap material above. Create a camera actor at (0,0,0), set it to Orthographic, with an OrthoWidth of 100. Set the X rotation to -90 and the Y to -90. Set auto player activation to Player 0. Any mesh you put the UV unwrap material on will be aligned with the camera actor.

Now create a level sequence. Under the play button menu, set the start to 0 and the end to 1. Click the movie button (“Render this movie to a video, or image frame sequence”). Set Output Format to Custom Render Passes and the resolution to a custom 2048 by 2048, add the Pre Tonemap HDR Color render pass, and check Capture Frames in HDR.

Record, and the UV will be saved to disk in HDR as an EXR file. When you import it, set Compression Settings to HDR, Texture Filtering to Nearest, and Mip Gen Settings to NoMipMaps. It doesn’t write alpha, so you’ll have seams if you don’t correct for the edges.

You can sample the texture with this material and draw it to a render target to correct the seams.

This material looks for pixels that aren’t black but have black neighbors. For those pixels, it replaces the color with the average of the non-black neighbors.
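The material itself is a node graph, but the logic is easier to read written out. Here’s a plain C++ sketch of what it computes per pixel, purely as an illustration of the idea (interior pixels only, for brevity):

#include <vector>

struct FPixel { float R, G, B; };

static bool IsBlack(const FPixel& P)
{
    return P.R == 0.f && P.G == 0.f && P.B == 0.f;
}

// For every non-black pixel that touches a black pixel, replace its colour
// with the average of its non-black neighbours.
void FixSeamPixels(std::vector<FPixel>& Img, int W, int H)
{
    const std::vector<FPixel> Src = Img; // sample from an unmodified copy
    for (int y = 1; y < H - 1; ++y)
    {
        for (int x = 1; x < W - 1; ++x)
        {
            if (IsBlack(Src[y * W + x]))
                continue; // the black background stays untouched

            FPixel Sum{0.f, 0.f, 0.f};
            int NumNonBlack = 0;
            bool bHasBlackNeighbour = false;
            for (int dy = -1; dy <= 1; ++dy)
            {
                for (int dx = -1; dx <= 1; ++dx)
                {
                    if (dx == 0 && dy == 0)
                        continue;
                    const FPixel& N = Src[(y + dy) * W + (x + dx)];
                    if (IsBlack(N)) { bHasBlackNeighbour = true; continue; }
                    Sum.R += N.R; Sum.G += N.G; Sum.B += N.B;
                    ++NumNonBlack;
                }
            }
            // An edge pixel: overwrite its contaminated colour with the
            // average of the clean neighbours.
            if (bHasBlackNeighbour && NumNonBlack > 0)
            {
                Img[y * W + x] = { Sum.R / NumNonBlack,
                                   Sum.G / NumNonBlack,
                                   Sum.B / NumNonBlack };
            }
        }
    }
}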

Wow. I’m really, really grateful that you actually took the time to make a detailed example! It works great!

I wasn’t actually aware that SceneCapture2D objects had a texture target input; that was probably my mistake…
Anyway, I set it all up, went through the level sequence, and it works perfectly. Though I have a few questions. I’m not going to pretend I understand anything about the material, but one thing I noticed is that the padding seems to come out only yellow (or I guess (1,1)) when visualising the UV texture; I’m not sure if this is intended or if I hooked something up wrong? Would it also be possible to get more than one pixel of padding?

You’ve been really helpful, thank you!

Ah, I found the problem: I had input a wrong value in one part of the material. It works great, thank you! I don’t have any bleeding, so I don’t think any more padding would really be necessary.

You’re not planning on actually using this in a game, are you? That HDR texture is 32MB of video memory on its own…
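(Assuming the HDR setting stores it as RGBA16F, that’s 2048 × 2048 pixels × 8 bytes per pixel = 33,554,432 bytes = 32MB, and with NoMipMaps that’s the whole cost.)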

As I’ve mentioned, this is being used in a complex procedural material; it already uses over 60 textures and a tonne of lerps. This material will be baked down into your standard 4 PBR textures. The reason I needed this UV texture is so that the actual mesh that uses those final 4 textures doesn’t carry useless UV channels.