Storing displacement as alpha channel in a normal texture: not possible, but is it even a good idea?

I understand that packing grayscale channels such as roughness, specular, ambient occlusion, or displacement/height into a single RGB texture, or into the alpha channel of a diffuse texture, is a smart performance trick, since it reduces the number of texture samples needed.
Since the alpha channel of a normal map is unused, I was curious: why not store displacement there? I know that Unreal prohibits this by requiring that texture samples feeding the Normal pin use "Normal" as the sampler type, which discards any use of the alpha channel. Is there an unavoidable technical reason for this? Would this approach cause any extra problems if we were able to use the alpha channel?

And I guess on a related note, why is packing several single-channel textures into a single multi-channel (RGBA) texture so much more efficient? Why isn't one channel of an image equivalent to a separate grayscale image? Is it because things like UV coordinates are shared across a single texture's channels?
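To make the packing idea concrete, here is a minimal sketch (hypothetical NumPy code, not engine code) of combining four single-channel masks into one RGBA array. The point is that one sample of the packed texture fetches all four values at once, where four separate grayscale textures would each cost their own sample, mip chain, and texture bind:

```python
import numpy as np

def pack_rgba(roughness, ao, height, mask):
    """Pack four same-sized single-channel (H, W) arrays into one (H, W, 4) RGBA array.

    In a shader, a single sample of the packed texture then yields all four
    values, instead of four separate samples of four grayscale textures.
    """
    return np.stack([roughness, ao, height, mask], axis=-1)

# Four hypothetical 2x2 grayscale masks packed into one RGBA image:
r = np.full((2, 2), 0.8)   # roughness
a = np.full((2, 2), 1.0)   # ambient occlusion
h = np.full((2, 2), 0.5)   # height
m = np.zeros((2, 2))       # arbitrary mask
packed = pack_rgba(r, a, h, m)
print(packed.shape)  # (2, 2, 4)
```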


DICE did this with Battlefront. You can also strip the blue channel out of a normal map and derive the normal Z in the engine from the red and green.
Basically, you pack whatever masks you need into blue and alpha, then set the texture to use BC7 with alpha.
Will update with pics in a while.
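The Z-reconstruction mentioned above can be sketched like this (a hypothetical NumPy illustration of the shader math, not DICE's actual code). Since a tangent-space normal is unit length, z follows from x and y as z = sqrt(1 - x² - y²); the clamp guards against small negative values introduced by compression error:

```python
import numpy as np

def reconstruct_normal_z(r, g):
    """Derive the Z component of a tangent-space normal from the R and G channels.

    r, g are channel values in [0, 1] as stored in the texture.
    """
    # Unpack [0, 1] channel values to [-1, 1] vector components.
    x = r * 2.0 - 1.0
    y = g * 2.0 - 1.0
    # Unit-length normal: z = sqrt(1 - x^2 - y^2). Clamp so compression
    # artifacts can't push the radicand below zero.
    return np.sqrt(np.clip(1.0 - x * x - y * y, 0.0, 1.0))

# A "flat" normal (0, 0, 1) stores R = G = 0.5, so Z reconstructs to 1.0:
print(reconstruct_normal_z(0.5, 0.5))  # 1.0
```

This frees the blue channel (and alpha) for masks, at the cost of a few extra shader instructions per sample.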


Here's a few pics of a uni project I did last year based on DICE's work. At 4K, the normal map texture goes from 24815 KB to 10923 KB, with added info for masks.

Do these compression settings apply to any extra stored information, or only if we are using displacement/height?
Would the same still be true if I store two masks, one in blue and one in alpha?