
Workflow for using displaced meshes

I’m trying to use displacement as the basis for some procedural generation. I’ve seen that displacement exists only on the GPU, so, for example, I can’t expect a line trace to detect the position of the deformed mesh; I’ll always get the base geometry.
I’ve seen someone suggest using a render target and then reading the individual pixels.
Although that would be acceptable in my case, I’d like to know whether there are other ways. My understanding is that displaced meshes exist only as pixels, which means it should be possible to export the displacement as a texture. The difference from the render-target method is that I’d get a single texture with the correct UV mapping, instead of taking individual “photographs” of portions of that texture.
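For reference, here’s a rough, untested sketch of what I had in mind for the render-target route: read the render target back on the CPU, then sample it at the hit UV and push the traced point along the normal. The function name `SampleDisplacementAtHit` and the `MaxDisplacement` scale are just my own placeholders, and this assumes “Support UV From Hit Results” is enabled so the hit actually carries UVs.

```cpp
#include "Engine/TextureRenderTarget2D.h"
#include "TextureResource.h"
#include "Kismet/GameplayStatics.h"

// Sketch: approximate the displaced surface position for a line-trace hit by
// sampling a height value out of a render target at the hit's UV coordinate.
bool SampleDisplacementAtHit(UTextureRenderTarget2D* DisplacementRT,
                             const FHitResult& Hit,
                             float MaxDisplacement,
                             FVector& OutDisplacedLocation)
{
    // Requires "Support UV From Hit Results" (Project Settings > Physics).
    FVector2D UV;
    if (!DisplacementRT || !UGameplayStatics::FindCollisionUV(Hit, /*UVChannel=*/0, UV))
    {
        return false;
    }

    // Read the render target back to the CPU (slow; cache the pixels if this
    // runs more than occasionally).
    FTextureRenderTargetResource* Resource =
        DisplacementRT->GameThread_GetRenderTargetResource();
    TArray<FColor> Pixels;
    if (!Resource || !Resource->ReadPixels(Pixels))
    {
        return false;
    }

    // Sample the height stored in the red channel at the hit UV.
    const int32 X = FMath::Clamp(FMath::FloorToInt(UV.X * DisplacementRT->SizeX),
                                 0, DisplacementRT->SizeX - 1);
    const int32 Y = FMath::Clamp(FMath::FloorToInt(UV.Y * DisplacementRT->SizeY),
                                 0, DisplacementRT->SizeY - 1);
    const float Height01 = Pixels[Y * DisplacementRT->SizeX + X].R / 255.f;

    // Push the traced point along the surface normal by the sampled height.
    OutDisplacedLocation = Hit.ImpactPoint + Hit.ImpactNormal * Height01 * MaxDisplacement;
    return true;
}
```

This only reconstructs the displaced position at the trace hit, though; what I’m really after is whether the full displacement can be exported as a UV-mapped texture so I don’t have to capture it piecewise.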
Any suggestions?