I’m hoping to get some insight on a problem I’m having.
What I’m trying to do:
I’m trying to bake out rendering information from a camera view into a model’s UV space.
I used Ryan Brucks’ baking-to-render-targets work as a starting point. Basically, I capture the scene to a render target, then swap the model’s material to one that bakes out its texture coordinates to a second render target, and then switch the material back to normal.
My problem is that when I try to feed the UV-coordinate render target into the scene render target, the result comes out nothing like I imagined. All I want to do is take each pixel and move it to the texture coordinate it represents, so I should see the scene elements laid out in the UV layout (the fire, smoke, etc.).
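One way to picture the transform I’m after (this is just a toy sketch, not actual engine code): for every screen pixel, I have a baked UV from the second render target, and I want to write that pixel’s scene color into the output texture at that UV. In other words, a scatter from screen space into UV space, rather than sampling the scene render target directly with the UVs. A minimal NumPy version, assuming `scene` is the camera capture and `uv_rt` holds the baked 0–1 UVs per screen pixel:

```python
import numpy as np

def scatter_to_uv_space(scene, uv_rt, out_size):
    """Toy model of the bake: move each screen pixel to its UV texel.

    scene  : H x W x C array, the scene capture (screen space)
    uv_rt  : H x W x 2 array, baked texture coordinates per screen pixel
    out_size : resolution of the square UV-space output
    """
    out = np.zeros((out_size, out_size, scene.shape[2]), dtype=scene.dtype)
    # Convert 0..1 UVs into integer texel coordinates in the output.
    tx = np.clip((uv_rt[..., 0] * (out_size - 1)).astype(int), 0, out_size - 1)
    ty = np.clip((uv_rt[..., 1] * (out_size - 1)).astype(int), 0, out_size - 1)
    # Scatter: each screen pixel lands at its own UV texel.
    out[ty, tx] = scene
    return out
```

A GPU material can’t scatter like this directly, which is (I think) why the naive “plug one render target into the other” gather doesn’t produce what I expect, but the array version at least shows the mapping I’m trying to achieve.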
I’ve been working on this for a while now and I’m stuck trying to get the transform correct, so I was hoping someone could suggest what I’m doing wrong. Thanks!