I’m trying to calculate which texture mip level to use. I was trying out ComputeMipLevel and noticed something strange: it returns a mip level about 30 to 40% too low. The error is quite noticeable, since the texture is visibly too blurry as you zoom out.
So I wrote my own function to calculate the maximum DDX/DDY distance (in UV texel units), assuming a texture size of 512. IOW, if the object covers 512x512 pixels on screen, I should get a distance of 1. But I don’t: the distance only reaches 1 when the object is about 600x600 pixels.
Here’s where it gets weird. If I make the viewport window larger (vertically), the distance value changes even though the object stays the same size on screen. That should not be happening. I’ve written enough DirectX and OpenGL code using these functions to know how this is supposed to work.
Is the viewport doing some kind of stretching I’m not aware of? Why are the DDX and DDY values not consistent? Right now they’re useless for mip calculations, even though I’ve used them all the time in DirectX and OpenGL, where they worked properly.