[UE5.2] Problem with DDX and DDY to calculate MIP level

I’m trying to calculate the texture mip level to use. I was trying out ComputeMipLevel and noticed something strange: it was giving me a mip level about 30 to 40% off. The error is quite noticeable, since the texture gets too blurry as you zoom out.

So I wrote my own function to calculate the maximum DDX/DDY (UV-to-texel) distance, assuming a texture size of 512. In other words, if the object covers 512x512 pixels on screen, I should get a distance of 1. But I don’t: the distance only reaches 1 when the object is about 600x600.
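
For context, the standard derivative-based estimate I’m talking about looks roughly like this (a minimal sketch for a Custom node, assuming a float2 UV input and a 512x512 texture; the last line is the usual 0.5 * log2 form):

```hlsl
// Convert UVs to texel units for a 512x512 texture
float2 texelUV = UV * 512.0;
// Texel-space change per screen pixel in each direction
float2 dx = ddx(texelUV);
float2 dy = ddy(texelUV);
// Largest squared footprint of one screen pixel in texel space
float d = max(dot(dx, dx), dot(dy, dy));
// mip = log2(sqrt(d)) == 0.5 * log2(d); clamp to mip 0
return max(0.5 * log2(d), 0.0);
```

With that math, a 512x512 texture filling exactly 512x512 screen pixels should give d = 1 and mip 0, which is the behavior I’m not seeing.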

Here’s where it gets weird. If I make the viewport window larger (vertically), the distance value changes even though the object stays the same size on screen. That should not be happening. I’ve written enough DirectX and OpenGL code using these functions to know how this is supposed to work.

Is the viewport doing some kind of stretching I’m not aware of? Why are the DDX and DDY values not consistent? Right now they’re useless for mip calculations, even though I’ve used them all the time in DirectX and OpenGL, where they worked properly.

I just tested ComputeMipLevel and it appears to be working fine for me:

- Mip 0 when my box was 512x512
- Mip 1 when my box was 256x256
- No change in mip level when scaling the window vertically, as expected.
One thing you may be missing is screen scaling. If your viewport is rendering at a reduced resolution, your mips will be affected. For example, at 50% screen percentage, mips are biased by a full level.
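
The arithmetic behind that: halving the render resolution doubles the ddx/ddy derivatives, which adds exactly log2(2) = 1 to a log2-based mip estimate. As a sketch (hypothetical helper; screenPercentage here is a 0–1 fraction, not a percent):

```hlsl
float MipBiasFromScreenPercentage(float screenPercentage)
{
    // Rendering at half resolution doubles the UV derivatives,
    // so the computed mip shifts by log2(1 / fraction).
    return log2(1.0 / screenPercentage);   // 0.5 -> +1.0 mip levels
}
```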


Yup, it was the antialiasing method. I needed to turn on “Automatic View Mip Bias” in my samplers and now everything looks good. I didn’t understand what that setting did before and didn’t realize the view was being upsampled.
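
For anyone computing the mip manually in a Custom node instead of using that sampler option: my understanding is that you can apply the same correction by hand. This is a sketch, assuming View.MaterialTextureMipBias is the value the “Automatic View Mip Bias” option applies (and reusing the estimate from my earlier post):

```hlsl
// Manual mip estimate for a 512x512 texture, as before
float2 texelUV = UV * 512.0;
float2 dx = ddx(texelUV);
float2 dy = ddy(texelUV);
float mip = 0.5 * log2(max(dot(dx, dx), dot(dy, dy)));
// Compensate for the view being rendered below output resolution,
// the same way the sampler option does (assumption on my part)
return max(mip + View.MaterialTextureMipBias, 0.0);
```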

