Wrong (?) MipMap used for scaled texture on UI

I am using a lot of buttons and other images in my UI, and I have a problem that I believe has to be a bug…

First, this is how it looks:

http://puu.sh/hrEoa/913b6cce7a.png

Two image widgets. The image on the left is 49x49 pixels and the image on the right is 47x47 pixels. Both draw the same texture: 512x512 resolution, Sharpen10 mip generation, Tri-linear filtering, and TC_UserInterface2D compression.

You see, the left image looks good and the right image looks ugly. I believe this is because the right image uses the 32x32 mip and scales it up, while the left image uses the 64x64 mip and scales it down.

I think for any size over 32x32 the 64x64 mip should be used, right? Everything else just looks horribly ugly… Why does the engine use a small mipmap and scale it up when there is a bigger mipmap that could simply be scaled down?
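For reference, here is the back-of-the-envelope math (a sketch of the standard GPU mip-selection heuristic, not engine code):

```cpp
// lambda = log2(source texels per screen pixel); trilinear filtering blends
// the two mip levels around lambda.
#include <cmath>
#include <cstdio>

int main()
{
    const float TextureSize = 512.0f; // mip 0 resolution
    for (float ScreenSize : { 49.0f, 47.0f })
    {
        const float Lambda = std::log2(TextureSize / ScreenSize);
        std::printf("%2.0f px -> lambda = %.2f (between mip %d = %.0f px and mip %d = %.0f px)\n",
                    ScreenSize, Lambda,
                    (int)std::floor(Lambda), TextureSize / std::exp2(std::floor(Lambda)),
                    (int)std::ceil(Lambda),  TextureSize / std::exp2(std::ceil(Lambda)));
    }
    // Both sizes land between mip 3 (64x64) and mip 4 (32x32), so whatever
    // rounding or bias the UI renderer applies decides which level you see.
    return 0;
}
```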

Why do I need mipmaps at all, you ask? Because using no mipmaps is also ugly, due to the very bad downscaling algorithm that is used… With mipmaps disabled and Tri-linear filtering (the smoothest option available) it looks like this:

http://puu.sh/hrQzk/8022977fd7.png

Extremely hard and pixelated edges.

Could you please fix this for 4.8? I would really like to have a good-looking UI… :slight_smile:

Hey Alcatraz,

I ran some tests using the same texture settings you provided, as well as some variations. What I discovered is that the mipmaps render differently when playing in PIE in the viewport versus fullscreen. What is the maximum resolution you intend to render at when playing this game fullscreen? Also, did you take the screen captures while in PIE mode?

Play in Editor (PIE) Viewport

As you can see, the issue you are reporting is appearing while in PIE within the viewport.

Play in Editor Fullscreen

Pressing F11 will enter fullscreen mode while in PIE. I did not change the order, and it is a bit hard to tell from the images, but if you do the same within your project, or run in standalone mode, you should see the changes happening to your image as well. This is why I asked about your screen resolution, as it seems to be part of the issue with the mipmap selection.

Thank you,

Hi Andrew, thanks for testing this out!

I did create the screenshots in a new maximized editor window, so almost fullscreen.

Did you have DPI Scaling disabled or enabled for your tests? When I switch to a maximized window, all the images get a bit larger because of DPI scaling, so something that is about 40x40 in the small editor viewport becomes about 49x49 in maximized PIE.

I have not directly set the size to something like 49x49; it’s just an image widget with padding so that it ends up 49x49 in-game. I simply tested different paddings until I could measure 49x49 in my screenshots.

In a maximized standalone game it looks exactly the same as in a maximized editor window. In fullscreen standalone it looks different, because DPI scaling scales the left image to about 51x51 and the right one to 49x49, so both use the 64x64 mip.
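In case it helps reproduce: the effective DPI scale can be checked at runtime like this (a minimal sketch; UWidgetLayoutLibrary::GetViewportScale is the real UMG helper, the surrounding function and log message are just illustration):

```cpp
// Log the UMG DPI scale so you can see the final on-screen size a widget gets.
#include "Blueprint/WidgetLayoutLibrary.h"

void LogEffectiveSize(UObject* WorldContext, float LayoutSizePx)
{
    const float DPIScale = UWidgetLayoutLibrary::GetViewportScale(WorldContext);
    // e.g. a 40x40 layout size at DPI scale 1.22 renders at roughly 49x49 px.
    UE_LOG(LogTemp, Log, TEXT("Layout %.0f px * DPI %.2f = %.1f px on screen"),
           LayoutSizePx, DPIScale, LayoutSizePx * DPIScale);
}
```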

No problem. I did the test within a new blank blueprint project and did not touch any of the DPI scaling settings. I believe it is turned on and set to the ‘Shortest’ curve by default.

I reduced the size of the images and was able to get the image to keep sampling the 32x32 mip. I am going to continue investigating this issue and will return here with any new information I discover.

Regards,

Have you not yet found any “new information”?

Hi, I’m having issues with wrong mipmaps loaded in UMG menus as well. Any update?

I have not found any way to fix this, so I don’t have an update; still hoping that AndrewHurley will “discover new information” :wink:

Thanks. Not sure if my issue is exactly the same, but here’s my question: https://answers.unrealengine.com/questions/226916/bad-mipmaps-in-umg.html

Do you have something like a bug number UE-XXXXX for this issue?

Hey guys, I am still investigating this issue and would like a uniform test to run. I did find some new information regarding 3D widgets and the differences between world-space and screen-space rendering. Although you might not be using a 3D widget, the information is valuable either way and will help you in the future. It deals directly with the loss of quality for images in 3D widgets and UI.

The loss of quality comes from texture sampling artifacts (sampling is not pixel-accurate) and a number of rendering effects, such as post process, which aren’t really suited to UI. When you make a widget screen space, all you are doing is rendering the UI as normal and updating each frame where it should be located in 2D. Screen space circumvents all 3D scene rendering. We will not be able to get around texture sampling artifacts; that is what happens when you are in 3D. 3D UI art has to be created in such a way that makes these issues less visible. We have plans to work around post process by rendering 3D widgets afterwards. However, if you want true in-world widgets with depth testing, there is no way to prevent post processing from being applied to them.

@joelr After reading over your post, it seems the temporary workaround is to check the ‘Never Streamed’ option? Did this resolve your issue, or was it pertaining to something different?

Thanks for your patience, guys.

Regards,

Hey,

I did get some new information about general UI guidelines, as well as information on mips, which you will want to follow to get the correct results for your own UI.

Basically, you don’t want to use mipmaps for UI; it will never look good (never crisp). UI should be authored as close as possible to the 1:1 size you plan to display it at, making multiple versions for large differences, e.g. 1080p vs 4K.

The mip level is picked based on texel density compared to pixel density: the level that most closely matches what is on screen is chosen. Nearest/point filtering means no mips are being considered; you’re just picking the closest pixel to the UV coordinate.
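As a concrete sketch of that rule (standard GPU behaviour, not UE source code):

```cpp
// The level whose texel density most closely matches the on-screen pixel
// density wins.
#include <cmath>
#include <algorithm>

// TexelsPerPixelX/Y: how many texels one screen pixel spans in each axis.
// For an axis-aligned UI quad this is just textureSize / onScreenSize.
float MipLambda(float TexelsPerPixelX, float TexelsPerPixelY)
{
    return std::log2(std::max(TexelsPerPixelX, TexelsPerPixelY));
}
// Trilinear filtering blends levels floor(lambda) and ceil(lambda);
// point/nearest filtering ignores lambda and always reads mip 0, which
// matches the "no mips are being considered" remark above.
```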

This information was passed directly down to me from the developers, and I would definitely suggest following their advice when creating UI, as it is the correct and efficient way to approach your UI design and workflow.

Regards,

So far, disabling streaming provides good results. It looks like mipmapping still works, in the sense that if I make my game window very small (640x480, for example) smaller mips get used, and a large window leads to the biggest mips being used. But I say this based on qualitatively judging the filtered quality of the rendered textures; i.e., it does not look like the biggest mip simply got scaled down, as that usually looks jaggy.
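For anyone who wants to do the same thing from code, a minimal sketch (NeverStream and UpdateResource() are real UTexture members; ticking “Never Stream” in the texture editor has the same effect):

```cpp
// Mark a UI texture as never-streamed so the streamer can't leave a low mip
// resident.
#include "Engine/Texture2D.h"

void DisableStreamingFor(UTexture2D* UITexture)
{
    UITexture->NeverStream = true;
    UITexture->UpdateResource(); // recreate the resource with all mips resident
}
```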

Ticking “never streamed” does not seem to change anything for me. It might only work in your specific case because you use 3D stuff.

Thanks Andrew!

You suggest I should create multiple versions of the same texture for large differences such as 1080p vs 4K, but why do I have to do this manually? To make this smooth I would basically have to check the screen resolution every frame and then swap the textures on buttons etc., which would be a lot of work to implement. If the correct mipmap were simply used, I could create only 4K textures and everything would work. Whether I scale it down manually or the mipmap scales it down makes no difference, as long as the correct mip is used.

Where is the advantage in taking the mip with the closest “texel density compared to pixel density” instead of the next higher mip, whose texel density is still close?
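In code terms, the difference I mean is just floor versus round when picking the level (a sketch, not engine code):

```cpp
// Nearest-level selection rounds lambda; "next higher mip" would floor it,
// so the image is only ever scaled down, never up.
#include <cmath>

int NearestLevel(float Lambda)    { return (int)std::lround(Lambda); } // pick the closest level, as described above
int NextLargerLevel(float Lambda) { return (int)std::floor(Lambda);  } // always pick the larger (lower-index) mip

// Example: a 20x20 mip drawn at 13x13 is a downscale (looks fine), while
// the 10x10 mip drawn at 13x13 is an upscale (looks blurry).
```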

I like to think of mip levels as LODs for your textures. You wouldn’t want 5 LODs on a static mesh that you will only ever view up close. In that sense, consider your UI as being rendered at its highest quality, up close. You don’t need 10 mip levels if the distance from the player or the render size on screen does not change dramatically; that is simply a waste of memory and is inefficient.

The advantage of taking the closest texel density compared to pixel density is almost self-explanatory: you will be rendering the sharpest version of the image, because you are attempting to match the texels of the texture to the pixels on your screen, the 1:1 ratio mentioned earlier. This is why we say not to use mipmaps with your UI, as it will never be crisp. This is all the information and explanation about mips in regard to UI that I can provide at this time.

I agree that in most menus you know in advance the pixel density you need. However, if we are talking about high-quality UI, on par with what you’d get from Scaleform or similar, image scaling can and will be animated. The whole menu can fade in from a small size to fill the screen, as in my case. Without mipmaps you’ll have visible resolution pops during animation, or filtering issues while your image is at less than half its intended final size.

I also have a case where some UI elements show up at different sizes because you can zoom them. If I have 3 zoom levels and have to support the whole range of PC resolutions, I’m going to consume more memory by storing each texture size separately than by using mipmaps.
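Quick arithmetic to back that up (a standalone sketch; 512x512 RGBA8 is just an example size):

```cpp
// A full mip chain only adds about one third on top of the base level
// (1/4 + 1/16 + ... = 1/3).
#include <cstdio>

int main()
{
    double BaseBytes = 0.0, TotalBytes = 0.0;
    for (int Size = 512; Size >= 1; Size /= 2)
    {
        const double LevelBytes = double(Size) * Size * 4.0; // 4 bytes per pixel
        if (Size == 512) BaseBytes = LevelBytes;
        TotalBytes += LevelBytes;
    }
    // Prints: mip 0: 1024 KB, full chain: 1365.3 KB (+33.3%)
    std::printf("mip 0: %.0f KB, full chain: %.1f KB (+%.1f%%)\n",
                BaseBytes / 1024.0, TotalBytes / 1024.0,
                100.0 * (TotalBytes - BaseBytes) / BaseBytes);
    return 0;
}
```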

Yes, if your UI is going to change in size a lot and animate, then a few mip levels correctly applied would be a good solution. However, applying, let’s say, 10 mip levels to a UI element that only changes in size once will cause more issues and cost more memory in the long run, because the data for each level still has to be stored somewhere in memory.

I feel it’s a tradeoff that’s worth it in some cases. One example of a UI I see no way to make in Slate/UMG because of such limitations is Hearthstone: smooth scaling, particle effects, 3D page turns, post-process effects to blur the background when a card is zoomed in. The fact that Unity renders UI in the 3D scene is wonderfully flexible. Slate is nice because it has well-defined layout, which is great for the editor UI, but so far it falls short for more dynamic UIs.

I don’t think we have ever talked about 10 different mip levels. I would be totally happy with 2 or 3 mip levels to support HD, Full HD, 2K, and 4K smoothly, without having to implement complex logic that swaps textures based on resolution. As you said, a few mip levels correctly applied would be a good solution, with “correctly applied” meaning that the next higher mip is used: if I have one mip at 10x10 and one at 20x20 and need to display 13x13, then the 20x20 one should be used, and everything would look fine. Could you ask the devs whether it would be a problem to include such an option, for example as a simple bool to tick in the texture settings?

Hey again,

I have some more information as well as a possible solution. In short, what you are attempting to have implemented will take some manual texture creation; there is simply no getting around that currently.

I did find a way to use a specific MipLevel and MipBias within the Material Editor, which you can use to select the mip level you wish to use for a given texture. Use this in combination with some additional setup and you should be able to apply the correct mip level at the desired distance or render size.

Mip Level/Bias Selection
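If you want to drive that setup from code, here is a sketch. It assumes a UI material whose texture sample has MipValueMode set to MipLevel and reads its level from a scalar parameter named “UIMipLevel” (that parameter name is made up for this example); Create, SetScalarParameterValue, and SetBrushFromMaterial are the real UE APIs:

```cpp
// Pin an image widget's texture to a fixed mip level via a dynamic material.
#include "Materials/MaterialInstanceDynamic.h"
#include "Components/Image.h"

void ApplyFixedMip(UImage* Image, UMaterialInterface* UIMaterial, float MipLevel)
{
    UMaterialInstanceDynamic* MID = UMaterialInstanceDynamic::Create(UIMaterial, Image);
    MID->SetScalarParameterValue(FName("UIMipLevel"), MipLevel); // e.g. 3 -> 64x64 of a 512 texture
    Image->SetBrushFromMaterial(MID);
}
```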

Mip generation only works with power-of-two texture resolutions, so be sure you are using those, or else you’re not going to see any mip usage. Also, offline mip generation is handled by the editor, while in a packaged version of the same project the mip levels are selected and rendered by the GPU based on distance and the algorithm used. Long term, we’d solve this with multiple-resolution textures, not mip chains.

Multiple-resolution textures would only involve 2-3 levels, rather than how mips work, which is a power-of-two reduction (1024, 512, 256, 128, etc.) all the way down. There is no need for all that extra memory to be allocated when all you’ll likely ever want is either the high- or low-res target image for the UI. And you don’t want just a simple down-res; you probably want an artist making a custom version for low and another for high.

Regards,