Texture best practices

Hello,
I’ve been working on a project for about a year now and have around 440 textures in it. I’d say around 95% are 2048x2048, and I didn’t do anything to them… just imported them, let the mipmaps auto-generate, and used them for materials (as emissives… there’s no light in the game, so no PBR or normal maps or anything either). Around 5 are flipcards.

I’ve never come across any best-practice info on textures. Can all my textures be 2K? Should some be 1K or even smaller? The player gets pretty close to most things, so I wanted as much detail as possible, and I want the game to run at 1080p. Now I’m sometimes getting the texture pool error. I don’t want to increase the pool size because I want the game to work on medium-sized gaming machines (with around 2GB of GPU RAM), too. Is it OK to have over 400 2K textures? What else can I do? Should I mess around with the LOD Bias or Texture Group settings, open half of them up in Photoshop and resize them to 1K (I hope not!), etc.? What do you all do?

There must be an article about this somewhere that I’m missing in my searches…

Thanks for any help!

Are you able to reuse and tile textures? 400 unique 2K textures is a lot of memory. Anything that’s small, in the distance, or that the player doesn’t get near doesn’t need a 2K texture.

Some textures are tiled for floors and landscapes, but those are right below the player’s feet, so they should be 2K? There are a ton of characters and rooms, painted in ZBrush or Blender, and a bunch of buildings you get pretty close to. I didn’t make much of anything that the player can’t get close to. Is there an easier way to change texture size in Unreal that I don’t know about? Or does everything have to be Photoshopped and re-imported to check whether 1K is OK?
Also, the game has 5 levels… each level has around 90 unique 2K textures, I’d say. What would be recommended?
Thanks for the help.

A good habit to get into is to check that your texel density is consistent (including lightmaps, in a production environment), and to consider your target platform. Even on PC you can’t go buck wild if you want a wide range of consumers to use your product, and 400 2K textures seems like a lot, although I expect only a subset of those will be resident in memory at the largest mip at any given moment.

Often you’ll be able to eyeball texel density, but if you’re not sure you can create a checkerboard material (a basic version exists as an engine material) to help you visualise it. An object with 1 square metre of surface area only needs a texture with 1/4 the pixels of an object with 4 square metres of surface area (half the resolution per side). In practice a mismatch is often easy to spot, since one object will look noticeably sharper than the other, but the difference can be subtle.
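If you want to put numbers on it, here’s a tiny back-of-envelope helper (purely illustrative, not an engine function) that picks the smallest power-of-two texture size for a target texel density:

```cpp
#include <cmath>

// Hypothetical helper, not an engine API: smallest power-of-two texture
// size that hits a target texel density along one axis.
int RequiredTextureSize(float SurfaceWidthMeters, float TexelsPerMeter)
{
    const float NeededTexels = SurfaceWidthMeters * TexelsPerMeter;
    return 1 << static_cast<int>(std::ceil(std::log2(NeededTexels)));
}

// A 1 m wide face at 1024 texels/m needs a 1024 map; a 2 m wide face
// (4x the surface area) needs 2048, i.e. 4x the pixels.
```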

In studios, artists will consider texel resolution during asset creation. Usually a lead will provide reference objects an artist can work from, as well as scaling guidelines for level designers, since an object’s world scale obviously affects texel density.

I guess the moral of the story is: try not to waste memory. If you can cut down texture sizes with no loss of visual fidelity, I’d do it. Then again, I started games dev on the PS1, where there was no room for waste with so little memory. It’s all about habit; a good dev knows a stitch in time saves nine, and all that.

Each 2048 texture is about 2.6 MB when default compressed (DXT1) with mipmaps.
400 textures will then use about a gigabyte of texture RAM. That fits fine on a GeForce GTX 980.
Not as good on a “standard” laptop for 600 bucks at Best Buy.
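For anyone who wants to sanity-check those numbers, the arithmetic is simple (a rough sketch assuming DXT1 at 4 bits per pixel and a full mip chain adding about a third on top):

```cpp
// Back-of-envelope for the numbers above:
constexpr double BytesPerPixel = 0.5;                               // DXT1 = 4 bits/px
constexpr double BaseMipBytes  = 2048.0 * 2048.0 * BytesPerPixel;   // ~2.0 MB
constexpr double PerTexture    = BaseMipBytes * 4.0 / 3.0;          // ~2.67 MB with mips
constexpr double TotalGB       = PerTexture * 400.0
                               / (1024.0 * 1024.0 * 1024.0);        // ~1.04 GB for 400
```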

I second the suggestion to check texel density. If you (almost) never use the largest mip level, you can cut a texture’s memory use to roughly a quarter by dropping one level.

But yeah, just fiddle with the LOD Bias first, since it’s non-destructive.

Thanks everyone for the responses. Could you elaborate a little on fiddling with the LOD Bias? Does that mean I don’t have to manually resize all the textures? What are some things I could try that would be effective? Like, if I have a small object with a 2K map, do I change the LOD Bias to 2 or something but leave the 2K map in the content browser?

Also - I have 400 total textures but only about 90 per level. The GPU doesn’t take on all 400 right away, right? Only level by level? So is 90 per level that bad?

Thanks!

Yeah, exactly. You can set the LOD Bias to 1 or 2 and immediately see the effect in the info panel of the texture properties editor. You should also see an immediate change in the 3D viewport if the texture you’re editing is in view. So you can leave the 2K textures in at the small cost of slightly longer editor loading times. This gives you the flexibility to change the LOD Bias per texture or system-wide if you wish.
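And if you ever want to batch it instead of clicking through textures one at a time, LOD Bias is just a property on the texture asset, so an editor-only snippet along these lines should do it (a rough sketch; the function name and the 2048 threshold are made up for illustration):

```cpp
#include "Engine/Texture2D.h"

// Rough editor-only sketch: raise the LOD Bias on every large texture in
// bulk. BiasLargeTextures and the 2048 threshold are made up for illustration.
void BiasLargeTextures(const TArray<UTexture2D*>& Textures)
{
    for (UTexture2D* Tex : Textures)
    {
        if (Tex && Tex->GetSizeX() >= 2048)
        {
            Tex->LODBias = 1;        // renders/streams as if one mip smaller (2048 -> 1024)
            Tex->UpdateResource();   // rebuild the resource so the change takes effect
            Tex->MarkPackageDirty(); // flag the asset so it gets saved
        }
    }
}
```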

Also, the GPU will generally only load what it needs, so it won’t load all 400 textures unless all of your levels are resident in memory at the same time. In that case streaming will be your friend, and it’s another good habit to get into: only stream in what you need when you need it, and get rid of it once you no longer need it.

Finally, “bad” is a relative term. “Wasteful” and “unnecessary” are more useful in the context of development. :wink:

Thanks so much, this is super helpful and hard-to-come-across info. Also the helpful vocab terms :slight_smile:
I don’t have a large open world… the levels load one by one as the player goes through the game. Would one still need to set up some sort of streaming if this is the case? I know there’s some way to see which textures are in memory, but I’m not clear on how to implement that and understand it…

It sounds like you would still need some kind of sequential streaming for your game; how it’s implemented really depends on you. My experience is mostly based around having a master persistent level that handles the loading of other persistent levels. So game_P would load in Level1_P (the names are obviously just examples) as well as any other game-persistent levels you need. Level1_P in turn is a controller level for the streaming of other levels, including ENV, CINE, Collision, SND, etc. (or whatever works best for your team or workflow). When you finish level 1, game_P unloads it and loads Level2_P, and so on. This is a simplified model but you probably get the idea.
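In Blueprint this is just the Load/Unload Stream Level nodes; in C++ the equivalent looks roughly like this (a sketch only; AMyLevelDirector and the level names are placeholders, and the exact UnloadStreamLevel signature varies a little between engine versions):

```cpp
#include "Kismet/GameplayStatics.h"

// Sketch of the master-persistent idea: swap one streamed level for another.
void AMyLevelDirector::SwitchStreamedLevel(FName OldLevel, FName NewLevel)
{
    FLatentActionInfo Unload;
    Unload.CallbackTarget = this;
    Unload.UUID = 1;
    UGameplayStatics::UnloadStreamLevel(this, OldLevel, Unload,
        /*bShouldBlockOnUnload*/ false);

    FLatentActionInfo Load;
    Load.CallbackTarget = this;
    Load.UUID = 2;
    UGameplayStatics::LoadStreamLevel(this, NewLevel,
        /*bMakeVisibleAfterLoad*/ true, /*bShouldBlockOnLoad*/ false, Load);
}
```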

You can use STAT TEXTUREGROUP to see how much memory your various texture groups are consuming (I’m not sure whether this takes cooking into consideration, as I don’t use it much myself these days). You can also use the statistics browser to have a look at Texture Stats, which will be enormously helpful to you.

I guess switching levels is another thing I’ve never come across any best practices for…
So having a master persistent level, with your other persistent levels loaded and unloaded from the master via “Load Stream Level”, is less expensive than just opening persistent level after persistent level with the “Open Level” node? Is that because this way you can use the “Unload Stream Level” node? So when you start the game, some things in all your persistent levels are loaded unless you specify to unload them?

Thanks so much this is very helpful!

Nah, scratch what I said before…

Just chatted with some colleagues about this and solidified some thoughts on it… this is by no means the only way to do it, but it seems like a reasonable solution.

It’s pretty different to how I achieved the same thing in UE3, but basically: use the game instance to centralise your loading logic, and handle level transitions in your persistent levels by talking to the game instance. You could try to use the game mode instead, but that’s trickier since it gets reset during level transitions.
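For what it’s worth, the game instance version can be as small as this (a minimal sketch; UMyGameInstance and TravelToLevel are hypothetical names, the point being that the game instance survives Open Level while the game mode doesn’t):

```cpp
#include "Engine/GameInstance.h"
#include "Kismet/GameplayStatics.h"
#include "MyGameInstance.generated.h"

// Minimal sketch: the game instance survives level transitions, so it can
// own the loading logic. UMyGameInstance and TravelToLevel are placeholders.
UCLASS()
class UMyGameInstance : public UGameInstance
{
    GENERATED_BODY()

public:
    UFUNCTION(BlueprintCallable, Category = "Levels")
    void TravelToLevel(FName LevelName)
    {
        LastLevelName = CurrentLevelName; // remember where we came from
        CurrentLevelName = LevelName;
        UGameplayStatics::OpenLevel(this, LevelName);
    }

    UPROPERTY(BlueprintReadOnly, Category = "Levels")
    FName CurrentLevelName;

    UPROPERTY(BlueprintReadOnly, Category = "Levels")
    FName LastLevelName;
};
```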

It’s a bit different to what I had in mind, and it’s simplified for tutorial purposes, but this tutorial I found should get you part of the way.

Have you looked into texture atlasing? I hear it’s easier on the system to reference one texture for multiple objects than a separate texture for each object, even accounting for the larger texture size.
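The win is mostly fewer texture binds and better batching. Conceptually it’s just a UV remap into the atlas cell, something like this sketch (AtlasUV, CellIndex, and the 2x2 grid are made up for illustration):

```cpp
#include "Math/Vector2D.h"

// Sketch of the atlas idea: remap an object's 0-1 UVs into its cell of a
// 2x2 atlas. In practice this usually lives in the material or the UV layout.
FVector2D AtlasUV(FVector2D LocalUV, int32 CellIndex)
{
    const int32 GridSize = 2;                 // 2x2 atlas = 4 sub-textures
    const float Scale = 1.0f / GridSize;
    const FVector2D Offset((CellIndex % GridSize) * Scale,
                           (CellIndex / GridSize) * Scale);
    return LocalUV * Scale + Offset;
}
```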

This looks great! It looks like Merge Actors has been released with 4.12. I will try that out for sure with some things.
@ChrisWillacy - thanks so much for that tutorial and all the help, it’s super helpful. Interesting concept and I’m going to try it out. I’m having trouble understanding why level streaming, in a game where you have totally different levels, is less expensive than just using Open Level as you go through the game. When I put all my levels in a master persistent level it’s just one big mess, which I understand is OK, but it looks like it’s going to be a lot of work to get it all configured correctly, and I’m having a hard time understanding why, performance- and efficiency-wise, this is a less wasteful option than just using the Open Level node. I know it seems the best way to go, as I rarely see anyone use Open Level… it’d just be helpful to understand why.

Thanks!!

Maybe I didn’t explain very clearly so I’ll briefly go through both options in a very high level way.

Level streaming and “Open Level” achieve the same goal: letting developers load into memory only what they need. You’ll generally need to use both unless you’re making a very small game, and it sounds like your game is a candidate for both.

Level streaming allows you to add or remove content from the currently loaded persistent level. For example, you’ve reached the gate in the volcano level and you’ve airlocked the player. You can now load in the next section of data (env/LD/sound/cine/etc.) while you display a cutscene, and remove the previous section from memory. You use this to do a kind of rolling streaming so that you never run out of memory while playing. Depending on your platform and memory usage you may have to do this a lot or a little.

Open Level is used when you want to load a discrete set of level data. For example, you just finished your volcano level with all its environments, post-processing, effects, sounds, logic, etc., and now you’re moving on to the ice level with all its unique data and setups, so having everything encapsulated by a persistent level can be hugely convenient. Obviously it means the old set of data is completely unloaded.

You can have your volcano and ice levels under one persistent level, of course. But this presents extra challenges (such as lighting, post-process blending, etc.), longer load times every time you load it in the editor, and more bottlenecks (multiple developers wanting to use the same shared files). It can get pretty messy if you don’t split up your levels, so it’s common for them to be split that way.

I hope this is clearer. TBH, if you’re a hobbyist/enthusiast, I wouldn’t put a high priority on this. If you’re actually trying to get this onto a platform and you’re getting memory crashes, it might become more important to you. If you’re targeting PC, just keep in mind that not everyone has a Titan X, so aim for the middle ground and test appropriately.

Thanks so much, you’ve been so helpful. I am trying to get this on a wide range of PCs and Macs; probably the lowest end would be a newish iMac or an older-ish gaming PC. I’m not using any lights… all emissive materials, which helps.
Over the weekend I implemented the level streaming method from that tutorial. I like it a lot, as it lets me use a loading screen, and I think it will be easier to implement a saved-game system this way. Working on just one level is a little tricky, as I have to go through the whole stream, but I put in some shortcuts to use while developing.
I also messed around with the texture LODs and shrunk my build from 3 gigs to 2.6, which is nice… hopefully I can get it down even more. I’m going to do the Merge Actors pass later as well. Gotta work on poly count now, too.

Thanks again everyone for the help!

No problem, good luck with your game.