Character model poly count

As far as character models go, how many polygons are too many? I’m mostly used to making models for still images and video, so my characters tend to have pretty high polygon counts.

Take a look at this thread: https://forums.unrealengine.com/showthread.php?18120-Recommended-poly-count-for-player-models-and-weapons&highlight=character+count

Some games have characters with as many as 100,000 polygons. Half-Life (my personal all-around benchmark) has characters at around 5-6k, and they still look great.

It depends heavily on what kind of system you are targeting. If you are going for high-end PCs and current-gen consoles (i.e. PS4 and Xbox One), you can have around 100k without a problem. For previous-gen consoles and mid-range PCs I would target 40k. For high-end mobile devices such as the iPhone 5s and Galaxy S4 or better, aim for around 20k. Anything older on mobile varies widely by hardware, but you’d be smart to stay under 2k.
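For reference, here are those budgets as a quick lookup sketch in Python; the numbers and platform labels are just the ballpark estimates above, not official engine limits:

```python
# Rough per-character triangle budgets from the estimates above.
# Ballpark figures only -- not official engine limits.
TRI_BUDGETS = {
    "high_end_pc_ps4_xbo": 100_000,
    "mid_range_pc_last_gen": 40_000,
    "high_end_mobile": 20_000,
    "older_mobile": 2_000,
}

def fits_budget(tri_count: int, platform: str) -> bool:
    """Check whether a character mesh fits the rough budget for a platform."""
    return tri_count <= TRI_BUDGETS[platform]

print(fits_budget(40_000, "mid_range_pc_last_gen"))  # True
```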

Keep in mind that all game engines render triangles, so if you have 50k quads in your model, it will become 100k tris on import to UE. When anyone says “polys” in a UE context, they mean tris (whereas “polys” usually means quads in models used for film).
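The arithmetic is trivial but worth baking into any budgeting script; a minimal sketch, assuming an all-quad mesh (n-gons triangulate into even more tris):

```python
def imported_tri_count(quad_count: int) -> int:
    """Each quad splits into two triangles on import (all-quad mesh assumed)."""
    return quad_count * 2

print(imported_tri_count(50_000))  # 100000 tris once imported into UE
```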

The trooper in the Matinee fight scene pack is about 40k tris.


Well, you should include LODs, which is always best practice, but the problem of pushing polygons was solved a long time ago, even on older video cards. The rule of thumb these days is: if an object needs to be smooth and round, make it smooth and round, and key elements like heads should not look like they were bolted on with a wrench. In other words, optimization is still good, but over-optimization is bad if we’re talking about next gen.
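If it helps to picture what LODs buy you, here’s a minimal, engine-agnostic sketch of picking a LOD by distance. Note that UE actually switches LODs by screen size, and the thresholds and tri counts below are made up purely for illustration:

```python
# Hypothetical LOD table: (max distance in units, triangle count).
# UE switches LODs by screen size rather than raw distance; this
# just illustrates the idea of trading detail for distance.
LODS = [
    (1_000, 40_000),   # LOD0: full detail up close
    (3_000, 15_000),   # LOD1
    (8_000, 5_000),    # LOD2
]

def pick_lod(distance: float) -> int:
    """Return the index of the first LOD whose range covers the distance."""
    for i, (max_dist, _tris) in enumerate(LODS):
        if distance <= max_dist:
            return i
    return len(LODS) - 1  # beyond the last threshold, use the coarsest LOD

print(pick_lod(2_500))  # 1
```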

It would be nice, though, to have a complete overview of what forms of optimization are built into the engine, as a lot of things that used to be done manually are now done in code.

On the other side of the coin.

I’d be a bit more concerned about things like excessive draw calls. T&L hardware rendering is still a bit of a bottleneck, and it does tend to stand out when overuse is the cause, but so far it seems to me that you would have to do something seriously wrong and abusive, well beyond a reasonable amount of this and that, before it becomes a problem.
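For intuition on why draw calls bite before polygons do: each visible mesh costs roughly one draw call per material section. A back-of-envelope sketch with made-up counts (real engines batch and instance, so treat it as a worst-case upper bound):

```python
def estimate_draw_calls(meshes: list[tuple[int, int]]) -> int:
    """Rough estimate: one draw call per material section per mesh instance.

    Each entry is (instance_count, material_sections). Real engines batch
    and instance, so this is a worst-case upper bound, not a measurement.
    """
    return sum(instances * sections for instances, sections in meshes)

# e.g. 50 props with 1 material each, plus 10 characters with 4 materials each
print(estimate_draw_calls([(50, 1), (10, 4)]))  # 90
```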

Thank you everyone for your replies!
40k triangles should be plenty for my purposes. What about textures and normal maps, though? Again, I am used to working with some pretty high-resolution materials within Maya. I should also note that I am targeting mid-range PCs and possibly next-gen consoles, and possibly the Wii U if support is ever added for it. But as far as textures go, what is considered “standard” nowadays?

2048×2048 works well with little impact that I’ve noticed, but once again you can afford to make things look really good up close and then use lower resolutions on the LODs. Again, it’s more about the number of draw calls than about resolution, and the more you can fit onto a single texture atlas, the better.
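To put 2048×2048 in perspective, here’s a quick back-of-envelope memory calculation, assuming uncompressed RGBA8 (real game textures are usually block-compressed down to a quarter of this or less):

```python
def texture_memory_mb(size: int, bytes_per_pixel: int = 4, mips: bool = True) -> float:
    """Approximate GPU memory for a square texture.

    A full mip chain adds roughly one third on top of the base level
    (1 + 1/4 + 1/16 + ... converges to 4/3).
    """
    base = size * size * bytes_per_pixel
    total = base * 4 / 3 if mips else base
    return total / (1024 * 1024)

print(round(texture_memory_mb(2048), 1))  # ~21.3 MB uncompressed with mips
print(round(texture_memory_mb(1024), 1))  # ~5.3 MB -- each step down quarters the cost
```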

Normal maps are just one of those things you have to add and judge by the result, as there is no real practical way to know if one is going to do the job for you until you see it.

The Cadillac of materials, though, is Substance-style procedural or layer-based materials, as you can build up higher-quality materials using much smaller support images, which is far superior to relying on high-resolution texture images.

Good overview of layering materials: https://www.youtube.com/watch?v=R7WNcUotwSQ
The other material element to look at is instanced materials, where a single parent material can be used to create variations of the same texture without the performance hit of additional unique materials (i.e. it’s a free lunch).
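Here’s a minimal sketch of the idea behind material instancing, written as generic Python rather than the actual UE API: the parent defines the expensive shader once, and each instance only overrides cheap parameter values:

```python
# Generic illustration of material instancing -- NOT the UE API.
# The parent holds the (expensive) shader; instances override parameters only.
parent_material = {"shader": "character_master", "tint": (1, 1, 1), "roughness": 0.5}

def make_instance(parent: dict, **overrides) -> dict:
    """Create a cheap variation: same shader, different parameter values."""
    return {**parent, **overrides}

red_trooper = make_instance(parent_material, tint=(1, 0.2, 0.2))
worn_trooper = make_instance(parent_material, roughness=0.9)
```

The win is that the costly shader work is shared; each variation is effectively just a parameter swap.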

That’s a long way of saying that, in short, image-based materials will be out and procedural ones will be in, which makes a lot of sense: it’s a technique that has been used in 3D editing in general for a very long time, and the resource cost is incredibly low.

Overall, material-wise, it’s difficult to mess up performance. It seems to me that technology in general is getting to the point where fast is fast enough for most needs, and the software and hardware now do most of the optimization heavy lifting that once had to be done manually.