Hi. I have a problem with texturing.
My entire model has ~17 million polygons. Here is a part of it.
I made a low-poly version of it.
Texture coordinates were set and a Normal Map was generated in ZBrush.
And here is a UE4 rendering of it with the Material settings.
So how do people work with Normal Maps, or is it forgotten tech? I have always had problems bringing my HD meshes into a game engine, especially with Normal Maps. No matter where you generate the maps, ZBrush or Blender, the results are pretty much the same. It always looks so far away from the original work, losing lots of detail. And no engine can handle something like 17 million polygons.
A well-made high poly, low poly, and normal map should look almost identical in UE4 until you start zooming in.
Normal maps aren't handled the same way universally; there's no standard that every application, baker, and game engine follows, so finding a workflow that works for you can be a clunky process.
Make sure you are using a baking program that supports MikkTSpace (it's the tangent basis for normal maps that UE4 uses). xNormal, Substance Painter/Designer, and Blender should all support it; ZBrush definitely doesn't, and I wouldn't suggest baking normal maps in ZBrush (except for tiling textures, which are okay).
Triangulate your lowpoly mesh before baking.
Check the smoothing groups or hard edges on your low poly before baking. You can have hard edges or smoothing splits where there are UV splits, but it's generally easiest to put everything in one smoothing group / make sure there are no hard edges; that's how I would approach testing your model. Smoothing groups for games are a huge separate topic.
Again, make sure you are baking in an application that supports MikkTSpace, and check whether there's any documentation on using that application with UE4. For example, in Substance, "compute tangent space per fragment" needs to be checked; for xNormal, "Compute Binormal in the Pixel Shader" needs to be checked in the MikkTSpace settings.
That should fix some of the artifacts I'm seeing on your UE4 model; there are some inverted areas, seams, and areas that are not shading correctly.
So MikkTSpace is the key. I'll look into it. Thanks for the fast reply.
If you’re losing detail, then it may be the resolution of the normal map.
Exporting from ZBrush with triangles did the trick in xNormal.
In UE4 it appears somehow inverted, and I don't know why. Any ideas?
Try inverting the green channel of the normal map, there’s an option for it in xNormal.
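For anyone wondering what "inverting the green channel" actually does: tangent-space normal maps come in two conventions (often called OpenGL-style, green = up, and DirectX-style, green = down, which is what UE4 expects), and converting between them is just `g -> 255 - g` per pixel. A minimal sketch on raw 8-bit RGB tuples; in practice you'd use the baker's swizzle option or an image editor rather than code.

```python
# Flip the green (Y) channel of a tangent-space normal map to convert
# between the two common conventions (green-up vs. green-down).
# Works on a flat list of (R, G, B) pixel tuples for illustration.

def flip_green(pixels):
    """Invert the G channel of each (R, G, B) pixel: g -> 255 - g."""
    return [(r, 255 - g, b) for (r, g, b) in pixels]

# A "flat" normal encodes as (128, 128, 255); flipping leaves it visually
# unchanged (128 -> 127), while tilted normals flip their Y direction:
print(flip_green([(128, 128, 255), (100, 200, 230)]))
# -> [(128, 127, 255), (100, 55, 230)]
```

This is also why a map baked for the wrong convention looks "inverted": bumps shade as dents and vice versa, exactly the artifact described above.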
OK, now it works. It looks pretty good considering it's a small portion of the model. Thanks, everyone.
First of all, this model looks AWESOME!
When you use hard edges on a low-poly mesh, you actually tell the computer to split the mesh and consider the separated pieces as brand new meshes altogether. The vertices get duplicated, and you can have verts facing in different directions right next to each other. xNormal will bake down your high-poly 17,000,000-triangle model to a 1k, 2k, or 4k texture. 2k is only about 4 million pixels, so you're going to lose detail regardless.
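The texel-budget point above is easy to check with arithmetic: even a 4k map has fewer texels than that sculpt has triangles, so some detail loss is inherent to baking no matter which tool you use.

```python
# Rough budget check: texels available in a normal map vs. triangles
# in the 17M-poly source sculpt mentioned in the thread.

SOURCE_TRIS = 17_000_000

for size in (1024, 2048, 4096):
    texels = size * size
    print(f"{size}x{size}: {texels:,} texels "
          f"({texels / SOURCE_TRIS:.0%} of {SOURCE_TRIS:,} triangles)")
```

A 2048x2048 map comes out to 4,194,304 texels, roughly a quarter of the triangle count, and even 4096x4096 (16,777,216 texels) falls just short of one texel per triangle.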
Low poly models rely on basic blending of vertex normals from the model. It’s pathetically basic, actually: the normal of one vertex is linearly blended to the adjacent ones. Yes, the blending from one vertex to another is linear. This is why a round object does not look perfectly round, but instead pinched at the corners. The more vertices, the more accurate your model’s normals will be, and thus the higher quality your bake will be as well.
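The linear blending described above can be sketched in a few lines. Note what happens midway between two normals that differ a lot: the linearly blended vector shrinks before renormalization, which is the geometric reason curvature looks "pinched" between sparse vertices. This is an illustrative sketch, not any engine's actual shader code.

```python
import math

def lerp_normal(n0, n1, t):
    """Linearly blend two vertex normals, then renormalize (as a pixel
    shader effectively does). Returns the unit normal and the length of
    the raw blended vector before renormalization."""
    v = tuple(a + (b - a) * t for a, b in zip(n0, n1))
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v), length

# Two normals 90 degrees apart (a curved corner approximated by few verts):
n0 = (1.0, 0.0, 0.0)
n1 = (0.0, 0.0, 1.0)
mid, raw_len = lerp_normal(n0, n1, 0.5)
print(mid)      # the halfway normal after renormalizing
print(raw_len)  # ~0.707: the blend shrank well below unit length
```

With more vertices, adjacent normals differ less, the blended vector stays close to unit length, and the interpolated shading tracks the true curvature more closely, which is why a denser low poly bakes better.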
The bake also does not take any Z-value difference into consideration, only X and Y in tangent space. You need to render a heightmap from ZBrush and use Parallax Occlusion in Unreal to get a more accurate result. But even then you're not physically correct, because parallax occlusion, in its current state, does not silhouette properly. Pieces that stick out will appear to stick out within the surface of the model, but at a glancing angle you won't see that piece physically lift off the surface.
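For intuition on what the heightmap buys you: the core of parallax mapping is just shifting the sampled UV along the tangent-space view direction, proportional to the height value. Real parallax occlusion mapping ray-marches the heightmap in steps, but the basic offset is this. A simplified sketch under those assumptions, not UE4's actual shader.

```python
# Simple parallax (offset) mapping: shift the UV by the tangent-space
# view direction, scaled by the sampled height. POM refines this by
# stepping through the heightmap, but the principle is the same.

def parallax_uv(u, v, view_ts, height, scale=0.05):
    """Shift (u, v) along the view direction in tangent space.

    view_ts: (x, y, z) view vector, z pointing out of the surface.
    Glancing views (small z) give larger offsets, which is also where
    this simple version, and POM silhouettes, break down.
    """
    vx, vy, vz = view_ts
    offset = height * scale
    return (u + vx / vz * offset, v + vy / vz * offset)

# Straight-on view: no shift. Glancing view: a large UV shift.
print(parallax_uv(0.5, 0.5, (0.0, 0.0, 1.0), height=1.0))
print(parallax_uv(0.5, 0.5, (0.8, 0.0, 0.2), height=1.0))
```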
And just so you know, UE4 is a physically based renderer. Plugging 40 into Specular won't do anything. If you want your model to be shiny, I suggest lowering the Roughness to somewhere between 0.1 and 0.2, maybe raising the Metallic to 0.7-1, and leaving Specular alone.
Some additional notes:
How close to the thing will the player be in the game?
How fast will it be moving/animating when looking at it?
A screen only has between 1 and 8 million pixels (with 2 million being typical), so a 17-million-triangle mesh is way overkill, even if the object covers a large percentage of the screen!
Try less than 50,000 polys, even for things that cover a quarter of the screen.
(Unless your game is about detective work that requires meticulous investigation of objects in close-up, or it’s a museum-exhibit type of application)
Separately, the resolution of the normal map absolutely matters. You should be rendering 4096x4096 normal maps if you care about high detail. Although, again, if you have good texture space usage, only the highest-resolution displays will be able to show that kind of detail, so cutting it down to 2048x2048 for the game is usually fine.
Other maps add to the sense of detail, too. For example, ambient occlusion maps are quite important! Make sure those get rendered out, too.
Here is the whole model. In total it contains 17 million polygons. I marked the separate parts.
In ZBrush, every part is handled with ZRemesher, which automatically makes a low-poly version of the high poly. The ZRemesher Target Polygons Count is set to 5, which means every low-poly part will have 10k-30k polys depending on the form of the part. Then the model is taken to Blender, where the whole model is set to Smooth, without any specific smoothing-group work. Every part will have 2048x2048 or 4096x4096 textures. Do you guys think UE4 can handle three 4096x4096 textures per part: Normal Map, Diffuse, and AO?
The game itself will be an action shooter like Hexen, Quake 2, DOOM, and so on. A little bit of Dark Souls to add difficulty and interesting enemy maneuvers, but no RPG elements.