I wanted to create skin shading inside Unreal. I have my low poly model imported into the engine, and after placing it in the scene (sorry if I'm using the wrong terms, attaching screenshots) I started seeing some issues on the face (at least). It looks like flipped normals, but I don't see the same thing in Maya or the default 3D viewer. Please help.
Many 3D modeling packages treat faces as double sided, and particularly when you are doing things like extruding and sculpting, the software can kind of "forget" which way a face is supposed to be oriented. Unless you purposely turn on its version of "show face normals", the viewport and texturing will usually render those faces as double-sided planes, and outside of very intricate, highly detailed textures you will probably not notice.
Many if not most game engines, to save on computation time and to optimize culling, treat almost all faces as single sided. A face oriented the "wrong" way simply isn't drawn, so it looks transparent/invisible.
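To make that single-sided behavior concrete, here is a minimal plain-Python sketch (no engine API, all names made up for illustration) of the backface test a real-time renderer effectively performs: a face whose normal points away from the camera is culled, so a "flipped" face just disappears in-engine.

```python
def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def sub(a, b):
    return (a[0]-b[0], a[1]-b[1], a[2]-b[2])

def dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def is_backface(v0, v1, v2, camera_pos):
    """True if the triangle's winding makes it face away from the camera."""
    normal = cross(sub(v1, v0), sub(v2, v0))   # direction depends on winding order
    to_camera = sub(camera_pos, v0)
    return dot(normal, to_camera) <= 0.0       # facing away -> culled (invisible)

# Same triangle, two winding orders: one is drawn, the other is culled.
cam = (0.0, 0.0, 5.0)
print(is_backface((0, 0, 0), (1, 0, 0), (0, 1, 0), cam))  # False -> visible
print(is_backface((0, 0, 0), (0, 1, 0), (1, 0, 0), cam))  # True  -> culled
```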
At least in Blender there is a view option for a color-coded face orientation overlay; the best I can find for Maya is the line/"hair" visualization (Display->Polygons->"Face Normals"). The catch with that method is that you kind of have to go inside the model to see whether any of the lines point inward.
If this is the issue, the fix is similar in most 3D modeling programs: select the offending faces and flip/reverse their normals (in recent Maya versions, Mesh Display->Reverse).
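If you'd rather do it by script, here is a hedged Maya Python sketch of the same Reverse operation. It assumes `cmds.polyNormal` with `normalMode=0` performs the reverse on the current face selection, so double-check the flag values against your Maya version's docs.

```python
import maya.cmds as cmds

# Select the inverted faces first, then run this.
faces = cmds.ls(selection=True, flatten=True)
if faces:
    # normalMode=0 is assumed here to mean "Reverse" (verify in your Maya docs).
    cmds.polyNormal(faces, normalMode=0, constructionHistory=True)
else:
    cmds.warning("Select the faces with flipped normals before running this.")
```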
Another potential issue for "non-contiguous" models is a secondary form of Z-fighting/clipping/interpenetration, where some parts of the model/mesh technically overlap themselves and the engine is trying its best to resolve that.
This can be the result of a few things. Relative transforms not being normalized before export (often this is called "applying" or "freezing" the transform) is a common one, with scale being the biggest culprit, though a position where the distance to the next nearest face is too small will produce similar artifacts.
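For the transform side of that, here is a hedged Maya Python sketch of freezing ("applying") transforms on the selected objects before export. `makeIdentity` and `delete(constructionHistory=True)` are standard maya.cmds calls, but verify the flags against your version.

```python
import maya.cmds as cmds

# Freeze translate/rotate/scale on every selected transform, then drop history
# so the baked result is what actually gets exported.
for node in cmds.ls(selection=True, transforms=True):
    cmds.makeIdentity(node, apply=True, translate=True, rotate=True, scale=True)
    cmds.delete(node, constructionHistory=True)
```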
The steps to fix this can actually get into modifying the model: you want to avoid faces directly overlapping if at all possible, even if that means removing what would normally just be "occluded geometry",
maybe merging multiple parts together if possible.
The final, and honestly most counter-intuitive, step is to move the outermost part slightly inside the part it sits in front of (most rendering engines can handle things being inside of or slightly in front of each other, but they have real problems when surfaces occupy "exactly" the same space).
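As a toy illustration of why "exactly the same space" is the problem, here is a tiny plain-Python sketch (the function name is made up): two surfaces that project to effectively the same depth have no stable winner in the depth test, while even a small offset resolves it cleanly.

```python
def depth_test_winner(depth_a, depth_b, eps=1e-7):
    """Crude stand-in for a per-pixel depth comparison between two surfaces."""
    if abs(depth_a - depth_b) < eps:
        return "unstable (z-fighting)"   # result flips pixel to pixel / frame to frame
    return "A" if depth_a < depth_b else "B"

print(depth_test_winner(0.5000000, 0.5000000))  # unstable (z-fighting)
print(depth_test_winner(0.5000000, 0.5001000))  # A wins cleanly
```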
Hey, thanks for your reply. Sorry, I forgot to mention that this is solved; here's my solution. It wasn't about normals, but about the size of the mesh. At first I tried applying the scale in Maya and it didn't work. Then I scaled it up in Maya by 10x or more, and that worked. Kind of strange, but it seems to be something with how Unreal imports normals on very small meshes. All import settings were at default, but scaling it up in Unreal is not the same as scaling it up in Maya. I'm a newbie to UE, so I didn't know about these technical aspects. Thanks to everybody.
I honestly don't remember when my model became that small, maybe after ZBrush or retopo.
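In case it helps anyone hitting the same thing, here is a hedged Maya Python sketch for checking the scene unit and the actual size of a selected mesh before export. `currentUnit` and `exactWorldBoundingBox` are standard maya.cmds calls, and Unreal treats 1 unit as 1 cm.

```python
import maya.cmds as cmds

# What linear unit is the scene in? (Maya's default is 'cm', same as Unreal.)
print("Scene linear unit:", cmds.currentUnit(query=True, linear=True))

sel = cmds.ls(selection=True)
if sel:
    xmin, ymin, zmin, xmax, ymax, zmax = cmds.exactWorldBoundingBox(sel)
    print("Mesh size (scene units):", xmax - xmin, ymax - ymin, zmax - zmin)
    # A character that reads as only a few units tall here will come into Unreal
    # only a few centimeters tall -- scale it up (and freeze the scale) before export.
```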