Newbie UV textures not working correctly

I’m new to UE 4.25, but not new to CG rendering and animation. Still, UE has me scratching my head: it doesn’t seem consistent in what it does, and simple tasks that take me two minutes in other CG programs feel overcomplicated. I know some of this comes down to learning a different approach from what I’m used to, but here is my newbie problem.

I’ve created an object in my 3D software: four sides plus a top and bottom. I’ve given each side a UV map and applied a texture map to each. It renders fine in my 3D software, but when I send it over to UE some of the UV maps get messed up.
The front of the object works fine: I can drag the texture onto the front of the object in UE and it maps correctly, and so does the back. However, the left side, right side and top are completely wrong. In the viewer it looks correct, but in the layout it looks like the UV coordinates have been changed, as the image is drawn as stripes.
Can anyone tell me what I’m doing wrong? Is there a simple way to change the UV mapping on a particular face?

I agree with your observations about Unreal. The problem has me thinking about how Unreal utilizes texture space by default, which could also be how it processes UVs: basically, it starts in tangent space.

UE’s coordinate system also differs from certain external apps. For example, the forward-facing (default camera-facing) axis is positive X in Unreal, while in other apps it is positive or negative Y. That may not seem like a problem, since 3D models look correct for some imports and not for others, all with the same reference coordinate systems between the programs. However, it depends on the mesh, and on whether the material/texture is also imported rather than made in Unreal’s material editor. Blender’s export ‘pipeline’, for instance, defaults to Y up (vertical, which is Z in UE) and -Z forward.

In essence: when exporting, try to set the external program to Unreal’s coordinate space before importing into Unreal; in other words, do the conversion in the external program, and try not to rely on UE’s import options to transform into UE’s coordinate system. UV generation or interpretation uses the coordinate system of the resident app (the one used to generate or interpret the UVs), so a model with UVs and textures applied gets re-interpreted in UE’s coordinate space, which is tangent space by default for materials.

Someone on the forum once told me that normals don’t have anything to do with UVs. That is not always the case: normals are essential to UV generation and to rendering textures on a mesh, and ray tracing references the normals in different ways.
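To make the coordinate-space difference concrete, here is a minimal sketch of remapping a vertex from Blender’s space into UE’s. It assumes one common convention (flip Y for the right-handed to left-handed change, scale metres to centimetres); the exact remapping depends on your exporter settings, and the function name is illustrative, not an engine API.

```python
# Sketch: convert a vertex from Blender's right-handed, Z-up, metre-based
# space into UE's left-handed, Z-up, X-forward, centimetre-based space.
# One common convention: negate Y (handedness flip) and scale m -> cm.

def blender_to_ue(vertex):
    x, y, z = vertex
    return (x * 100.0, -y * 100.0, z * 100.0)

# A 1 m cube corner at (1, 1, 1) in Blender lands at (100, -100, 100) in UE.
print(blender_to_ue((1.0, 1.0, 1.0)))
```

If a mesh looks mirrored or textures appear striped along the wrong axis after import, a mismatch in exactly this kind of remapping (axis swap, handedness, or scale) is a likely suspect.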

Sorry about that diatribe, but this has to be correlated with the coordinate systems differing between applications. I’m not an expert, yet I see this kind of problem often and run into similar issues myself; it has even occurred on meshes and textures that were not imported into Unreal and are in the starter files. Besides setting the coordinate references correctly from export to import, my main suggestion is to try different approaches, even if one appears inconsistent with another. For example, with Blender:

- Set Blender’s export coordinate reference to +X forward and +Z up, and during UE import disable the ‘transfer to UE coordinate system’ option.
- If that doesn’t work, or something is off, try again with the UE option enabled.
- Additionally, try changing the scale in the external app to match UE’s scale, and import without enabling UE scale in the import dialog.
- If the texture still isn’t applied properly to all sides, enable UE scale on the same exported file.
- Then try leaving the external app at its own scale and enabling the transform-to-UE-scale option on UE import, and so on.

I think this is something that should be addressed in UE. I’ve spent hours changing the way I texture and UV something in LightWave 3D; I send it over via UE Bridge, and the textures come across but are not applied, so I drag them onto the object manually. Two will work and the rest seem to map with the wrong coordinates. Yet if you view a texture in UE’s texture viewer, it looks correctly mapped on a cube or sphere. There also seems to be no way to correct UV coordinate issues inside UE.
I can’t get any further in UE 4.25 simply because I can’t map a simple texture to the roof of a model. It’s mapping the UV along X or Z instead of Y.
Coming from other rendering programs that follow a logical pattern, UE seems to fight the user and make things more difficult. You seem to have to jump through hoops to do simple stuff.
I know I’m a newbie and have to learn a different system, but I haven’t yet found a solution online to this problem that would make for an easier workflow.