Quixel+Blender+UE4 workflow

Hey guys, looking for advice here. I've been modeling for a while and am producing geometry along the lines of what I want for my game. I can export it as FBX and import the grey models into UE4. I'm looking for advice on a solid overall workflow. Right now all I get from the steps below is a white brick-textured cube in UE4.


  1. Start with the Blender default cube which seems to be 2x2x2m.
  2. Go into Bridge, export a material to Blender. Let's say a brick material.
  3. Go to render mode; I see the brick is applied.
  4. Export to FBX. The only thing I change is I select ONLY the Mesh 'Object Type'.
  5. In the Content Browser, Import. I get:
  • Brick Facade Material
  • TextureTest Model
  • ukxlehdo_2k_Normal
  • ukxlehdo_2k_Roughness
    No albedo texture, so that's probably why my cube is white.

What's going on with the albedo file? Also, the AO file is in the material in Blender's shading editor, but it gets dropped as well. What am I doing wrong?

Extra Question if you have time:

  • I noticed I needed to 'Join' all the objects so that I imported just one model rather than dozens of parts. Is this what I should be doing, or is there a better way?

Kind thanks!

Hi OldChippy,

So, to understand: you're taking a model made in Blender into Unreal, and it sounds like you're importing the materials/textures from Blender into Unreal, right?

I don't enjoy this workflow due to differences between the software. I typically turn off import materials/textures and, once the model is imported, apply materials made in Unreal to it.

Unreal looks at each material applied to a model and gives you a material slot for it once imported.

Joining a model is a preference; if it's easiest to keep it as one, then go ahead. Models can have one or more materials.

So, in short, make the material using those textures in Unreal, import your model without material/textures, assign the material(s) once imported. You’d be importing the textures in either case.

Kind thanks for responding. I didn't even consider adding materials in UE4 only, and you understood correctly.

I took the time to give this a try, including modifying the base mesh and then re-importing it with different geometry and materials, and UE4 handled that fine, even working out what 'was' and what now 'is' and reporting the resulting changes (though it looks like I can't edit them).

Also, as a test, I decided not to join material groups and found that I didn't end up with a model soup as I did a few months ago when I last tried this. I suspect my first attempt had no materials at all, which created the mess; this time, 260 cubes, planes and circles turned into a single model.

What I did do in Blender is create commonly named materials (Concrete, LVL, Plywood and Zincalume) as hints and assigned them in Blender. There they were just mildly different colours so I could tell them apart.

Kind thanks, this gave me a good result, and TBH I really didn't like the complexity of applying materials in Blender, as I had to render just to see them properly.

I'm not planning on doing any kind of texture painting (characters, monsters, etc.) due to my theme, but I expect that would require a different approach?

Kept playing with it…

  1. A bunch of geometry now looks like it's inside out. I double-checked in Blender; it still looks good there. Maybe the normal map has flipped normals?
  2. I have to work out where I'm going to apply UVs. Long timber beams have long, stretched textures. Will I have to apply the material in Blender so the mesh saves UV coordinates with the correct scale, etc.?

Think of materials as an identifier. If you assign Concrete, it doesn't have to be Concrete in the engine, but it will show up with this name (which you can rename). It could be as bland as Mat1, Mat2, Mat3, etc., so things can be differentiated. Exporting as FBX remembers this, which is what Unreal reads to reconcile what "was".

This does sound like flipped normals, but normal maps have no ability to affect this. It may look fine in Blender (I've not used this software), but in Maya, for example, it will look fine there too and yet be flipped once in Unreal. Game engines typically render one side of a face, whereas 3D software renders 2-sided (at least that's the default for Maya). There should be a way to turn on backface culling or 1-sided rendering; I'm not sure what it's called in Blender. That should give you an accurate representation of how Unreal will render the import.
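A tiny sketch of why a flipped face simply vanishes in a one-sided renderer. This is plain Python, not engine code, and the winding convention here is an assumption chosen for illustration:

```python
# A face is drawn only when its normal points toward the camera.
# Flip the vertex winding and the normal flips too, so a one-sided
# renderer culls the face: it looks "inside out" or invisible.

def face_normal(a, b, c):
    """Cross product of two triangle edges -> face normal (not normalized)."""
    u = (b[0] - a[0], b[1] - a[1], b[2] - a[2])
    v = (c[0] - a[0], c[1] - a[1], c[2] - a[2])
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def is_front_facing(tri, view_dir):
    """Back-face cull test: front-facing when the normal opposes the view direction."""
    n = face_normal(*tri)
    dot = sum(ni * vi for ni, vi in zip(n, view_dir))
    return dot < 0  # normal points back toward the camera

# Camera looking down -Z; this triangle's normal is +Z, so it is drawn...
tri = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]
print(is_front_facing(tri, (0, 0, -1)))                      # True: drawn
# ...but with two vertices swapped (flipped winding) it gets culled.
print(is_front_facing([tri[0], tri[2], tri[1]], (0, 0, -1)))  # False: invisible
```

A 2-sided viewport (Blender or Maya defaults) skips this test, which is why the problem only shows up in the engine.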

Models hold the UV information within them. The model geometry must describe how a 2D texture maps onto it; the material accesses that UV information. This is so the same Concrete material can be applied to numerous models, where each model's UVs define the placement of the textures used in the material.

If you are working with tiling textures, ones that can repeat endlessly without seams, you do not need to keep model UVs within the 0-1 UV space when unwrapping. They can be scaled as large as needed, and you can verify them by viewing the texture in Blender while unwrapping. This might seem tedious, and there is a workflow that standardizes the process, concerning texel density. It's quite a topic, but if it interests you, search online and get some knowledge on it. I am not saying it has to be done this way, but it might be worth knowing about. In any case, most would agree that UVs tend to be tedious. Over the years, and with many unwrapped models, I have adopted the texel density workflow, but only for models which use repeating textures. I prefer a high texture resolution for unique UV models, sizing the resolution down based on need (Unreal does this).
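The texel-density idea comes down to a little arithmetic: pick a target resolution per world meter and scale every model's UVs to hit it. A minimal sketch; the 1024 px/m target and the function name are made-up conventions for illustration, not a standard:

```python
# Texel density: scale UVs so every surface gets the same texture
# resolution per world meter, regardless of the model's size.

def uv_scale_for(face_size_m, texture_px, target_px_per_m=1024):
    """How many times the texture should tile across a face of this size."""
    return face_size_m * target_px_per_m / texture_px

# A 6 m timber beam with a 2048 px tiling texture at 1024 px/m:
print(uv_scale_for(6.0, 2048))   # 3.0 -> UVs span 0..3, texture tiles 3x
# A 1 m crate with the same settings tiles 0.5x, so density matches:
print(uv_scale_for(1.0, 2048))   # 0.5
```

This is exactly why the long beams looked stretched: their UVs spanned 0-1 regardless of length, so the texture was pulled across 6 m instead of tiling.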

Common workflow:

Model. Apply all relevant modifiers.
UV setup for texturing.
UV setup for lighting.
Name / define material zones using MI_materialname.
Export to FBX.

Import the 3D model & set up.
Color/layout with custom (non-Quixel) textures (because licensing sucks).
Export final textures; set up packed texture output.

Create a folder for the model.
Copy a material instance of the master material and name it properly.
Import the model into the proper folder; select "do not create materials" and search folders for materials.
Import the textures into the proper subfolder of the model.
Modify the material instance to use the imported textures.
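The folder steps above can be sketched as a small helper. The layout, the `textures` subfolder, and the `MI_` prefix are my assumptions for illustration, not an Unreal or Quixel requirement:

```python
# Hypothetical asset-layout helper: one folder per model, a textures
# subfolder, and an MI_-prefixed material instance name.
from pathlib import Path

def asset_paths(root, model_name):
    """Build the (assumed) per-model folder layout described above."""
    model_dir = Path(root) / model_name
    return {
        "model_dir": model_dir,                       # holds the imported mesh
        "textures_dir": model_dir / "textures",       # per-model texture files
        "material_instance": f"MI_{model_name.lower()}",  # naming convention
    }

paths = asset_paths("/Game/Geometry", "Capitel")
print(paths["material_instance"])   # MI_capitel
print(paths["textures_dir"])        # .../Capitel/textures
```

Doing this consistently is what makes the later "search folders for materials" step on import actually find the right material instance.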


You can substitute Mixer for Substance, or anything else.

You may also want to pass the model into Marmoset to generate depth/curvature maps for the other software to use.
Even ZBrush is better than Blender at generating some of the maps you need to do the coloring properly.

The benefit of Mixer so far is that it unifies the final textures.

It's a hassle and a pain to create your custom materials with your custom textures, but you only have to do it once, and you can then export your custom library.
It's even more of a pain, but you can also set up custom atlases by hacking the file format code that's in place: make or download a sample atlas, modify the definition file, and make it use all of your custom files.

It isn’t great at painting. But it does allow for some painting.

Better than having to paint normals and roughness separately in Blender anyway, since Blender doesn't yet offer a way to paint a "material" across all textures to get a unified result.

Kind thanks for the time taken to write this post, and yes, I have already found the 1-sided face problem when I tried to cheat out a few polys by using planes for thin sheet materials like plywood. Something else I learnt recently is that Unreal has a geometry editor. I watched JoeGarth using it to make a house. I haven't investigated the depth of capability in the tool, and I don't have my hopes up yet.

Kind thanks, I appreciate your post. The step-by-step workflow is exactly what I had in mind when I asked the question, though I expected someone would say something like 'Always check the blah option' or 'make this change right before you export to FBX'.

I need to investigate Mixer a bit more. There is probably a bit more to your 'I do this because' than what you mentioned, so I need to go discover what that why is, including what the Quixel licence issue is (I've got no plan of selling content, so maybe I'm clear).

You did bring to the surface a question I removed from my original post (for clarity): folder structure, both inside Unreal and on disk. I'm going to have to develop a workflow to keep assets in their respective places in the pipeline, or just store all the files in the same place (the Unreal project) and only pull the relevant files into Unreal. Then I also need to work out what I'm going to do with all the top-level directory mess that purchased products make. I haven't yet tried renaming and moving things around, but at some point I'll have to. If you have a model that works for you, I'd love to hear it. Right now I know I'm making a mess.

The default forward-facing axis in Blender is different from the one in Unreal, and I've found it influences the export > import results. Blender exports a model with its textures/material(s) according to Blender's coordinate system and referencing, so on import into Unreal there are a few options: using Unreal scale (1 cm per Unreal unit) rather than Blender's default (1 m per Blender unit); using Unreal's coordinate system (Z up) instead of the FBX default (Y up); "Force Front X Axis", which I think aligns the mesh to the X axis from its original positioning in Blender or other software (so if the front of a vehicle was facing -Y in Blender, it'll now face X in Unreal?); and selecting Import Normals or Import Normals and Tangents. The last one is tricky, I think, because it could be better to let Unreal generate normals and/or tangents once the model with materials is inside Unreal and being lit in Unreal's render space.
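As a rough illustration of the scale and handedness difference: the exact axis mapping depends on your export settings ("Force Front X Axis" and friends), so treat the mapping below as an assumption for illustration, not what the FBX exporter literally does:

```python
# Assumed mapping for illustration:
#   Blender: meters, right-handed Z-up, -Y is "front" by convention.
#   Unreal:  centimeters, left-handed Z-up.
# Negating Y converts right-handed to left-handed; x100 converts m to cm.

def blender_to_unreal(x, y, z):
    """Convert a Blender-space point (meters) to an Unreal-space point (cm)."""
    return (x * 100.0, -y * 100.0, z * 100.0)

# A point 2 m "in front" of the origin, 1 m up, in Blender:
print(blender_to_unreal(0.0, -2.0, 1.0))   # (0.0, 200.0, 100.0)
```

This is also why a model can come in mirrored or 100x too small when these import/export options disagree.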

If you don't see a "mixer" export button, do tell me; it means I haven't updated the files.

/Geometry/Architecture/Roman/Corinthian/Capitel/mi_corithinan_capitel < Material Instance
/Geometry/Architecture/Roman/Corinthian/Capitel/textures/ < Various .tga files
/Geometry/Architecture/Roman/Corinthian/textures/ < Various .tga files


And so on.