Substance, Quixel and procedurally generated textures for full 3D models

Hello,

My problem is extremely simple: I don’t want to spend time making beautiful 3D models which are artistically textured. I am artistically impaired, and it’s a huge time sink to learn all the tools (making high-poly 3D models, making the low poly, baking, editing normals, making textures, making layers, creating map upon map, such as AO, displacement, subsurface, normals, etc.)

My dream is the following: I start with a 3D model which is not textured in any way (but has a normal map, so it has been UV unwrapped). I select some faces and say “this is material 1”. I select some other faces and say “this is material 2”, etc. When I am done, I just apply a material from a database to each of these parts.

So far, it looks just like something that can be done in UE4, right?

My main problem with the current workflow is the following: applying materials only works as long as you don’t have any details. For example, when you try to apply a “brick material” to a mesh in UE4, you will observe massive discontinuities, especially where the UVs don’t match. You can see an example here:

Of course, I could spend a lot of time making careful UV unwraps (note: I wouldn’t even know where to start), but I simply don’t want to. I don’t have the time. This is the exact opposite of just saying “this is material 1”: all the details I thought I could skip, I’d be jumping right back into.

However, when I looked at Quixel’s suite, imagine how surprised I was when I saw this magnificent example. From what I understood, this is a procedurally generated material with the very interesting property of being metal painted green, except that on the “external” edges of your 3D model the paint is worn away and you can see the metal underneath. My hope is that this has been computed entirely from the 3D model.

I come here with the following questions:

  • Have I been tricked by the Quixel video? Were there textures I didn’t see that specify where the “external” edges of the 3D model are?
  • If not, does Substance do the same thing?

From what I have seen of Substance, it offers glorious procedurally generated 2D textures, but I still get my discontinuities.

One last thing I wanted to write down: I understand that there is no exact mathematical solution to mapping a 2D texture onto a 3D model (as long as the model is not homeomorphic to a plane). But we are in the realm of illusions here; I don’t care if the procedurally generated textures are hacks that blend things or whatever. I just don’t want to deal with any of that and still get a decent-looking result.

Thanks for any clarifications!

You are going to want to do a UV unwrap at the very least; otherwise the texture won’t tile or flow across your surface correctly at all.

Quixel is procedural, but you have to set it up to be that way. Basically, you render things like your normal map and AO map, and for an object made of different materials you would also create a map with solid colors defining where you want materials to go (like red is metal, green is leather, or whatever). It then uses all those texture maps to generate texture details like scratches and dirt. You still have to do UV mapping, and you still have to do some texture setup.
There isn’t any solution in any software where you can simply apply a material and not have to do anything else. There are alternatives to UV mapping like Ptex, but Ptex would not work with tiled textures.

Part of the problem you’re seeing on the sphere there is that it isn’t UV unwrapped at all. All of the meshes in the geometry tab assign UV seams to each individual face, as far as I can tell. Part of the reason for that is that you can adjust their parameters, like how many tessellations the sphere has, how many steps the staircases have, etc. If you go down to your content browser, though, and look in the shapes folder, you’ll find a different sphere. This one does have mapping data, and if you apply a material to it you’ll find that it behaves quite differently. That being said, because you can’t select single faces of that mesh, I don’t believe you’d be able to apply multiple materials to different parts of it inside of UE4’s editor.

https://dl.dropboxusercontent.com/u/12436907/satyrpics/spheres.jpg

I haven’t used Quixel, but I have used Substance. Like darthviper was saying, in applications like Substance, you need a map, usually a color map, to tell the application where to put a specific texture. Without it, the application will apply a single texture to the entire model. In Substance Designer, you can bake that out based on the model’s UV shells.

I’m fairly new to procedural texturing myself, but here’s what I understand so far. Let’s say you have your model, both high poly and low poly all set to go:

https://dl.dropboxusercontent.com/u/12436907/satyrpics/satyr_zb_thumb.jpg

So you take your low poly and unwrap it (this unwrap is terrible, by the way; I didn’t have anything better readily available):

https://dl.dropboxusercontent.com/u/12436907/satyrpics/satyr_max_thumb.jpg

And then bring that into your texturing application and do your bakes. Below is a UV to color SVG, which will be used to make color masks:

https://dl.dropboxusercontent.com/u/12436907/satyrpics/satyr_sd1_thumb.jpg

And here’s the mask. I haven’t applied any textures; I just plugged the mask straight into the diffuse channel.

https://dl.dropboxusercontent.com/u/12436907/satyrpics/satyr_sd2_thumb.jpg

From there you would use a multi-material blend and your color mask to apply multiple textures to a single model:

https://dl.dropboxusercontent.com/u/12436907/satyrpics/satyr_sd3_thumb.jpg
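The multi-material blend above boils down to weighting each material by its channel in the color mask. Here is a minimal sketch in plain Python (the material names and base colors are made up for illustration; this shows the idea, not Substance’s actual node graph):

```python
# Minimal sketch of a multi-material blend driven by a color ID mask.
# Each texel of the mask selects a material by its dominant channel
# (red -> metal, green -> leather, blue -> cloth), mirroring the
# "solid color defining where materials go" idea described above.

MATERIALS = {
    "metal":   (0.56, 0.57, 0.58),   # placeholder base colors per material
    "leather": (0.35, 0.20, 0.12),
    "cloth":   (0.60, 0.60, 0.55),
}

def blend_texel(mask_rgb):
    """Weight each material by its mask channel and mix the base colors."""
    r, g, b = mask_rgb
    total = (r + g + b) or 1.0
    weights = {"metal": r / total, "leather": g / total, "cloth": b / total}
    out = [0.0, 0.0, 0.0]
    for name, w in weights.items():
        for i in range(3):
            out[i] += w * MATERIALS[name][i]
    return tuple(out)

# A pure-red mask texel picks up the metal color only:
print(blend_texel((1.0, 0.0, 0.0)))  # -> (0.56, 0.57, 0.58)
```

In a real tool the same weighting is applied per texel across the whole baked mask, and to every channel (normal, roughness, etc.), not just the base color.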

Thanks a lot for the thorough answers, especially cynicalcoffee.

I am a bit sad because it seems there’s still a lot to be done, at least for people like me.

The reason I think UV unwraps are not necessary comes from the following:

  • Textures in texture databases are given in the form of 2D rectangles. They often tile.
  • Textures for 3D models have the form of the UV unwrap you computed.

What we do is the following: we crop the 2D rectangular texture with the shape of the UV unwrap and hope for the best. If the result is not good enough, we rework some details here and there. For example, I am 100% sure that cynicalcoffee’s model has discontinuities on the back (and on the backs of the legs/arms) (no offense, you did a wonderful job). The sphere shape in UE4 is proof of this: its UV unwrap has the form of a 2D rectangle, which makes the application of the 2D rectangular texture perfect. But most of the time, you can’t UV unwrap so that your unwrap has the exact form of a rectangle.

My hope was that someone would come up with good algorithms that generate a texture in the shape of the UV unwrap (rather than what is done today, UV unwrapping so that the unwrap fits a rectangle). Of course, you can’t generate every single texture this way, but with PBR and materials, my hope was that someone could generate something like, say, painted metal, because it is possible to compute which parts of your mesh have “external” edges where the paint would be gone. Of course, with normal maps you can do even better.
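For what it’s worth, the “external edges” intuition is exactly what baked curvature maps capture: an edge shared by two faces is convex (sticks out, so paint would wear off) when each face’s neighbor lies behind it. A toy sketch of that test, assuming triangle normals wind outward (pure illustration, not any particular tool’s algorithm):

```python
# Convexity test for an edge shared by two triangles: the edge is
# convex (an "external" edge where a wear mask would expose metal)
# when dot(n_A, centroid_B - centroid_A) < 0.

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def normal(tri):
    a, b, c = tri
    return cross(sub(b, a), sub(c, a))

def centroid(tri):
    return tuple(sum(p[i] for p in tri) / 3.0 for i in range(3))

def edge_is_convex(tri_a, tri_b):
    """True when the shared edge sticks out of the surface."""
    return dot(normal(tri_a), sub(centroid(tri_b), centroid(tri_a))) < 0.0

# Two faces meeting at a ridge along the y-axis: a convex "external" edge.
roof_left  = ((0, 0, 1), (0, 1, 1), (-1, 0, 0))
roof_right = ((0, 1, 1), (0, 0, 1), (1, 0, 0))
print(edge_is_convex(roof_left, roof_right))  # the ridge is convex -> True
```

Baking tools evaluate something like this over the whole high-poly mesh and store the result in a curvature map, which the procedural material then uses as a wear mask.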

Thanks a lot, it was very interesting, but it seems I don’t have what I want…

Not really. That’s where you see the advantage of Substance, or of painting your textures in ZBrush, for example. Substance lets you paint on the mesh continuously across edges, and that eliminates texture seams.
There is no other or easier way of UV mapping and texturing models than what has already been described in the previous posts, so you’ll have to get used to it (or find the magic solution, if you are a programmer, and humanity will be grateful for your contribution for eternity :slight_smile: )

If someone could come up with a procedural algorithm to make unwraps into perfect squares without distortion, everyone’s lives would be much easier! The reason that’s unlikely to happen, sadly, is that the UV map is a literal interpretation of all the model’s faces in 2D. Each 3D face is squished down and then relaxed to stay as close to its true shape as possible, so that the texture won’t stretch and distort across the model. If you did force the UVs into a square shape as a whole, that distortion becomes inevitable, and you can actually see it in the sphere. As a clearer example, here is a sphere out of 3ds Max with its automatic mapping, perfectly square, and a checker pattern turned on.

https://dl.dropboxusercontent.com/u/12436907/satyrpics/sphere_unwrap2.jpg
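One way to see that distortion numerically: compare each triangle edge’s length in 3D against its length in UV space. If the ratios differ between edges, the texture stretches along some direction. A small sketch in plain Python (the coordinates are made-up toy values):

```python
import math

def length(a, b):
    """Euclidean distance, works for both 3D and 2D points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def stretch_ratios(tri3d, tri_uv):
    """Per-edge ratio of 3D length to UV length; equal ratios = no stretch."""
    pairs = [(0, 1), (1, 2), (2, 0)]
    return [length(tri3d[i], tri3d[j]) / length(tri_uv[i], tri_uv[j])
            for i, j in pairs]

# An isotropic mapping: every ratio is identical, so a checker stays square.
tri3d  = [(0, 0, 0), (2, 0, 0), (0, 2, 0)]
tri_uv = [(0, 0), (1, 0), (0, 1)]
print(stretch_ratios(tri3d, tri_uv))          # [2.0, 2.0, 2.0]

# Squashing the UVs to force them into less space distorts one direction:
tri_uv_squashed = [(0, 0), (1, 0), (0, 0.5)]
print(stretch_ratios(tri3d, tri_uv_squashed)) # unequal -> visible stretching
```

Forcing a whole unwrap into a square is just this squashing applied everywhere at once, which is why the checker on the sphere above looks the way it does.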

As mentioned, there are ways to get around this. Just because your UVs aren’t perfectly square doesn’t mean you have to end up with bad-looking textures. You can paint across them by hand, and you can also take the practical approach of placing UV seams where they would naturally occur on an object: where a change of material occurs, at an actual clothing seam, or where they’ll be hidden or obscured by another object. Most of those really cool, slick-looking models you’re seeing on Allegorithmic’s and Quixel’s sites are broken down into a ton of smaller UV shells, with each individual UV island devoted, more or less, to a specific piece of the model.

Another option is to find someone to work with that is a texture artist.

I know that in a lot of pipelines the person who models the asset does not UV unwrap it, bake maps, or texture it. (Baking maps is pretty simple; I use Knald, xNormal, or nDO.)
As a modeler, your job is to make sure the model has proper topology for the project/animation. That’s pretty much it.

So if you can find someone who does texturing, you could work together. (I know it’s not always easy to find someone.)

I just found someone who does voice acting, concept art, and music/SFX (things I know very little about). Everything else I’m learning on my own.

I know this is not how all pipelines work but I know that some do.

I would ask around here, on the Polycount forums, and anywhere else you can think of.

Good luck

If I come across someone that does texturing I will PM you.

There isn’t any way to do it with an algorithm. Ptex gets rid of the unwrapping problem, but you can’t use Ptex in Photoshop. In case you’re wondering what Ptex is: it treats each polygon individually, and each one gets its own texture map, so there’s no distortion and the quality is good. But using it means the only way to create textures is to paint directly onto the model. Programs like Substance and Quixel are designed to make it easier to get details like corner scratches and things like that, but they work off of a UV mapping; the details they add basically wouldn’t work because of how each polygon is separated.
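To make the per-polygon idea concrete, here is a toy sketch of Ptex-style storage, where each face owns its own little texture grid (illustrative only; the real Ptex library’s API is different, and real implementations also handle adjacency and filtering across face borders):

```python
# Toy per-face texture store in the spirit of Ptex: no global UV unwrap,
# texels are addressed by (face id, local u, local v). A painted detail
# therefore cannot bleed into a neighboring face by accident.

class PerFaceTexture:
    def __init__(self, num_faces, res=4, fill=(0, 0, 0)):
        # one res x res grid of RGB texels per face
        self.res = res
        self.faces = [[[fill] * res for _ in range(res)]
                      for _ in range(num_faces)]

    def paint(self, face, u, v, color):
        """Paint the texel addressed by face id + local (u, v) in [0, 1)."""
        self.faces[face][int(v * self.res)][int(u * self.res)] = color

    def sample(self, face, u, v):
        return self.faces[face][int(v * self.res)][int(u * self.res)]

tex = PerFaceTexture(num_faces=2)
tex.paint(0, 0.9, 0.1, (255, 0, 0))   # a detail painted on face 0...
print(tex.sample(1, 0.9, 0.1))        # ...leaves face 1 untouched
```

This is also why tiled textures don’t fit the model: there is no continuous UV space for a repeating pattern to flow across.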

The other thing with Ptex is that I don’t think it’s currently supported in any game engine. While you could use it to texture an asset, you’d still need an unwrapped model to bake that texture data onto so you could actually use it, the bonus being that your unwrap can potentially be much less complex. My understanding, though, is that Ptex is much more targeted at film than at real-time rendering.

Ptex is ideal for situations where you’re painting directly on a 3D model rather than on a 2D texture map in Photoshop. These days 3D assets are primarily painted in 3D, but in games not so much. Not that they can’t be; Ptex would still be great for things like characters, but with how Unreal Engine does static lighting, for instance, it wouldn’t work very well.