Sprite model? How did they do this?

I agree it may not make sense, but this effect may still have been implemented to reduce aliasing artifacts. Sometimes you can even do things like increase the thickness slightly with distance to reduce aliasing.

Good to know. Thanks for the tip Ryan!

Could you show us how the connections are made? The documentation is very vague and I tried using it in the material editor but the results were so weird.

That’d be lovely, thank you!

Hi Ryan,

Is there any chance you can elaborate on how to use the SplineThicken node? I mean just on a super basic level? So far I've just tried popping random values into the input pins, setting WidthBase and WidthTip both to e.g. 100 just to see what they did. WorldPosition, do I need to feed the Camera Position in there? UVs for Projection and Thickness, what do I plug in there?

I made a thin plane and have been experimenting with rotating my UVs 90 degrees in 3ds Max to see what results I get, and either way I get weird results. I can post a super short video showing what I experience.


Sure thing. I just wrote up a more detailed doc for a contractor so I have all the images. This probably has more info than you need.

The basic idea behind SplineThicken is that it works like a camera-facing sprite that follows a path. The "path" is basically just a pre-made row of thin, hair-like polygons. The vertex shader then extends these polygons in a direction that is always perpendicular to the camera direction, so the mesh always appears to be the same thickness when viewed from any direction. Here is a quick example I made using 3ds Max.
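The perpendicular extension described above can be sketched as plain vector math. This is only an illustration of the idea, not the actual UE4 material function; all names here are mine:

```python
import math

def normalize(v):
    l = math.sqrt(v[0]**2 + v[1]**2 + v[2]**2)
    return (v[0] / l, v[1] / l, v[2] / l)

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def spline_thicken_offset(vert, camera, tangent, half_width):
    """Push a spline vertex sideways, perpendicular to both the view
    direction and the spline tangent, so the thin strip always faces
    the camera at a constant apparent width."""
    view = normalize((camera[0] - vert[0],
                      camera[1] - vert[1],
                      camera[2] - vert[2]))
    # The cross product is perpendicular to both inputs, which is
    # exactly the strip's "width" direction for this viewpoint.
    side = normalize(cross(view, tangent))
    return (vert[0] + side[0] * half_width,
            vert[1] + side[1] * half_width,
            vert[2] + side[2] * half_width)
```

Because the side direction is recomputed from the current camera position every frame, the strip never shows its edge-on side to the viewer.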

First I draw a line with a bent pipe shape. Then go to the settings for the line and check "Enable In Renderer", "Enable In Viewport", and "Generate Mapping Coords". Set Sides to 4, since unfortunately that is the minimum, and set Angle to 45 so we can more easily delete the extra faces.


Next, convert to Editable Poly, then select and delete all but one strip of faces (this step is only necessary when using the line tool and can usually be avoided with other modeling workflows).

The simplest method is to make the polys very thin, but you can also leave them at a specified thickness and then de-project in the material in UE4 to get better static lighting support. That shouldn't be necessary for this project:


Finally, you want to give each polygon 0-1 UVs by adding an Unwrap UVW modifier and stretching the UVs horizontally to fit 0-1. You can also adjust the Y to be uniform, or just adjust it in the material later.


In addition to checking Two Sided, also make sure to uncheck "Tangent Space Normal" on the material. (The need for two-sided is just a bug; I believe the tangent direction flipped since this function was made. You can also use negative width params, but that will flip the normals.)


For the material, if you are using the default projection along tangent U, you shouldn’t have to do anything besides this:


If you set it up right, it should match actual geometry almost perfectly, except when the viewing angle is nearly parallel with the spline, which causes artifacts:

You can do tons of cool things with spline thicken. Here is a prototype stony coral I made using a really quick crappy material:

This smaller one shows the wireframe:

And yes, all of this has been passed on to the docs team but I am not sure when they can get to adding it.

Re: your specific questions in the images:

The “Expand U or V” option allows you to expand the spline either along the UV.X or the UV.Y channel. The example above uses the X channel and I suggest sticking to that and leaving it alone.

The WorldPosition input is only for very specific effects, such as when you want to apply other offsets to the spline before thickening. 99% of cases don't need to hook anything to the WorldPosition input. Certainly don't put CameraPosition there.

The "Width Base" and "Width Tip" values set how wide it projects the verts from the spline, in world units. It will use the opposite of the projection axis in the UVs to blend between the two values. So for the default option, where it projects using U (UV.X), it will use UV.Y to fade between the base and tip widths.
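The base-to-tip fade described above is a plain linear interpolation driven by UV.Y. A minimal sketch (parameter names follow the node's pins; this is illustrative, not the function's actual HLSL):

```python
def width_at(uv_y, width_base, width_tip):
    """Fade from WidthBase at UV.Y = 0 to WidthTip at UV.Y = 1,
    mirroring how the default projection blends the two width pins."""
    return width_base * (1.0 - uv_y) + width_tip * uv_y
```

So with WidthBase = 100 and WidthTip = 10, a vertex halfway along the strip (UV.Y = 0.5) gets a width of 55.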

Absolutely awesome, man! This was exactly what I needed :slight_smile: Thanks a bunch for all that lovely tech info on how it works, too. Makes it easier to go back and read if something doesn't work.

That is truly amazing, I’m gonna go play!

Thanks for this thread and thanks for the answers, all! Very useful, I think :wink:

I just got home from work, sat down, and set the material to two-sided, and then it worked lol. So that was the reason I ended up with tons of wires going in, in the hopes that I would get a good result. Also, one thing I realize just now is that it was a material function, and I could have double-clicked it to see how it worked, or at least get an idea about it.

So one thing that is now puzzling me is how the heck do you manage to control the thickness of your strips? I mean, for building this kind of stuff, I need to be able to control it. I tried with a material parameter for the base and end widths, but I would need one dynamic material instance per segment then, and change the parameter value to achieve this?

And btw, yes, I was very much inspired by your Paragon POM/Procedural Vines stream when I made this :slight_smile:

As before, you use the UV.Y coordinate. So if a vertex has a UV.Y of 0, it should use WidthBase. If it has a UV.Y value of 1, it should use WidthTip. You can apply a curve by putting a Power on the UVs and plugging that into "UVs for Thickness".
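Putting a Power on UV.Y before the width blend turns the straight linear taper into a curve. A small sketch of the effect (again illustrative only, mimicking what you would feed into "UVs for Thickness"):

```python
def curved_width(uv_y, width_base, width_tip, exponent):
    """An exponent above 1 keeps the strip near WidthBase for longer
    and then tapers off quickly; below 1 does the opposite."""
    t = uv_y ** exponent
    return width_base * (1.0 - t) + width_tip * t
```

With an exponent of 2, the midpoint of a 100-to-0 taper sits at 75 instead of 50, giving a fuller base and a sharper tip.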

If you want one of the sub-branches to be less thick than its parent, you need to make its base verts start above 0 in the UV.Y layout.

This isn't always that flexible, so for my procedural branching blueprint I did some custom stuff and also modified the widths using vertex colors, which lets each segment be any size. But even that has some issues, since there are only 256 available vertex color values per channel. That means if you apply a unique vertex color to the start and end of each segment, some of them may eventually stair-step. For that reason I tend to use a combination of the UV control with the vertex color modification.
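The 256-value limit mentioned above comes from vertex colors being stored as 8-bit channels. A quick sketch of the encode/decode round trip shows why distinct widths can snap to the same step (helper name is mine, for illustration):

```python
def through_vertex_color(width, max_width):
    """Encode a width into an 8-bit vertex color channel and decode it
    again. Widths closer together than max_width / 255 collapse onto
    the same quantized step, which is the stair-stepping Ryan mentions."""
    quantized = round(width / max_width * 255) / 255.0
    return quantized * max_width
```

For example, with a 1000-unit width range, widths of 100 and 101 both land on the same 8-bit step and come back identical.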

Ok, I see… but since I deal with small strips of 8x1 plane segments, which are replicated and laid out along the path of the spline as spline meshes, they will all individually have the same UV space. So to become thinner towards the end, doesn't that mean I need to somehow read each segment's position along the spline and normalize that into a UV.Y value? Or am I misunderstanding this?

If you are actually instancing them, then yes. If you are making it by hand in 3ds Max, you can just stitch them together to form a line so they don't all have the same UVs.

I ended up doing a slightly different solution when using my procedural branching blueprint. Basically, I encode the segment length along the whole tree into one of the vertex colors. Then it uses that information to offset the UVs of each segment so that the UV tiling can be changed to whatever and still be seamless between pieces. And then I also encoded the branch width directly into another vertex color. Have you tried using the blueprint node "Paint Vertices Lerp Along Axis"? It is perfect for this kind of thing.
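The length-encoding trick can be sketched like this: each segment stores how far along the tree it starts, and the material turns that into a UV offset so the texture tiling lines up across segment boundaries. Names here are hypothetical; the real version lives in the blueprint and material:

```python
def segment_uv_offset(distance_along_tree, tile_length):
    """UV.Y offset for one segment: the fractional number of texture
    tiles already covered before this segment starts. Adding this to
    the segment's local UVs makes the tiling seamless across pieces."""
    return (distance_along_tree / tile_length) % 1.0
```

So a segment starting 250 units along the tree, with a 100-unit tile, begins its UVs at 0.5 rather than restarting at 0.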

You can also do stuff like use the overall object height gradient and perform slight offsets using the vertex colors. For some things that's all you need. It would at least make sub-branches the same width as the part of the parent they came from, unless there is more horizontal branching than vertical.

Tbh, I'm not that advanced with materials yet, so no, I haven't tried using "Paint Vertices Lerp Along Axis", but from the name of it, it sounds like it could be very useful in this situation. I will investigate this. That other stuff you write about encoding, I'm sure I'll understand later, but hopefully someone else can make use of it. In the meantime, I tried out a different approach and wanted to hear your opinion on it. I do realize this will not work with a merged actor, and then I would need to take care of the thickening like you do. But instancing a new material like I do with each SplineMeshComponent, is that a total no-no?

It's not the worst thing in the world from a prototype perspective, but I wouldn't ship a game like that since it won't be very efficient.

If all you are doing is making the sub-branches start a bit thinner, you can simply paint a solid color on each segment that represents how far along the parent it was when spawning (you will want to invert it by doing 1-x once it's in the 0-1 range). Then you just multiply the width param by that vertex color. That should actually be simpler than making separate MIDs per branch.

I kinda knew you'd say that, hehe. That's why I wanted to convert this to a static mesh without all the generation logic and material logic. So the thing is, I do understand the concept you describe of painting a solid color on each segment as a representation of how far along the parent it is, but I just don't know how to do that. And on top of that, my UVs for the strip I use are the same in the generated, merged static mesh, so it's not like I get unique UVs for each face after the merge. Did that last part make sense?

Yes I had the same problem where each segment has the same UVs.

To paint the solid color, you just do "Paint Vertices Solid Color" (the name may not be exact, but my editor is stuck saving a huge map atm). You would need to know how far the current parent segment is going to get, and then track the local iteration within the branch. Then you just divide the 2nd number by the 1st, do 1-x, and use that on the paint node.
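The divide-then-invert recipe above, written out as a sketch (function names are mine; in practice this happens in the blueprint feeding the paint node, and the multiply happens in the material):

```python
def segment_paint_value(local_iteration, total_iterations):
    """Grayscale value painted onto one segment: 1 at the branch base,
    fading to 0 at the final segment."""
    return 1.0 - local_iteration / total_iterations

def painted_width(width_param, vertex_color):
    """In the material, the width parameter is simply multiplied by
    the painted vertex color to taper the branch."""
    return width_param * vertex_color
```

So in a 4-segment branch, segment 0 paints 1.0 (full width) and segment 3 paints 0.25; a width param of 100 tapers to 25 by the last segment.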

Then you can reference the Vertex Color node in the material and use that to multiply the width values. You could also do a Power etc. on the color to make it non-linear.

Ah, I see the idea, yes. But won't this create some stepping, since I'm painting with a solid color? (Paint Vertices Single Color) Anyway, I'm gonna try it out and see how far I can go with this.

Doh! So using "Paint Vertices Lerp Along Axis" as you said allows me to set a start and end color, yes, and even the axis. Think I can work this out :slight_smile:

Yay, I got it to work. Thanks again, Ryan, for all the patience with me. I ended up doing exactly as you said in one of the earlier posts and used "Paint Vertices Lerp Along Axis". Just had to read up a bit on vertex painting.

I'm gonna talk about this in one of my upcoming videos and share what I have learned today.

How can I get it to align with the next mesh? @RyanB When I start to move around, it doesn't match anymore.