If you’ve been waiting for a Blender version of the Vertex Animation max script, you’ve come to the right place! I’ve recently been working on one and have a pretty solid version. As of now the script only has the timeline animation functionality, but I plan to remedy that soon. I’m new to scripting, so I’m sure it’s not very pythonic or elegant, but it works; in fact, this is my first attempt at creating anything remotely complex. I’m assuming anyone coming here knows how to install and use scripts in Blender. If you have any questions or feedback, please feel free to leave it here or PM me. I plan on trying for a full suite of tools, something similar to Pivot Painter and maybe even the older Morph Target script, and I’m open to suggestions as well. But to get right to it, I’ll post the GitHub repo below and a quick tutorial.
The new panel is located in the object mode tool shelf.
Make sure your timeline settings are correct and the objects you want are selected. Don’t worry about selecting anything other than meshes; the script will ignore any object without mesh data.
After pressing the button (the operator can also be executed by pressing the space bar and searching for “process animated meshes”), the script will create a single mesh called “Export_Mesh” from all selected meshes combined, with the correct transforms taken from frame one.
The script also creates two new textures, “morphs” and “normals”. They can be found in the UV/Image editor. You’ll need to save “morphs” as an OpenEXR with RGBA and 16-bit depth; in other words, check Float (Half) next to Color Depth.
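To give a sense of why the Float (Half) setting matters, here’s a rough plain-Python sketch of packing per-vertex XYZ offsets into 16-bit half floats (this is illustration only, not the actual script, which writes pixels through Blender’s image API; the one-row-per-frame layout is my assumption):

```python
import struct

def pack_offsets_half(frames):
    """Pack per-vertex XYZ offsets as 16-bit ("half") floats, matching
    the Float (Half) depth the EXR is saved with.
    `frames` is a list of frames, each a list of (x, y, z) offsets,
    one per vertex; each offset becomes one RGB pixel: R=X, G=Y, B=Z."""
    rows = []
    for frame in frames:
        # "<3e" = three little-endian half floats (2 bytes each)
        rows.append(b"".join(struct.pack("<3e", x, y, z) for x, y, z in frame))
    return rows

# Two frames of a two-vertex mesh: one image row per frame.
rows = pack_offsets_half([
    [(0.0, 0.0, 0.0), (0.5, -0.25, 1.0)],  # frame 1
    [(0.1, 0.2, 0.3), (0.0, 0.0, 0.0)],    # frame 2
])
```

Half floats keep enough precision for smooth offsets while halving the file size compared to full 32-bit float, which is why that checkbox matters; an 8-bit format would visibly quantize the motion.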
Saving “normals” is even simpler: save it as a BMP (bitmap) with RGB checked.
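For the curious: normals fit fine in an 8-bit BMP because each component just gets remapped from [-1, 1] into 0–255. A quick sketch of the usual convention (an assumption about what the script does internally, but it’s the standard normal-map encoding):

```python
def encode_normal(n):
    """Remap a unit normal's X, Y, Z from [-1, 1] to 0..255 so it can
    be stored as one 8-bit RGB pixel."""
    return tuple(round((c * 0.5 + 0.5) * 255) for c in n)

def decode_normal(rgb):
    """The inverse mapping, as the material does when reading the BMP."""
    return tuple(v / 255 * 2.0 - 1.0 for v in rgb)

encode_normal((0.0, 0.0, 1.0))  # the familiar "flat" normal-map color
```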
Now all you need to do is export “Export_Mesh.” I’m assuming you already know how to export a static mesh from Blender for UE4; if anyone has any questions, just let me know. Just make sure you create a material slot for it.
Time for Importing!
Your settings for your static mesh should look like this:
I was going to explain the whole material setup, but I’ll just link to the original tutorial for the max script. It explains the process far better than I have time to at the moment. Just go here and follow the material setup portion.
You have no idea how happy I am to see that someone’s made this, especially since you plan on working on even more. It’s always nice to see Blender get support from people. Keep up the amazing work!
All that being said, it seems I don’t actually know how to do the animation part in order to actually use this. Namely because I’ve only ever done skeletal animation. So if anyone happens to have any resources I could look at, that’d be great. The GIF you have at the end there looks really good.
Edit: I realized I was just being dumb; I was able to get an animation that was made with a skeleton in there. After running into a couple of issues (on my end), it seems to be working as expected, which is great.
It’s good to hear it’s already being put to use! I should have checked skeletal animation to begin with; I’m glad it works. I had just assumed it would, because it worked with shape keys (Blender’s morph targets). That’s how I went about the animation in the gif: the mushrooms are just being scaled up on a curve using a curve modifier, and they also have a shape key for the cap opening.
Hey, if you never ask questions you only make it harder to learn anything. Morph targets can only be used with skeletal meshes; this is all driven in a material using world position offset on a static mesh. That’s useful because a static mesh can be used as mesh data in a particle system, whereas a skeletal mesh can’t. Morph targets are also only snapshots, just keyframes out of an animation that you linearly interpolate through, while this is a fully animated mesh. Normally that could only be accomplished with bones. Hope that helps some!
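To illustrate the difference: a morph target blend is just a linear interpolation between two stored snapshots, while the baked texture stores every frame explicitly, so playback is a lookup rather than a blend. A toy plain-Python sketch (hypothetical data, not the actual material code):

```python
def lerp_morph(base, target, alpha):
    """Morph-target style: blend linearly between two vertex snapshots."""
    return [tuple(b + (t - b) * alpha for b, t in zip(bv, tv))
            for bv, tv in zip(base, target)]

def baked_frame(frames, frame_index):
    """Baked vertex animation: every frame's offsets are stored, so
    playback is just a lookup (in UE4, a texture sample per vertex)."""
    return frames[frame_index]

base   = [(0.0, 0.0, 0.0)]   # one-vertex "mesh", rest position
target = [(2.0, 0.0, 0.0)]   # same vertex in the target shape
halfway = lerp_morph(base, target, 0.5)  # (1.0, 0.0, 0.0)
```

The trade-off is storage versus flexibility: morph targets stay small but can only move in straight lines between snapshots, while the baked texture reproduces arbitrary per-frame motion.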
The information baked on UV channel 2 is based entirely off how many vertices there are, correct? (basically, it’s the thing that lets each track control a vertex) I’m only checking because I was planning on using it for multiple animations on the same mesh, but had forgotten about that aspect; looking at it (and to a degree the code) seems to imply that that’s what it does, but I’m not absolutely certain, so confirmation would be great.
Would it be possible to modify the script to keep the UV channel 1 data from the original mesh? Because as far as I can tell, it doesn’t keep that information.
Inferno630, to your first question: yes, the script takes every vertex loop and spreads them out evenly across the U (X) axis, from right to left; for whatever reason they had to go from right to left. This is why the image width is also equal to the number of vertices. So each vertex is sampling a pixel that gives it X, Y, and Z offsets through R, G, and B. You should be able to use multiple animations. You could probably use the same image by turning multiple animations into one big one, then do something with material instance parameters to pick where in the animation (or in this case, the list of animations) you want to play. Or you could just run the script multiple times to produce several images and use parameters to switch between the different texture objects.
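To make that layout concrete, here’s how I picture the per-vertex U coordinate and the pixel column it samples (plain-Python sketch; the half-pixel centring is my assumption about the exact offsets the script uses):

```python
def uv_u_for_vertex(index, vertex_count):
    """Spread vertices evenly across the image width, right to left,
    aiming each at a pixel centre (hence the +0.5)."""
    return 1.0 - (index + 0.5) / vertex_count

def pixel_column(u, width):
    """Which pixel column a given U coordinate samples."""
    return int(u * width)

# With a 4-vertex mesh and a 4-pixel-wide image, each vertex lands
# on its own column, counting down from the right edge.
cols = [pixel_column(uv_u_for_vertex(i, 4), 4) for i in range(4)]  # [3, 2, 1, 0]
```

This is also why the second UV channel can’t be shared between different meshes: it’s tied to the vertex count and ordering of the exact mesh the script processed.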
To your second question: the UVs in channel one should be saved. Even if you’ve combined several meshes, their UVs should have just been combined as well, so if you had copies of the same mesh, like I did in my example, and went to the UV editor, all the island copies would be there stacked on top of one another. It’s one of those things I’d looked into adding as a feature, but it just worked as a result of the way I’m copying mesh data.
Yeah, no problem. I need to comment the script out a bit better; well, a lot better to be honest, lol. Maybe add a little more to the ReadMe. If you have any more questions or suggestions for documentation, let me know. I’m thinking of just making a little video tutorial, maybe with a quick animation tutorial appended at the end to get the same results. It’s definitely a relief to hear you’re getting UVs in channel one; I thought I had a big bug on my hands. Though while trying to reproduce it, I found that if one mesh has UVs on channel 1 and others don’t, Blender will crash. So there’s that… it’s an all-or-nothing kind of thing.
I’m not sure about performance trade-offs or anything. The point of this tool is to use the static mesh in a particle system. If you’re importing the Alembic file as a static mesh (not as a skeletal mesh or a geometry cache), I’m assuming it would be similar. This is all driven by textures in the mesh’s material, though, which could have parameters exposed in Cascade and/or Blueprint. I’m not very familiar with the Alembic format, so I don’t know if you could achieve similar results with it.