Community Tutorial: Baking out vertex animation in editor with AnimToTexture

This is an overview and introduction to AnimToTexture, a plugin that can bake costly Skeletal Mesh animations into hyper-optimized Static Mesh animations.

https://dev.epicgames.com/community/learning/tutorials/daE9/unreal-engine-baking-out-vertex-animation-in-editor-with-animtotexture

4 Likes

Thank you for the great tutorial. I always wanted to get into this, but the lack of a starting guide made me hesitant to try it out. I gave it a try and it worked pretty well. One issue I'm facing: when I try to share a bone animation between two meshes with the same skeleton, it just doesn't work. Are there any requirements for the meshes? I also tried applying the same material instance to the other mesh, but it's a blobby mess, so I'm sure it isn't a material parameter setting.

I’ve tried the Houdini VAT and also the 3ds Max VAT but never got clean results. I’m happy that something finally works.
The only thing I couldn’t find a solution for is whether I can also include blend shapes/morph targets in the bake. Your documentation says that Vertex mode records each vertex through complex animations like morph targets, but I could only output the bone animation. Do you have any info on whether that is possible?

(animated GIF attachment)

The left shape has a simple additional blend shape that squeezes the object.

All very interesting, but from a pipeline standpoint, being limited to a 3ds Max script or an experimental feature is quite frustrating. We can use Houdini to do this; Blender is out of the question. So what I would really find useful to see posted about this feature, and would really appreciate, is the technical specification needed to recreate the setup in any software of choice.
Say that I wanted to make my own exporter for Maya.
What are the assumptions?
What is the data to pack, how should it be packed, and on which UV set? How should the UV sets be laid out?
Providing some basic, generic information would be far more useful than providing an experimental plug-in for in-editor exports.
Some of us could make a Python tool; others may be able to turn it into a Maya C++ plug-in.
Documentation on the specs, guys! Pretty please!!!

Thank you for sharing.

I haven’t gotten this far myself, but individual meshes will need their own unique BoneWeight textures. And because this all acts on that special UV set, each mesh needs to be passed through the whole baking process. My example scripts do not account for this, but you could duplicate the Bone Weight texture, reassign it and the new mesh in the Data Asset, and then rebake the Data Asset with the Scripted Action.
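
If it helps, here is a minimal editor-Python sketch of that duplication idea. The asset paths and the Data Asset property names (`static_mesh`, `bone_weight_texture`) are assumptions, so check the actual property names on your AnimToTextureDataAsset in the Details panel and adjust them before running, then rerun the baking Scripted Action on the duplicate.

```python
# Editor-only Python sketch (run from the Output Log or as a Scripted Action).
# All asset paths and property names below are placeholders/assumptions.
import unreal

SOURCE_DATA_ASSET = "/Game/VAT/DA_CharacterA"            # already-baked Data Asset
NEW_DATA_ASSET    = "/Game/VAT/DA_CharacterB"            # duplicate for the second mesh
SOURCE_WEIGHTS    = "/Game/VAT/T_CharacterA_BoneWeights"
NEW_WEIGHTS       = "/Game/VAT/T_CharacterB_BoneWeights"
NEW_STATIC_MESH   = "/Game/VAT/SM_CharacterB"            # second mesh, same skeleton

# Duplicate the Data Asset and the BoneWeight texture so the second mesh
# gets its own UV-mapped weights instead of reusing CharacterA's.
data_asset = unreal.EditorAssetLibrary.duplicate_asset(SOURCE_DATA_ASSET, NEW_DATA_ASSET)
weights    = unreal.EditorAssetLibrary.duplicate_asset(SOURCE_WEIGHTS, NEW_WEIGHTS)

# Point the duplicated Data Asset at the new mesh and its own weight texture.
# Property names are assumptions -- match them to the Data Asset's Details panel.
data_asset.set_editor_property("static_mesh", unreal.load_asset(NEW_STATIC_MESH))
data_asset.set_editor_property("bone_weight_texture", weights)
unreal.EditorAssetLibrary.save_loaded_asset(data_asset)

# Finally, run the baking Scripted Action on the duplicated Data Asset so the
# new mesh gets its UV set written and its textures filled in.
```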

hmmm, I only use the Bone method myself, so maybe I overstated the capabilities of Vertex mode. There was an earlier version of this plugin in the CitySample project (it is still there) that I believe had an input for an Animation Blueprint to handle more complex blending of animations. I will check internally on this aspect of Vertex mode and maybe I’ll need to edit the article.

I think you could still layer in another morph target effect into the master material, but I’m not yet sure what complications will arise.

That’s a valid point and I wish there was always more reference material available for every feature. Unfortunately it’s very early in the lifecycle of this feature and there is no documentation for experimental plugins. The only option for your example is to look at the source code and automate from there.

If you truly want to build your own pipeline for this technique outside of the engine, there’s a great talk given by Mario Palmero.

Thank you for sharing this tutorial! Maybe someone has ideas or examples of how to implement a blending/interpolation from one animation to another, like from idle to walk to run?

1 Like

@Stephen.Phillips this is fantastic, wonderful breakdown of the new plugin in your doc! This seems to have gone unnoticed by many with so many new features being added so fast - it needs more attention as it is so powerful!
Thanks a lot for taking the time to share this, AAAA doc!!
(Writing tut docs is not easy for anyone who has tried LOL)

1 Like

Hi @Stephen.Phillips and thank you for this. For some reason I can’t download your sample project; is there any chance you could re-upload the files or put them somewhere else?

Thank you,
Luigi

Also, does it work with HISMs?

1 Like

Yes. It works great with HISMs and per-instance custom data.
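
For anyone wondering what that looks like in practice, here is a rough editor-Python sketch that scatters HISM instances and writes one custom data float per instance, which the baked VAT material can read through a PerInstanceCustomData node (for example as a per-instance time offset or animation index). The actor/component lookup and the meaning of custom data slot 0 are assumptions about your setup.

```python
# Editor-only Python sketch: populate an HISM and set per-instance custom data.
import random
import unreal

# Grab the first HISM component on the currently selected level actor
# (in newer versions you can also go through unreal.EditorActorSubsystem).
actor = unreal.EditorLevelLibrary.get_selected_level_actors()[0]
hism = actor.get_components_by_class(unreal.HierarchicalInstancedStaticMeshComponent)[0]

# Reserve one custom data float per instance.
hism.set_num_custom_data_floats(1)

for _ in range(100):
    location = unreal.Vector(random.uniform(-2000, 2000), random.uniform(-2000, 2000), 0.0)
    index = hism.add_instance(unreal.Transform(location=location))
    # Custom data slot 0: random phase offset so the crowd doesn't animate in sync.
    hism.set_custom_data_value(index, 0, random.random(), True)
```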

how to implement a blending/interpolation from one animation to another, like from idle to walk to run?

And the answer to my question: you can blend animations using Lerp_3Color (or presumably a plain Lerp) in a Material Layer.

1 Like

thank you for confirming @aelmod

Could you please expand a little more on the animation blending?

If I bake, say, 3 animations for sitting idle, 3 for standing idle, and 3 for standing dancing, what’s the best way to pick only one “motion group” and later blend between, for example, standing idle and dancing?

Thank you,
Luigi

You need to implement a custom animation manager that changes the material layer parameters as needed, perhaps using a state machine, similar to how normal animations work.
Here is a simple implementation with hardcoded data. I did this a long time ago, it is poorly documented and terrible spaghetti, so if you have questions feel free to ask me.
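
To make the “hardcoded data” idea concrete, here is a small editor-Python sketch that maps states to the scalar parameters a VAT material layer might expose (start frame, frame count, blend alpha) and writes one state onto a Material Instance. All parameter names, frame ranges, and the asset path are assumptions; at runtime you would drive the same parameters from Blueprint or C++ on a Material Instance Dynamic (or via per-instance custom data), and if the parameters live inside a material layer you may need to address them through the layer association rather than as global parameters.

```python
import unreal

# Hardcoded states: (start_frame, num_frames, blend_alpha). Values are placeholders.
STATES = {
    "sit_idle":   (0.0,   120.0, 0.0),
    "stand_idle": (120.0, 120.0, 0.0),
    "dance":      (240.0, 180.0, 1.0),  # blend_alpha 1.0 = fully on the dance layer
}

def apply_state(material_instance_path, state_name):
    """Write one state's hardcoded values onto a Material Instance asset."""
    mi = unreal.load_asset(material_instance_path)
    start, length, blend = STATES[state_name]
    mel = unreal.MaterialEditingLibrary
    # Parameter names are assumptions -- match them to your own material/layer.
    mel.set_material_instance_scalar_parameter_value(mi, "StartFrame", start)
    mel.set_material_instance_scalar_parameter_value(mi, "NumFrames", length)
    mel.set_material_instance_scalar_parameter_value(mi, "BlendAlpha", blend)
    mel.update_material_instance(mi)

# Hypothetical asset path -- replace with your own instance of the VAT material.
apply_state("/Game/VAT/MI_Crowd", "dance")
```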

1 Like

Thanks for the heads up. I have a busy week this week but will try to take a look.

Hi @aelmod thank you for your help, I will take a look at your file.

Thank you @Stephen.Phillips

I’m getting bad results with this. The positions are fine, but the normals are not correctly reconstructed. This happens in both vertex and bone mode.

Has anyone seen anything like this?

@Stephen.Phillips Thanks for the great explanation!
What hardware do you use? On my PC (i7-6700, 32 GB RAM, GTX 1070 8 GB) I get 20 fps, GPU-bound (UE 5.2). Is that okay?