Hello, I'm not really sure if this is the place to ask about future plans, but I'm just wondering if there's any plan to make ML Deformers available for Blender anytime soon? I think it would be a wonderful addition to the Blender-to-Unreal workflow.
Hi
The ML Deformers are independent of the software used. For example, we have been using them in combination with Maya and Houdini, but you can also use Blender.
Our random pose generator is only for Maya, though. But with the Local mode of the Neural Morph Model in 5.1+ you do not need to use random data per se. Or you can generate random data in Blender using a custom script (or another plugin) and export that.
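If you want to go the custom script route, a minimal sketch of what that could look like with Blender's Python API is below. Note that the armature name and the per-bone rotation limits are placeholders; you would need to fill in real values for your own rig.

```python
# Rough sketch of a random pose generator in Blender (bpy).
# "Armature" is assumed to be the name of your armature object, and the
# per-bone rotation limits are made-up values you would define yourself.
import math
import random
import bpy

# Hypothetical per-bone Euler limits in degrees:
# (min_x, max_x, min_y, max_y, min_z, max_z)
BONE_LIMITS = {
    "upperarm_l": (-60, 120, -45, 45, -90, 90),
    "lowerarm_l": (0, 140, -10, 10, -45, 45),
    # ...one entry per joint you want to randomize
}

NUM_POSES = 1000  # one random pose keyframed per frame

arm = bpy.data.objects["Armature"]
for frame in range(1, NUM_POSES + 1):
    for bone_name, (x0, x1, y0, y1, z0, z1) in BONE_LIMITS.items():
        pbone = arm.pose.bones[bone_name]
        pbone.rotation_mode = 'XYZ'
        # Sample a random rotation inside the limits for this joint
        pbone.rotation_euler = (
            math.radians(random.uniform(x0, x1)),
            math.radians(random.uniform(y0, y1)),
            math.radians(random.uniform(z0, z1)),
        )
        pbone.keyframe_insert(data_path="rotation_euler", frame=frame)
```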
I have also written some more information in the following thread:
How do the internals of the pose generator work? Lacking something like Maya, I am unable to generate the training data in Blender, since I'm not even sure what the pose generator is actually generating. Is it creating new weights for the vertices, or is it simply shifting the vertices, disregarding the bone weights?
From my limited understanding, having played around with the ML Deformer a little with Daz characters, it's doing neither.
What you ultimately need to use the ML Deformer is a linearly skinned version of your character in FBX format and an Alembic cache of the same character. Both need to have the same vertex count, and I believe also the same vertex order.
Both of these also need to be going through the same range of motion (ROM), which is basically an animation that puts the character in numerous different poses.
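As a rough sanity check for those two requirements, something like this bpy snippet could compare the two imported meshes. The object names are placeholders, and it assumes the timeline is on a frame where both characters are in the same pose:

```python
# Quick sanity check in Blender: compare vertex counts and positions of the
# FBX character and the Alembic character. "CharacterFBX" and "CharacterABC"
# are placeholder object names.
import bpy

# Evaluate both objects so armature/cache modifiers are applied
depsgraph = bpy.context.evaluated_depsgraph_get()
fbx = bpy.data.objects["CharacterFBX"].evaluated_get(depsgraph).data
abc = bpy.data.objects["CharacterABC"].evaluated_get(depsgraph).data

assert len(fbx.vertices) == len(abc.vertices), "Vertex counts differ"

# If the vertex order matches, same-index vertices should sit on top of
# each other (within a small export/float tolerance).
mismatched = sum(
    1 for a, b in zip(fbx.vertices, abc.vertices)
    if (a.co - b.co).length > 1e-4
)
print(f"{mismatched} vertices out of place (0 suggests matching order)")
```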
The FBX version obviously won't have any detail such as joint corrective morphs, muscle definition morphs, etc. included, but the Alembic version should have these baked into it.
I believe all the included Maya script does is put the FBX skeleton/character into thousands of random poses, based on the rotation limits of each joint in the character, which you have to define yourself.
In Maya you would use the FBX skeleton to drive a copy of the same character; however, this copy would be rigged with blendshapes etc. to apply things like joint corrective morphs to it.
The script doesn't generate blendshapes etc. for you; you'd need to set all of that up yourself.
The script would then export the original character and its animation as an FBX, and the copy as an Alembic cache, with all of the corrective morphs etc. baked into it.
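I haven't tried it myself, but in Blender I imagine the equivalent export step would look roughly like this, using the built-in FBX and Alembic exporters. The paths and frame range are placeholders:

```python
# Sketch of the export step in Blender: the skinned character plus its ROM
# animation goes out as FBX, and the corrective-morph copy goes out as an
# Alembic cache.
import bpy

scene = bpy.context.scene
scene.frame_start = 1
scene.frame_end = 1000  # must cover the whole ROM

# With the linear-skinned mesh and its armature selected:
bpy.ops.export_scene.fbx(
    filepath="//character_rom.fbx",
    use_selection=True,
    bake_anim=True,
)

# With the corrective-rig copy selected:
bpy.ops.wm.alembic_export(
    filepath="//character_rom.abc",
    selected=True,
    start=scene.frame_start,
    end=scene.frame_end,
)
```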
I used Daz Studio to export both the FBX and Alembic cache of a character, both going through a range of motion animation which I just created on the timeline in Daz.
The only catch with that Daz method is that the vertex order between the FBX and the Alembic cache did not match, so I ended up using Houdini to correct that, but I'm sure it could also be done in Blender.
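I haven't tried this in Blender, but one possible approach would be to build a vertex correspondence by nearest-point matching on a frame where both meshes line up, for example with mathutils.kdtree. The object names are placeholders, and actually applying the mapping (e.g. re-exporting the Alembic with remapped positions) would be a separate step:

```python
# One way to bridge a vertex-order mismatch in Blender: build a vertex
# correspondence by nearest-point matching on a frame where both meshes are
# in the same pose. This only computes the index mapping.
import bpy
from mathutils import kdtree

fbx_mesh = bpy.data.objects["CharacterFBX"].data
abc_mesh = bpy.data.objects["CharacterABC"].data

# Index every FBX vertex position in a KD-tree for fast nearest lookups
tree = kdtree.KDTree(len(fbx_mesh.vertices))
for i, v in enumerate(fbx_mesh.vertices):
    tree.insert(v.co, i)
tree.balance()

# For each Alembic vertex, find the closest FBX vertex
mapping = {}  # Alembic index -> FBX index
for j, v in enumerate(abc_mesh.vertices):
    _co, i, _dist = tree.find(v.co)
    mapping[j] = i

# If the meshes really are the same, the mapping should be one-to-one
assert len(set(mapping.values())) == len(mapping), "Ambiguous matches found"
```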
Here is a discussion on the Daz forum where a couple of us were experimenting with it. I believe someone on there was looking into getting a similar workflow working in Blender.
If randomly generating poses is all it does, why isn't this functionality integrated into Unreal so that people can export a set of random poses to whatever 3D software they're using? Maya might be common, but most people are using something else, and expecting everyone to have the scripting knowledge to write their own plugin for whatever software they're using, just to access this one feature, isn't reasonable.
I've been excited to use this feature since it was announced, and I'm still waiting to be able to use it despite it being out for years, because 3ds Max isn't supported out of the box and I don't have the time to learn MAXScript and build my own pose generator.
Please either provide the ability to generate this animation within Unreal itself for export, or provide scripts for the other major 3D applications.