We’re working on a cloth simulation pipeline in Unreal and running into iteration and data transfer pain points. Here’s our current setup and where we’d like to improve:
Current workflow summary:
The render mesh is created in Maya.
A cloth proxy mesh is generated from a reduced version of the render mesh (to keep the simulation lightweight).
We further decimate the proxy for different proxy LODs.
Both render and proxy meshes are imported into Unreal as skeletal meshes.
On the proxy mesh, we create cloth assets for each mesh element (e.g., pouches, belts, trinkets…), tune simulation parameters, import masks from vertex colors, and set material-related properties manually.
We then manually import the proxy LODs, remap parameters, and finally import the proxy setup to the render mesh.
If the render mesh is updated (e.g., skinning or topology changes), we must redo everything: recreate all cloth assets, reconnect LODs, reassign elements, and reimport vertex masks.
Similarly, if we adjust the cloth proxy (e.g., tweak vertex paint or simulation settings to improve behavior), we need to reimport and reassign everything to the render mesh again.
Pain points/goals:
We need to auto-import and assign vertex color masks to cloth parameters (e.g., backstop distance from red, max distance from green), ideally without manually linking each target in the Cloth Editor.
We’d like to automatically create clothing data per mesh section based on material slots, rather than doing it manually, section by section.
Most importantly, we need a more non-destructive and iterative process, so that updated render or proxy meshes can sync without having to rebuild all clothing data and assignments every time.
Questions:
Is there a supported or recommended way in Unreal (5.7) to automate cloth data generation and vertex mask import through Python scripting, Editor Utility Blueprints, or the Cloth Editor API?
Is there a way to rebind updated render/proxy meshes without regenerating all clothing data from scratch?
Has anyone implemented a workflow for cloth data persistence or reimport sync between Maya and Unreal that avoids having to redo LOD assignments manually?
Any insight, Python samples, or recommended editor scripting entry points would be hugely appreciated!
Is there a supported or recommended way in Unreal (5.7) to automate cloth data generation and vertex mask import through Python scripting, Editor Utility Blueprints, or the Cloth Editor API?
Yes. The legacy Clothing Data is gradually being replaced by the new Cloth Asset, which is based on the Dataflow graph and designed to support non-destructive workflows.
Is there a way to rebind updated render/proxy meshes without regenerating all clothing data from scratch?
Yes, using the Cloth Asset workflow. Instead of creating legacy clothing data, click Add and select an existing Cloth Asset instead (experimental in 5.7).
You’ll then be able to continue editing the asset using the Dataflow Graph without having to constantly rebuild the simulation data.
Has anyone implemented a workflow for cloth data persistence or reimport sync between Maya and Unreal that avoids having to redo LOD assignments manually?
Yes: the Cloth Asset workflow with the latest Skeletal Mesh Clothing Data integration (experimental in 5.7).
Any insight, Python samples, or recommended editor scripting entry points would be hugely appreciated!
There is no Python integration or scripting available yet. Our efforts are centred on the use of the Dataflow Graph and the integration of the new Cloth Asset into the Skeletal Mesh editor workflow.
There is no vertex color import feature in the Cloth Asset yet, but it is not hard to add yourself. I can give you some pointers if necessary.
Sorry, I don’t have a link to the change to make, but I’ve made a few screenshots of the diffs of the current proposed change which should still help.
In /Engine/Plugins/ChaosClothAsset/Source/ChaosClothAsset/Private/ChaosClothAsset/ClothGeometryTools.cpp: FClothGeometryTools::BuildSimMeshFromDynamicMesh() is used by both the StaticMeshImport and SkeletalMeshImport nodes; this is the function you will need to modify to create the weightmaps.
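As a rough, self-contained sketch of that per-channel split (the helper and container types here are stand-ins of my own, not the engine API — in the engine you would read the colors from the dynamic mesh's color overlay and write the maps into the cloth collection instead):

```cpp
#include <array>
#include <map>
#include <string>
#include <vector>

// Stand-in for a per-vertex RGBA color (FVector4f in the engine).
using Color4f = std::array<float, 4>;

// Split per-vertex colors into one scalar weightmap per channel.
// The engine change would store these in the cloth collection rather
// than returning a map; this only illustrates the data transformation.
std::map<std::string, std::vector<float>>
SplitColorsIntoWeightMaps(const std::vector<Color4f>& VertexColors)
{
    const char* ChannelNames[4] = {"R", "G", "B", "A"};
    std::map<std::string, std::vector<float>> WeightMaps;
    for (int Channel = 0; Channel < 4; ++Channel)
    {
        std::vector<float>& Map = WeightMaps[ChannelNames[Channel]];
        Map.reserve(VertexColors.size());
        for (const Color4f& Color : VertexColors)
        {
            // Vertex paint is already 0..1, so no remapping is needed.
            Map.push_back(Color[Channel]);
        }
    }
    return WeightMaps;
}
```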
In the same file, in BuildIslandsFromDynamicMeshUVs(), you need to read the color overlay and store it in the island (struct FIsland needs a new TArray<FVector4f> Colors; member).
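A minimal illustration of that struct change and the per-vertex copy, using stand-in types rather than the real engine structs (the actual FIsland layout and the FDynamicMeshColorOverlay access differ in the source):

```cpp
#include <array>
#include <vector>

// Stand-ins for FVector2f / FVector4f; illustrative only.
using Vector2f = std::array<float, 2>;
using Vector4f = std::array<float, 4>;

struct FIsland
{
    std::vector<int> Vertices;     // source mesh vertex indices
    std::vector<Vector2f> UVs;     // per-vertex UVs (already present)
    std::vector<Vector4f> Colors;  // NEW: per-vertex colors from the overlay
};

// Inside BuildIslandsFromDynamicMeshUVs(), alongside the existing UV copy,
// sample the color overlay for each island vertex. GetVertexColor stands in
// for the overlay read; a missing overlay falls back to white so downstream
// weightmaps default to 1.
template <typename GetColorFn>
void FillIslandColors(FIsland& Island, bool bHasColorOverlay,
                      GetColorFn GetVertexColor)
{
    Island.Colors.reserve(Island.Vertices.size());
    for (int VertexIndex : Island.Vertices)
    {
        Island.Colors.push_back(bHasColorOverlay
            ? GetVertexColor(VertexIndex)
            : Vector4f{1.f, 1.f, 1.f, 1.f});
    }
}
```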
In /Engine/Plugins/ChaosClothAssetEditor/Source/ChaosClothAssetDataflowNodes/Private/ChaosClothAsset/ClothDataflowTools.cpp: In FClothDataflowTools::AddSimPatternsFromSkeletalMeshSection() update the dynamic mesh conversion code.
Finally, update the calling points in /Engine/Plugins/ChaosClothAssetEditor/Source/ChaosClothAssetDataflowNodes/Private/ChaosClothAsset/SkeletalMeshImportNode.cpp and /Engine/Plugins/ChaosClothAssetEditor/Source/ChaosClothAssetDataflowNodes/Private/ChaosClothAsset/StaticMeshImportNode.cpp.
Note the above code change doesn’t implement the UVUnwrap code for simplicity, so it will only work if your meshes already have UVs!
The weightmaps will then be added by the import nodes under the R, G, B, and A names, which you can then assign to MaxDistance or any other simulation config node.
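For context, a weighted parameter like MaxDistance is typically resolved by scaling the 0..1 weightmap between the node's low and high values. A hedged sketch of that resolution step (ResolveMaxDistances is illustrative, not an engine function):

```cpp
#include <vector>

// Resolve per-vertex max distances from a 0..1 weightmap: each vertex
// weight linearly interpolates between the config node's Low and High
// values, so painting green (the "G" map) directly drives how far each
// vertex is allowed to move from its skinned position.
std::vector<float> ResolveMaxDistances(const std::vector<float>& WeightMap,
                                       float Low, float High)
{
    std::vector<float> Result;
    Result.reserve(WeightMap.size());
    for (float W : WeightMap)
    {
        Result.push_back(Low + W * (High - Low));  // lerp by weight
    }
    return Result;
}
```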