UE 5.5.4 ML Deformer GPU compatibility

Hello everyone,
I’m currently working with the ML Deformer in Unreal Engine 5.5, and I have a question regarding GPU compatibility.
I’m using an NVIDIA RTX 3090, and everything works perfectly. However, out of curiosity, I tested it on my old PC with a GTX 1060, and the training process doesn’t even start, instead displaying an error message.
After searching the forums, I couldn’t find a clear answer, so I’d like to ask:
Does the ML Deformer require a GPU with NVIDIA Tensor Cores or an AMD equivalent (e.g., an RTX 2000-series card or newer, or an AMD RX 6000-series card or newer)?
If so, this would explain why it doesn’t work on my GTX 1060.
Thanks in advance for your help and insights!

Hi @_aven_3
Let’s see…

It appears that you are running into issues with the ML Deformer in Unreal Engine 5.5, and based on what you’ve described, the problem is most likely a hardware compatibility issue with the GPU you’re running.

The ML Deformer in Unreal Engine relies heavily on GPU acceleration for its machine learning calculations, and as you’ve correctly surmised, it requires a GPU with dedicated hardware for deep-learning computation, i.e., Tensor Cores. NVIDIA has included Tensor Cores in its consumer GPUs since the RTX 2000 (Turing) series, which covers your RTX 3090. These units are designed to accelerate AI and machine learning math, which is exactly what the ML Deformer’s training and inference steps depend on.

The GTX 1060, however, does not have Tensor Cores. It can still handle general-purpose GPU workloads, but it lacks the specialized hardware needed to run the ML Deformer’s training efficiently. That’s why the process doesn’t even start on your GTX 1060: the card simply can’t handle the workloads the ML Deformer needs to perform.
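If you want to verify this on a machine before attempting to train, here’s a minimal diagnostic sketch that queries the CUDA compute capability via PyTorch (which, as far as I know, is the framework the ML Deformer’s training scripts are built on). Tensor Cores first appeared at compute capability 7.0 (Volta), with consumer RTX cards starting at 7.5 (Turing); a GTX 1060 reports 6.1. Note this is a standalone script I wrote for illustration, not something shipped with the engine:

```python
import torch

def report_tensor_core_support() -> None:
    """Print each visible CUDA device and whether it exposes Tensor Cores."""
    if not torch.cuda.is_available():
        print("No CUDA-capable GPU detected.")
        return

    for index in range(torch.cuda.device_count()):
        name = torch.cuda.get_device_name(index)
        # Compute capability comes back as a (major, minor) tuple,
        # e.g. (6, 1) for a GTX 1060 or (8, 6) for an RTX 3090.
        major, minor = torch.cuda.get_device_capability(index)
        # Tensor Cores first shipped with compute capability 7.0 (Volta).
        has_tensor_cores = (major, minor) >= (7, 0)
        print(f"GPU {index}: {name} (compute {major}.{minor}) "
              f"-> Tensor Cores: {'yes' if has_tensor_cores else 'no'}")

if __name__ == "__main__":
    report_tensor_core_support()
```

On your RTX 3090 this should report compute 8.6 with Tensor Cores available, while the GTX 1060 will report 6.1 and fail the check.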

In conclusion:

Yes, the ML Deformer requires a GPU with Tensor Cores: an NVIDIA RTX 2000-series card or newer, or an AMD equivalent (like the RX 6000 series).

The GTX 1060 lacks that hardware, which is why training fails with an error instead of starting.

If you plan to use the ML Deformer in Unreal Engine on that older machine in the future, upgrading it to a Tensor Core GPU (like your RTX 3090) will be necessary to get usable performance.
