This plugin delivers the hard part, native LibTorch integration, so you can load TorchScript models and run reliable CPU or CUDA inference without building a custom runtime. It ships with a hands-on example actor for ResNet-18 classification and YOLOv5s detection, including label loading, non-maximum suppression (NMS), and IoU-based tracking, so you can see the full pipeline working in-engine and move straight to your own models.
Plugin: TorchScript Runtime for Unreal - Stable LibTorch/PyTorch Inference (CPU & CUDA)
Fab link: https://www.fab.com/listings/25d7aebb-e1bf-44ca-87b3-3e33042a1bdc