Hey folks, for those getting started with machine learning in Unreal Engine, I figured I’d share my “NeuralData” Game Instance Subsystem. It has worked very well for me so far and will hopefully expand a bit on the course.
First and foremost, you need to create a new Game Instance Subsystem. I called mine NeuralData, as seen in the gist. Then populate it with the code provided here: Neural Data Subsystem for unreal engine. · GitHub
Then you want to initialize your subsystem with the model you want to use with something like this:
```
if (!NeuralSubsystem)
{
    NeuralSubsystem = GetGameInstance()->GetSubsystem<UNeuralData>();
}

if (NeuralSubsystem)
{
    auto NeuralInit = NeuralSubsystem->InitModel("/Script/NNE.NNEModelData'/Game/Models/FancyModel.FancyModel'", "NNERuntimeORTCpu");
    if (NeuralInit)
    {
        UE_LOG(LogTemp, Warning, TEXT("Neural model loaded"));
        const TArray<uint32> InputLayers = {1, 1234};
        bInferenceReady = NeuralSubsystem->SetShapes(InputLayers, 3);

        // Only add the dynamic delegate if it isn't already bound.
        if (!NeuralSubsystem->OnResult.IsBound())
        {
            NeuralSubsystem->OnResult.AddDynamic(this, &USomeClass::HandleInferenceResult);
        }
    }
    else
    {
        UE_LOG(LogTemp, Error, TEXT("Cannot load Neural Model."));
    }
}
```
In the above, it gets the subsystem, initializes a model from my /Game/Models/FancyModel example (previously created based on what is in the course), and, if initialization succeeds, sets the input shapes (1 x 1234 features in my example).
At this point, if everything initialized correctly, you can run a classify with something like:
```
NeuralSubsystem->RunClassify(InfData, true, LastReadSocketTimestamp);
```
Where InfData is a TArray of floats, the boolean indicates whether the TArray needs to be wrapped for LSTMs, and I use the timestamp to track how long inference on a model takes until the broadcast is received.
Then the final part is effectively listening for the inference broadcast sent by the model, i.e. the HandleInferenceResult UFUNCTION bound to OnResult above.
Note that since my NeuralData is primarily focused on classification, I added both a softmax (multi-class) and a sigmoid (binary) classification function, which are needed to convert the logits into something usable by the rest of my Unreal code!
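The subsystem's own conversion functions are in the gist; as a rough standalone sketch of what softmax and sigmoid do to raw logits (plain C++, not the subsystem's actual implementation):

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Sigmoid: squashes a single logit to a probability in (0, 1);
// used for binary classification.
double Sigmoid(double Logit)
{
    return 1.0 / (1.0 + std::exp(-Logit));
}

// Softmax: converts a vector of logits into a probability distribution;
// used for multi-class classification. Subtracting the max logit first
// keeps exp() numerically stable for large logits.
std::vector<double> Softmax(const std::vector<double>& Logits)
{
    const double MaxLogit = *std::max_element(Logits.begin(), Logits.end());
    std::vector<double> Probs(Logits.size());
    double Sum = 0.0;
    for (size_t i = 0; i < Logits.size(); ++i)
    {
        Probs[i] = std::exp(Logits[i] - MaxLogit);
        Sum += Probs[i];
    }
    for (double& P : Probs)
    {
        P /= Sum;
    }
    return Probs;
}
```

The index of the largest softmax probability is what ends up in the `Cat` field of the result struct below, with the probability itself as `Confidence`.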
And finally, as shown above, my model broadcasts the inference result via a standard delegate (I like the delegate because it lets me bind to the inference result in both C++ and Blueprints, keeping things nice and clean). This made it very easy to import a new model, initialize it, set the shapes based on what I was testing, and then run inference all day long with it working across the entire system. As a final note, my InferenceData struct looks like this:
```
USTRUCT(BlueprintType)
struct FInferenceData
{
    GENERATED_BODY()

    UPROPERTY(BlueprintReadWrite, Category = "NeuralGPU")
    int Cat = 0;

    UPROPERTY(BlueprintReadWrite, Category = "NeuralGPU")
    float Confidence = 0.f;

    UPROPERTY(BlueprintReadWrite, Category = "NeuralGPU")
    bool bIsValid = false;

    UPROPERTY(BlueprintReadWrite, Category = "NeuralGPU")
    FLocalDateTime Time;

    // Not a UPROPERTY: uint64 isn't Blueprint-exposable.
    uint64 Timestamp = 0;
};
```
FLocalDateTime in the example above is essentially just an adaptation of FDateTime with the local time zone and a few other things.
Hope this helps and saves some time for those of you who are just getting started!