Trying to get data from a DataTable in C++, but it uses a BP structure as its underlying format

Hi all…

I have a data table that is populated with data. The structures that define the schema are all structs that were created in UE itself, not C++.

However, I am spinning up a new project/plugin and I need to be able to access that data in C++. To do this, I have created new USTRUCTs in C++ that inherit from FTableRowBase and take on the same names as the ones built in UE.

There do not seem to be any collisions between the UE-created structs and the C++ structs, and no compile-time or runtime errors. My thought was, maybe that’s because when you create a struct in UE, the “F” naming-convention prefix is not applied to it as it is in C++? Not sure… But anyway.

So, I have figured out how to load the data table up in C++; that’s no problem. I can even convert some of it to JSON and print it out.

However, I would like to have the data table give me all the rows in struct format, so I am calling this function:

TArray<FMyStruct*> myStruct;
FString sContext = "";
_myDataTable->GetAllRows( sContext, myStruct );

Two things…

First, what is the context variable? I have searched around, but I’m not finding what it is supposed to be in this specific instance.

And second, this compiles and there is no runtime error. However, myStruct is always empty after the call. At this point, I know _myDataTable has data in it, but for some reason the array just isn’t getting populated.

Is it possible that this function does not do what I think it does and I should be using a different function? Does the Context variable need to be something specific? And if so, what?

Is it because the DataTable was defined using the blueprint version of the structs and it is ignoring my C++ version despite being identical in data types, names and all that?

Any help will be greatly appreciated.

Thanks

The naming-convention prefixes for data types are applied to Editor-created types too; they are just invisible to the Editor user. If I make an AMyActor : public AActor in C++ but then create a Blueprint from it in the Editor, the starting “A” is removed. This is mostly to prevent confusion among non-programmers, the “what are these random A/U/F/T in front of each of my things in the Engine” kind of question.
The prefix letter is more of a helping hand so that programmers can be more expressive/specific with instance names relative to their type (the type of the variable is expressed clearly in its name):

FTransform Transform;
TObjectPtr<AMyChildActor> MyChildActor;
for( AActor* Actor : ActorArray)

instead of

Transform transform;
TObjectPtr<MyChildActor> ChildActor280;
for( Actor* actor : actorArray)

This ambiguity is also part of why Java (and to an extent C#) lean so heavily on this-qualification: to prevent exactly that kind of collision between a variable and its type.

The Unreal style guide also speaks on capitalization and readability, which those prefixes help with.

Importing a data type defined in the Editor into C++ would be a runtime-resolution problem, and I hate it just thinking about it (the efficiency of C++ comes from being compiled ahead of time, while the “slowness” of Blueprints comes from their interpreted/JIT nature, which is why reflection that reaches into Editor-defined assets goes through hard, runtime-resolved path strings). I would strongly suggest re-parenting your DataTables to the C++ struct. The good news is that DataTables can be exported to CSV or JSON: create a new DataTable from your C++ struct and, if they really are at parity, just re-import the exported data into the new tables. Once you can prove they are “the same”, delete the DataTables defined with Blueprint structs.
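
For the C++ side of that, the row struct just needs to derive from FTableRowBase and mirror the Blueprint struct’s columns. A minimal sketch; the struct and field names here are made up, so match them to your real columns so the exported CSV/JSON lines up on re-import:

#include "CoreMinimal.h"
#include "Engine/DataTable.h"
#include "MyRow.generated.h"   // hypothetical header name

USTRUCT(BlueprintType)
struct FMyRow : public FTableRowBase
{
    GENERATED_BODY()

    // Field names should match the column names in the exported data.
    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    FString DisplayName;

    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    int32 Count = 0;
};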

Sometimes with runtime-resolved path strings you might need to comb through the log file for “Warning” and “Error” entries to make sure they are not having issues.
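
On the context string: as far as I can tell it is just a label that gets included in the data table’s warning/error log lines, so you can see which caller asked for the data; make it something recognizable. A minimal sketch of pulling rows out of a C++-backed table (struct and variable names are placeholders):

// Placeholder names: FMyRow is the C++ row struct, MyDataTable is a UDataTable*.
TArray<FMyRow*> Rows;

// The context string only shows up in the table's Warning/Error log output,
// so use something that identifies the call site.
MyDataTable->GetAllRows<FMyRow>(TEXT("MyPlugin::LoadRows"), Rows);

for (const FMyRow* Row : Rows)
{
    // ... use Row ...
}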

Hi gardian…

Thanks for the response, it’s appreciated.
I agree with you on migrating to a C++ implementation. However, some of our data tables and structs are widely used throughout a large BP code base. I’ve been here about 5 months and, to be honest, I’m not 100% sure how extensive that usage is.

I can take it to one of my peers and get his feedback. I know he’s more in tune with C++ over BP where it makes sense, and I happen to know he’s not a fan of the fact that the structs were created in UE land instead of C++ land; apparently it’s caused him some pain as well.

So is it safe to say, then, that if a UDataTable was created using structs created in UE, it will never work with an identical struct that was created in C++? I still find it interesting that there are no compile-time or runtime errors given the fact that they have the exact same name.

“The same name” I find a little confusing, as if that were the case it should generate a naming collision; but this might have to do with the way the C++ naming conventions are applied to Blueprint-created data types. What might be happening is that when you pass the Editor struct in, it is somehow being implicitly converted to the C++ type and silently failing, but because that does not result in a null reference the program does not crash.
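
As a sanity check before any migration, you can ask the table at runtime which struct is actually backing it and whether that matches your C++ type. As far as I recall from the engine source, GetAllRows() does a similar IsChildOf() check internally and, when it fails, just logs an error (with your context string in it) and leaves the output array empty, which would line up with what you are seeing. A rough sketch, reusing your variable and struct names:

// _myDataTable is your UDataTable*, FMyStruct is the C++ row type.
const UScriptStruct* RowStruct = _myDataTable->GetRowStruct();
if (RowStruct != nullptr)
{
    UE_LOG(LogTemp, Log, TEXT("Table row struct: %s"), *RowStruct->GetPathName());

    if (!RowStruct->IsChildOf(FMyStruct::StaticStruct()))
    {
        // The table is still backed by the Blueprint struct, so GetAllRows<FMyStruct>
        // cannot hand the rows back as FMyStruct pointers.
        UE_LOG(LogTemp, Warning, TEXT("Row struct does not match FMyStruct"));
    }
}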

For how extensively these DataTables are used, this can be done in a few steps. First, do an audit on each one through the Reference Viewer (right-click the DataTable in question and select “Reference Viewer”), just in case, as this will show you everything that might break at the final step. Then, when you go to replace a DataTable with the C++-backed version (assuming parity), you should be able to rename it to the same name as the Editor-backed version. This might still mean having to track down build errors, but the data is probably harder to recreate than the Blueprints, as those should just be calling out indexes and RowNames anyway.

Hey gardian…

I hear you on doing the Reference Viewer pass… We will see; there are a lot of duplicated tables that have the same data structure but different locational data in them… Let’s just say it’s not optimal at this time to go and reparent a bunch of tables and all the legacy stuff… Maybe, but not today. :wink:

Something I did start looking into was taking the data table variable and calling GetTableAsJSON().

It works… Kind of… Let me explain…

Here is a quick-and-dirty, 100% fictitious set of USTRUCTs (and data) that mimic what we have in our project…

USTRUCT(BlueprintType)
struct FCarAssembly : public FTableRowBase
{
    GENERATED_BODY()

public:
    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    FString AssemblyLocationCode;

    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    bool IsDomestic;
};


USTRUCT(BlueprintType)
struct FCarClassification : public FTableRowBase
{
    GENERATED_BODY()

public:
    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    FString ClassificationName;

    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    int32 NumWheels;
};


USTRUCT(BlueprintType)
struct FCarCompany : public FTableRowBase
{
    GENERATED_BODY()

public:
    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    FString ManufactureName;

    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    TArray<FCarClassification> Classifications;

    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    TArray<FCarAssembly> AssemblyLocals;
};

When I’m in Unreal and I right-click the table and export it as JSON, I get this. It’s exactly what I would hope for, and something I could easily work with.

[
	{
		"Name": "NewRow",
		"ManufactureName": "Honda",
		"Classifications": [
			{
				"ClassificationName": "Sedan",
				"NumWheels": 4
			},
			{
				"ClassificationName": "Sport",
				"NumWheels": 4
			}
		],
		"AssemblyLocals": [
			{
				"AssemblyLocationCode": "Japan",
				"IsDomestic": false
			},
			{
				"AssemblyLocationCode": "USA",
				"IsDomestic": true
			},
			{
				"AssemblyLocationCode": "Mexico",
				"IsDomestic": false
			}
		]
	},
	{
		"Name": "NewRow_0",
		"ManufactureName": "Toyota",
		"Classifications": [
			{
				"ClassificationName": "Sedan",
				"NumWheels": 4
			},
			{
				"ClassificationName": "Motorcycle",
				"NumWheels": 2
			}
		],
		"AssemblyLocals": [
			{
				"AssemblyLocationCode": "Japan",
				"IsDomestic": false
			},
			{
				"AssemblyLocationCode": "USA",
				"IsDomestic": true
			}
		]
	}
]

HOWEVER
When I call GetTableAsJSON on the Data Table variable, I get this…
Notice the array data: each element comes through as a string of name/value pairs, not the true hierarchical data I was hoping for.

[
    {
        "Name": "NewRow",
        "ManufactureName": "Honda",
        "Classifications": [
            "(ClassificationName=\"Sedan\",NumWheels=4)",
            "(ClassificationName=\"Sport\",NumWheels=4)"
        ],
        "AssemblyLocals": [
            "(AssemblyLocationCode=\"Japan\",IsDomestic=False)",
            "(AssemblyLocationCode=\"USA\",IsDomestic=True)",
            "(AssemblyLocationCode=\"Mexico\",IsDomestic=False)"
        ]
    },
    {
        "Name": "NewRow_0",
        "ManufactureName": "Toyota",
        "Classifications": [
            "(ClassificationName=\"Sedan\",NumWheels=4)",
            "(ClassificationName=\"Motorcycle\",NumWheels=2)"
        ],
        "AssemblyLocals": [
            "(AssemblyLocationCode=\"Japan\",IsDomestic=False)",
            "(AssemblyLocationCode=\"USA\",IsDomestic=True)"
        ]
    }
]

Is there a way I can get the JSON out of the table and retain the full JSON hierarchy, without it converting the arrays inside each object into name/value-pair strings? Hoping not to have to parse all of that out myself.

I was able to find this: it is supposedly part 5 of a series by the creator. He talks a bit fast, and I have a bit of trouble with his accent, but it may be worth going through (the series naming convention looks to be “UE5 C++”, and maybe the first 5 in the series are the relevant ones).

In this video he does at least seem to briefly show the code from the other videos, so you might not need to watch them all.

Right on!
Going to watch them now!

Yeah, he’s not showing how to deal with this in the way I need… What he’s doing is taking a JSON file and loading it directly into a table, not pulling data out of the data table and working with it.

I need to extract the data out of the data table directly in a proper hierarchical JSON format, not string arrays of child structures.

Odd that from the editor you can export the data and it saves in proper JSON format, but pulling it out at runtime, it comes back in that other odd format.

When the Engine is exporting the table to JSON in the editor it “has all the time in the world”, so it can take the time to deal with the nesting. At runtime the default behavior of DataTables is geared toward flat structures (it does not directly expect nested structs), so when it traverses the table it has to be done “now” and it only formats one level of hierarchy.
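
That said, if memory serves the runtime export has an optional flags parameter that asks it to write nested structs as real JSON objects. I have not verified the exact names against your engine version, so treat this as an assumption and check DataTable.h / DataTableUtils.h before relying on it:

// Assumption: GetTableAsJSON() takes an EDataTableExportFlags argument and a
// UseJsonObjectsForStructs flag exists -- verify against your engine version.
FString JsonStr = _myTable->GetTableAsJSON(EDataTableExportFlags::UseJsonObjectsForStructs);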

I fed your example outputs into ChatGPT free and it spit out this:

#include "YourProjectName.h"
#include "YourClass.h"
#include "Dom/JsonObject.h"
#include "Dom/JsonValue.h"
#include "Serialization/JsonSerializer.h"

bool UYourClass::ConvertJsonString(FString& OriginalJsonString, FString& ConvertedJsonString)
{
    TSharedPtr<FJsonObject> RootJsonObject;
    TSharedRef<TJsonReader<TCHAR>> JsonReader = TJsonReaderFactory<TCHAR>::Create(OriginalJsonString);

    if (FJsonSerializer::Deserialize(JsonReader, RootJsonObject))
    {
        TArray<TSharedPtr<FJsonValue>> OriginalArray;
        RootJsonObject->TryGetArrayField(TEXT(""), OriginalArray);

        TArray<TSharedPtr<FJsonValue>> ConvertedArray;

        for (const TSharedPtr<FJsonValue>& OriginalValue : OriginalArray)
        {
            TSharedPtr<FJsonObject> OriginalObject = OriginalValue->AsObject();
            TSharedPtr<FJsonObject> ConvertedObject = MakeShareable(new FJsonObject);

            for (const auto& Pair : OriginalObject->Values)
            {
                FString Key = Pair.Key;
                TSharedPtr<FJsonValue> Value = Pair.Value;

                // Check if the value is an array of strings with the specified format
                if (Value->Type == EJson::Array)
                {
                    TArray<TSharedPtr<FJsonValue>> ArrayValues;
                    Value->AsArray(ArrayValues);

                    TArray<TSharedPtr<FJsonObject>> NestedObjects;

                    for (const TSharedPtr<FJsonValue>& ArrayValue : ArrayValues)
                    {
                        FString ArrayElementString;
                        ArrayValue->TryGetString(ArrayElementString);

                        TSharedPtr<FJsonObject> NestedObject;
                        if (ConvertStringToJsonObject(ArrayElementString, NestedObject))
                        {
                            NestedObjects.Add(NestedObject);
                        }
                    }

                    // Add the array of nested objects to the converted object
                    ConvertedObject->SetArrayField(Key, NestedObjects);
                }
                else
                {
                    ConvertedObject->SetField(Key, Value);
                }
            }

            ConvertedArray.Add(MakeShareable(new FJsonValueObject(ConvertedObject)));
        }

        TSharedPtr<FJsonObject> ConvertedRootObject = MakeShareable(new FJsonObject());
        ConvertedRootObject->SetArrayField(TEXT(""), ConvertedArray);

        // Serialize the converted JSON object to a string
        TSharedRef<TJsonWriter<TCHAR>> JsonWriter = TJsonWriterFactory<TCHAR>::Create(&ConvertedJsonString);
        FJsonSerializer::Serialize(ConvertedRootObject.ToSharedRef(), JsonWriter);

        return true;
    }
    else
    {
        return false; // Failed to deserialize the original JSON string
    }
}

bool UYourClass::ConvertStringToJsonObject(const FString& JsonString, TSharedPtr<FJsonObject>& JsonObject)
{
    TSharedRef<TJsonReader<TCHAR>> JsonReader = TJsonReaderFactory<TCHAR>::Create(JsonString);

    return FJsonSerializer::Deserialize(JsonReader, JsonObject);
}

In this code:

  1. We deserialize the original JSON string into a TSharedPtr<FJsonObject>.
  2. We iterate through the objects in the original array, and for each object, we create a new TSharedPtr<FJsonObject> for the converted structure.
  3. We iterate through the key-value pairs of each object and check if the value is an array of strings with the specified format (e.g., “(Key1=Value1,Key2=Value2)”). If it is, we convert each string into a nested JSON object and add them to an array.
  4. We add the converted array of nested objects to the converted object.
  5. We serialize the converted JSON object back into a string.
  6. The ConvertStringToJsonObject function is a utility function to convert a string into a JSON object.

Make sure to replace "YourProjectName" and "YourClass" with the appropriate values for your project and class.

As a general rule, do not blindly trust the output of an LLM; they are designed around sounding somewhat human, not around being accurate, and the data they were built from could itself be wrong. Also, the dataset behind the free ChatGPT tier is limited, with UE 4.27 being about the newest release it knows.

Right on… I will give this a try and see what happens!

For some reason it fails right here… In fact, I was doing this exact same thing earlier and it was failing there as well, right on the Deserialize( …, … ) function call.

bool UYourClass::ConvertJsonString(FString& OriginalJsonString, FString& ConvertedJsonString)
{
    TSharedPtr<FJsonObject> RootJsonObject;
    TSharedRef<TJsonReader<TCHAR>> JsonReader = TJsonReaderFactory<TCHAR>::Create(OriginalJsonString);

    if (FJsonSerializer::Deserialize(JsonReader, RootJsonObject))

Even if it got past that, I would still get a compile error here. (I had everything inside the if commented out because of this compile issue…)

TArray<TSharedPtr<FJsonValue>> OriginalArray;
RootJsonObject->TryGetArrayField(TEXT(""), OriginalArray);

I get this VS syntax error that I’m not sure how to interpret.

'bool FJsonObject::TryGetArrayField(const FString &, const TArray<TSharedPtr<FJsonValue,ESPMode::ThreadSafe>,FDefaultAllocator> *&) const':
cannot convert argument 2 from
    'TArray<TSharedPtr<FJsonValue,ESPMode::ThreadSafe>,FDefaultAllocator>'
to
    'const TArray<TSharedPtr<FJsonValue,ESPMode::ThreadSafe>,FDefaultAllocator> *&'

The compiler error is telling you what the function actually expects: the second argument of TryGetArrayField() is a pointer to a const TArray, and that pointer is itself passed by reference ('const TArray<TSharedPtr<FJsonValue>>*&'). In other words, you don’t pass in your own TArray to be filled; you declare a pointer and the function points it at the array that already lives inside the FJsonObject. See the sketch below; that should clear up the error on that line. Also note that TryGetArrayField() can fail, and it returns true only when the field exists and is an array, so check its return value.
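
A minimal corrected version of that call (the field name here is a placeholder):

// Sketch: TryGetArrayField() hands back a pointer to the array stored inside the
// FJsonObject, so you declare a pointer and let the function fill it in.
const TArray<TSharedPtr<FJsonValue>>* OriginalArray = nullptr;
if (RootJsonObject->TryGetArrayField(TEXT("SomeArrayField"), OriginalArray) && OriginalArray != nullptr)
{
    for (const TSharedPtr<FJsonValue>& Element : *OriginalArray)
    {
        // ... work with each element ...
    }
}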

What do you mean by “it fails” in the call to Deserialize()? Does it do nothing? Does it crash? Does it give you garbage?

What I mean is that even though I am sending in the JSON data generated by this call:

FString jsonStr = _myTable->GetTableAsJSON();

the Deserializer always returns false for some reason…

What is even odder is that if I step into the function and step through the entire thing line by line, it always hits the return true; statement. However, when I step past return true; there is about a 10-second pause where something is happening in the background, then it hits the closing } brace and drops back into my code, where retVal is false.

I was hoping I could see where it’s failing inside the Deserialize function, but… can’t tell. I’m assuming there is a setting in VS 2019 that I don’t have set up properly???
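
One thing I’m going to try next: the string coming back from GetTableAsJSON() starts with [ (a top-level array), and I suspect the Deserialize overload that fills a TSharedPtr<FJsonObject> only succeeds when the root is an object, so it may be returning false for that reason alone. There is an overload that deserializes into an array of values instead. A rough sketch of what I have in mind (untested, and the logging/field names are just mine):

#include "Dom/JsonObject.h"
#include "Dom/JsonValue.h"
#include "Serialization/JsonReader.h"
#include "Serialization/JsonSerializer.h"

// GetTableAsJSON() gives back a top-level JSON array of row objects,
// so deserialize into an array of values rather than a single object.
FString JsonStr = _myTable->GetTableAsJSON();

TArray<TSharedPtr<FJsonValue>> Rows;
TSharedRef<TJsonReader<TCHAR>> Reader = TJsonReaderFactory<TCHAR>::Create(JsonStr);

if (FJsonSerializer::Deserialize(Reader, Rows))
{
    for (const TSharedPtr<FJsonValue>& RowValue : Rows)
    {
        const TSharedPtr<FJsonObject> RowObject = RowValue->AsObject();
        if (RowObject.IsValid())
        {
            FString ManufactureName;
            RowObject->TryGetStringField(TEXT("ManufactureName"), ManufactureName);
            UE_LOG(LogTemp, Log, TEXT("Row: %s"), *ManufactureName);
        }
    }
}
else
{
    UE_LOG(LogTemp, Warning, TEXT("Deserialize failed on the data table JSON"));
}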