Importing a Data Table CSV

Hello,

I’m trying to import a CSV as a data table, but when I do so the only option for the DataTable row type is “GameplayTagTableRow”. I’m not sure how to create my own C++ file so Unreal knows what DataTable row type to use. I’ve looked at the documentation, but I don’t see where I would save this custom C++ file or where I would reference it in the Unreal Editor so it pulls that one instead of the current option.

The C++ file is just an example; you can also access the data via Blueprints.
Simply create a new DataTable asset and populate it with data, or import a CSV in the asset editor.
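For reference, a custom row type in C++ is just a USTRUCT deriving from FTableRowBase (the engine's standard pattern); the struct and property names below are made-up examples:

```cpp
// MyItemRow.h -- a custom DataTable row type (names here are hypothetical).
#pragma once

#include "CoreMinimal.h"
#include "Engine/DataTable.h"
#include "MyItemRow.generated.h"

USTRUCT(BlueprintType)
struct FMyItemRow : public FTableRowBase
{
    GENERATED_BODY()

    // Each UPROPERTY becomes a column; CSV headers must match these names.
    UPROPERTY(EditAnywhere, BlueprintReadOnly, Category = "Item")
    FString DisplayName;

    UPROPERTY(EditAnywhere, BlueprintReadOnly, Category = "Item")
    int32 Cost = 0;
};
```

Once this compiles, the struct should show up in the row-type dropdown when you create or import a DataTable.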

If you make a struct in editor, you can then import data tables with the same content as the struct.

One thing to note: the first column is always the row Name, and each entry in it needs to be unique in your CSV. Every other column header should match the name of the corresponding variable in the struct you defined.
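To make that concrete, here is what a CSV would look like for a struct with, say, DisplayName and Cost fields (hypothetical names); the first header cell is the row-name column and each row name must be unique:

```csv
Name,DisplayName,Cost
Sword,"Iron Sword",100
Shield,"Wooden Shield",50
```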

KVogler, yes, but when I create a new DataTable, it requires me to pick a structure and there is only one option available. I need a custom structure, not the provided one.

Stuckbug, how do I make a struct in the editor? I think perhaps an example would help, I am fairly new to blueprint.

Thank you for the speedy responses!

Edit: I found how to make a struct in the editor. Thanks for the help, this is so simple now! :slight_smile:

I have a few questions now that I haven’t found a solution to:

1. My data table is being looped through in reverse for some reason. How can this be fixed?
2. How can I access a specific column of a data table as I am looping through it? **Edit: The “Break <struct name>” node seems to do this for me.**
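For anyone doing the same from C++, the Blueprint loop plus “Break” node roughly corresponds to UDataTable::GetAllRows and then reading a field off each row pointer (FMyItemRow and its Cost field are hypothetical example names):

```cpp
#include "Engine/DataTable.h"

void PrintAllRows(UDataTable* Table)
{
    // Collect pointers to every row, interpreted as our custom row struct.
    TArray<FMyItemRow*> Rows;
    Table->GetAllRows<FMyItemRow>(TEXT("PrintAllRows"), Rows);

    for (const FMyItemRow* Row : Rows)
    {
        // Reading a member here is the C++ analogue of the Break node.
        UE_LOG(LogTemp, Log, TEXT("Cost: %d"), Row->Cost);
    }
}
```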

[Screenshot: Blueprint row loop and the output log showing the printed rows]

In what way are the results reversed? I would guess they are actually in 1-2-3-4 order.
The log displays new items at the top (which is why “done!” is the top line). So it does report 1-2-3-4 …

Yeah, that looks like it’s going in the correct order to me, since each newest item pushes the older ones down, so “done!” is the most recent print and 1 was the first.

I am actually having an interesting problem where the first row in my data set doesn’t work correctly: it doesn’t seem to pick up any changes to it at all and just ignores the row. My quick fix has been to add a dummy default row and push my actual elements down one.

Ah, you are right, that is in the correct order. I didn’t realize it was pushing the prints down as it went.

ZoltanJr, the first row of the CSV should be your column headers. Those wouldn’t be updated, since they tell Unreal what each column represents.

I meant the first row of actual data not working correctly when I use it in Blueprints; all of the headers are working fine. I haven’t debugged it much yet, though, so it may be an issue elsewhere in my Blueprints.

Something else I have been struggling with at the moment: does anybody know the correct format, or whether it is even possible, to put array data into a spreadsheet cell? One field in my data set is an array. It works fine to edit in Unreal, but I cannot for the life of me figure out a format the importer will accept, which means I have to enter the data in the editor every time, and re-importing the spreadsheet wipes it out. I’m having the same issue with colours.

I’m not sure the CSV format allows for nested data structures, unless you introduce a second delimiter symbol.
What you could do is create an array field with only one entry and keep the actual array data in data assets of their own, then fetch those and copy them into the first one…
So kind of flattening out the data.
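On the array question above: as far as I know, the DataTable importer parses each cell with Unreal’s standard text-import syntax, so a quoted, parenthesized value is worth trying before resorting to flattening (this is version-dependent and not guaranteed; the field names below are hypothetical):

```csv
Name,DisplayName,Loot,Tint
Chest01,"Old Chest","(Gold,Gem,Key)","(R=1.0,G=0.5,B=0.0,A=1.0)"
```

Here Loot would be a string array and Tint a LinearColor; the inner commas are protected from the CSV parser by the surrounding quotes.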

I would love to see an XML import option :rolleyes:

XML import would be really wonderful to see, as well as just better functionality when working with Excel.

Yeah, I was considering this too, but it has its own issues: it means multiple spreadsheets and makes things harder to reference. From what I have found, there is no way to configure UE4’s understanding of delimiters; it just does basic CSV and only looks for the one delimiter. XML and proper JSON support would be great too.