Hey, I'm generating quite a big list of transforms in the Construction Script (7-10k transforms) and I want to 'save' the list for further use.
For now I have a 'lock' boolean in the Construction Script: once I've generated the transforms (it takes a while, but it generates OK) I check the boolean, and the Construction Script no longer re-generates them - they stay 'saved' in the transform array variable. But every time I open the blueprint that has this big 'saved' list inside (or try to do anything with it), my editor lags horribly, almost freezing for a few minutes, and my HDD plays some distorted dubstep… So I guess this is not a good way of storing large array values?
What do you think - what's a better way of storing large transform lists that are generated once in the Construction Script for further use in UE?
My first idea, though I'm not sure about it: I can right-click the array variable and select 'Copy'. This copies all the transforms and I can paste them as text - it looks like they're all separated by commas. So maybe it would be possible to store them in a DataTable? But a DataTable needs a CSV-structured file and I don't know how to format one with that large an amount of data…
I would be grateful for some hints
Create a SaveGame object in the Construction Script from a 'SaveGameForTransforms' SaveGame class - the class contains a single Transform-type array.
When generating the transforms in the loop, store them directly into the SaveGame's Transforms array.
After generating, call Save Game to Slot.
… This works: all the transforms are saved to the save file and can be read back using Load Game from Slot.
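For anyone curious what that setup looks like outside of Blueprints, here's a minimal C++ sketch of the same idea. The class name 'SaveGameForTransforms' comes from the description above; the slot name, property name and helper function are my own assumptions:

// SaveGameForTransforms.h - minimal sketch of the SaveGame class described above
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/SaveGame.h"
#include "SaveGameForTransforms.generated.h"

UCLASS()
class USaveGameForTransforms : public USaveGame
{
    GENERATED_BODY()

public:
    // All generated instance transforms for one tile type
    UPROPERTY(VisibleAnywhere, BlueprintReadWrite, Category = "Transforms")
    TArray<FTransform> Transforms;
};

// Saving (e.g. right after the generation loop, in a .cpp) - note the array
// only lives in memory until SaveGameToSlot is actually called.
#include "Kismet/GameplayStatics.h"

void SaveGeneratedTransforms(const TArray<FTransform>& Generated)
{
    USaveGameForTransforms* SaveObject = Cast<USaveGameForTransforms>(
        UGameplayStatics::CreateSaveGameObject(USaveGameForTransforms::StaticClass()));

    SaveObject->Transforms = Generated;                                       // cached in memory
    UGameplayStatics::SaveGameToSlot(SaveObject, TEXT("GrassTransforms"), 0); // written to disk only here
}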
It looks like a solution, but I wonder why adding the same transforms to a transform array variable in a regular BP freezes so badly, while adding them to the SaveGame object doesn't?
Let’s say we’re in ‘TransformGenerator’ blueprint, in Construction Script:
ForLoop -> transforms.Add(generatedTransform); // Long freeze, HDD freaks out
ForLoop -> saveGameObject.transforms.Add(generatedTransform); // No freeze
Hmm, sounds reasonable, but aren't the SaveGame variables written to the HDD only after executing 'Save Game to Slot'? Maybe I'm mistaken, but I thought that if we push data into a SaveGame object it's cached in memory (persistent) and only written to the HDD when the Save Game to Slot function executes. I don't have much experience with the save system though.
I've tried copying my transform list into a DataTable (around 7k values) but it lags horribly when pasting into a row, and again when opening the DataTable - like a 2-3 minute computer freeze. The more values, the longer the lag. So it looks like DataTables are also not suitable for storing large sets of values… I'm quite at a dead end right now.
… Using the SaveGame still works without that lag, but I haven't tested whether it works after packaging the game - all the info is stored in a SaveGame file in the project folder, so after packaging it will be gone…?
Oh no, are you saying that Unreal Engine can't handle a lot of variables if we put them in the Blueprints? I've made a game script that has over two thousand variables in it. It's about 90 pages long, and all these variables my RPG sci-fi game uses to keep track of everything - what system you're in, how much stuff you've collected or picked up, what's changing in the game, all the storyline dialog triggers - are what gets put into the save file.
It's freezing the editor when a lot of values are saved in an array (either as a public variable or as a default blueprint value). I haven't tested it with a lot of separate variables, so I don't know about that case.
I'm working on world generation for my game and it's quite specific - semi-procedural, using premade tiles made in a 3D app. These premade tiles have fixed grass instance transforms, but I need to spawn all the instances dynamically (along with dynamically spawning/changing tiles) to make it work with my dynamic grass system. It works OK with my default grass spawner, which detects the terrain surface with line traces to determine instance transforms, but since my world is quite big, I'm trying to optimize the world generation process by generating all the transforms beforehand (e.g. in the Construction Script) and having an already-generated transform list for spawning instances on each tile.
This approach works fine and cuts some precious seconds from the world generation time, but the only way to do it without the big in-editor lag is using the SaveGame to store all the transforms. Tomorrow I'll test whether this solution works after packaging the game, since I'm not sure if SaveGame files saved in-editor transfer to the packaged game at all.
… I don't know whether I'm falling into the premature-optimization trap, but I want to check all the possibilities before I move forward.
I'm having a similar problem with an array of structs in my grid blueprint. Basically I divide the world into square grids and store the corner vectors in an array, along with a grid ID (an int). An array for a 30x40 grid (1200 entries) is enough to significantly slow down the editor, and I was going to use this for the entire world, so it's a significant problem. Would moving to C++ fix this issue?
I'm disappointed to see that DataTables aren't going to be able to handle my type of game if they cause slowdown/lag issues when cutting and pasting text variables into them…
This is such a pain. It's strange that these 3D game editors slow down or suffer from freezing issues when trying to store, cut or paste text, but I do notice it with the Blueprints.
Sounds like a memory leak or caching issue, or maybe it's something else, but it's as if it tries to cache memory each time it stores these variables. I've seen how slow the editor becomes when you add new variables to a blueprint - it gets slower the more variables you add.
Some games contain lots of variables for doing all the checking, updating and defining of values, boundaries and rules that the game needs to follow, and if you have a game like I do, with over 100 systems and worlds in it - a game similar in size to something like Mass Effect - then you need to define thousands of variables to cover all the systems, and the 3D engine should be able to handle that easily without slowing down the editor… Because you also have to cover all the storyline dialog triggers and events as well.
I suffer no slowdowns at all when I write all the script code in a text file outside the 3D game engine environment, in a normal text editor, because it's the only way I can develop my game ideas and test them out without all the hassles of the 3D engine slowing everything down. Yet the second stage of game development is getting all your script variables from the script into the 3D game engine.
But the editor isn't even optimised for it! Everything starts to slow down when you start adding in the variables…
The script file is 174k lines long and it has no problem - it runs smoothly because it's just text… I can have as much stuff as I want in the arrays and test all my ideas out because nothing slows down, unless of course the script file gets over several million lines long, and then you notice a slowdown in the execution of the script. But the arrays can be as big as I like because there are no limits on arrays. The biggest array list in my game script has over 250 slots, and I have several thousand arrays in the script, as my game is a slot-based, array-infested system - the script is just a bunch of text variables with array lists… But when it comes to putting all my arrays and variables into these Unreal Blueprints, I start to see slowdown problems in the editor.
And why has Unreal got this long hexadecimal key/encryption code signature ID each time you put some text in a widget? What is the point of that?
Do you know how to do an integer array with text dialog so I can get different responses back from the NPC each time I step into the trigger, Zireal07? It has to be done with text draw or with a text widget. I can't use Print String because that's for editor debugging only.
In the packaged game, could you have it generate the save file on first run if the save doesn't already exist? I.e., it checks whether the file exists and, if not, generates it. I feel like there is a better solution to this problem, though. If you get the format right in a CSV file and don't have to edit it again, is the performance adequate when loading from that, or is the SaveGame still better?
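A rough sketch of that first-run check, reusing the assumed class and slot name from the earlier sketch; GenerateTileTransforms() is just a placeholder for the expensive generation step described above:

// Hypothetical first-run check: load the cached transforms if the save exists,
// otherwise generate them once and write the save file.
#include "Kismet/GameplayStatics.h"

TArray<FTransform> GenerateTileTransforms(); // placeholder for the expensive generation step

TArray<FTransform> LoadOrGenerateTransforms()
{
    const FString SlotName = TEXT("GrassTransforms"); // assumed slot name

    if (UGameplayStatics::DoesSaveGameExist(SlotName, 0))
    {
        if (USaveGameForTransforms* Loaded = Cast<USaveGameForTransforms>(
                UGameplayStatics::LoadGameFromSlot(SlotName, 0)))
        {
            return Loaded->Transforms;
        }
    }

    // First run (or the load failed): generate once and cache to disk
    TArray<FTransform> Generated = GenerateTileTransforms();
    USaveGameForTransforms* SaveObject = Cast<USaveGameForTransforms>(
        UGameplayStatics::CreateSaveGameObject(USaveGameForTransforms::StaticClass()));
    SaveObject->Transforms = Generated;
    UGameplayStatics::SaveGameToSlot(SaveObject, SlotName, 0);
    return Generated;
}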
@tozan You won’t have this problem, as you will never need to load your entire script at one time in the game. Split your dialog up into logical segments.
Rama has a blueprint node that can import text files, though you would probably be better off using CSV files so you can more easily incorporate metadata into your in-game script.
I'm not sure, but if you divide the world during gameplay then C++ will probably help. You can also check the Nativize option - it converts your blueprints into C++, resulting in a good speed boost.
@tozan Yeah, I'm also curious why pasting these values into a BP variable needs so much RAM, while pasting them into e.g. a regular text file is almost instant and lag-free… Maybe it's not intended - some memory leak? But maybe someone with better low-level knowledge will help us understand it.
That's a good idea - if saves from the project folder aren't transferred into the packaged game, it could generate the values once, e.g. at first game launch. Yes, I also feel that this is a workaround-type solution, not really solid… But if all the other ways of storing large value sets cause massive lags, I might have no choice.
By CSV file you mean a DataTable asset stored in the project? It lags horribly when pasting the values into a DataTable row (and when opening it), but I haven't tried loading from it yet. I'll try it soon - maybe it won't lag when getting these values out of the DataTable, and then I could try to live with the long freezes when pasting values, since that only needs to be done once per tile (I'll have several dozen different tile types with different transform values, but storing them would be a one-time operation). Thanks for the ideas!
I ended up storing these large transform arrays in a DataTable. It still lags for 2-3 minutes with ~15k values, but in my case I only need to store them once, so I can live with that… When loading them from the DataTable there is no lag, so that's the solution I'll follow for now.
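In case it helps, here's a rough C++ sketch of reading such a DataTable back into a flat transform array. The row struct FTransformRow (one Transform column per row) and the function name are my own assumptions, not existing engine types:

// TransformRow.h - assumed row struct: one Transform per DataTable row
#pragma once

#include "CoreMinimal.h"
#include "Engine/DataTable.h"
#include "TransformRow.generated.h"

USTRUCT(BlueprintType)
struct FTransformRow : public FTableRowBase
{
    GENERATED_BODY()

    UPROPERTY(EditAnywhere, BlueprintReadOnly)
    FTransform Transform;
};

// Reading every row of a tile's DataTable into a flat array (e.g. in a .cpp)
TArray<FTransform> ReadTransformsFromTable(const UDataTable* TileTable)
{
    TArray<FTransform> Result;
    if (!TileTable)
    {
        return Result;
    }

    TArray<FTransformRow*> Rows;
    TileTable->GetAllRows<FTransformRow>(TEXT("ReadTransformsFromTable"), Rows);

    Result.Reserve(Rows.Num());
    for (const FTransformRow* Row : Rows)
    {
        Result.Add(Row->Transform);
    }
    return Result;
}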
I've created a report about the freezes: Editor freeze/memory leak when having many array values - Programming & Scripting - Epic Developer Community Forums
… I'm still not sure whether this is some kind of memory leak/bug or 'intended' behavior… But I just don't understand why it takes so long to paste these values into the editor, while pasting them into e.g. a regular text file results in zero lag. Maybe someone with better low-level knowledge could shed some light on this?
Alternatively, keep the source *.csv file in the project folder and simply overwrite it as needed. The editor should update it automatically, provided it is not open anywhere else.
Thank you, that looks like a very good way to prevent all these freezes - it only freezes when a large number of values is stored inside a single array, but if every value is a separate row, then it's OK. This solution needs a separate DataTable for every transform list, and in my case I need 20-30 sets like that. It would probably also require looping through all the DataTable rows to get the values, but I'm not sure whether that would cause a performance hit.
… I’ll test it on my end soon, thanks again!
… Of course, the lag is still there when we need to get all these generated transforms out of the Construction Script - as far as I know, the only way is to set the transforms as a public array variable and right-click -> Copy.
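For what it's worth, that manual right-click -> Copy step could probably be skipped by writing the .csv straight from code. A rough sketch below - the file location, the 'Transform' column name and the exact cell format the DataTable importer expects are all assumptions on my part (easiest is to copy a row out of an existing DataTable and match that format):

// Hypothetical exporter: one DataTable row per generated transform
#include "Misc/FileHelper.h"
#include "Misc/Paths.h"

bool ExportTransformsToCsv(const TArray<FTransform>& Transforms, const FString& FileName)
{
    FString Csv = TEXT("Name,Transform\n"); // header: row name + the single struct column

    for (int32 i = 0; i < Transforms.Num(); ++i)
    {
        const FQuat R = Transforms[i].GetRotation();
        const FVector T = Transforms[i].GetTranslation();
        const FVector S = Transforms[i].GetScale3D();

        // Cell format assumed to match FTransform's text export; verify against a real table row
        Csv += FString::Printf(
            TEXT("Row_%d,\"(Rotation=(X=%f,Y=%f,Z=%f,W=%f),Translation=(X=%f,Y=%f,Z=%f),Scale3D=(X=%f,Y=%f,Z=%f))\"\n"),
            i, R.X, R.Y, R.Z, R.W, T.X, T.Y, T.Z, S.X, S.Y, S.Z);
    }

    // Assumed location: <Project>/Content/TileTransforms/<FileName>
    const FString Path = FPaths::ProjectContentDir() / TEXT("TileTransforms") / FileName;
    return FFileHelper::SaveStringToFile(Csv, *Path);
}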
Not sure how your system is set up and whether this is applicable, but storing 30 columns of transforms in a single DataTable is definitely doable. I can imagine it being somewhat cumbersome to handle, though. You could write a small manager that extracts the data set and loads it up into an array for further processing and much quicker iteration. You can dedicate a column or two to tracking purposes.
I worked with a large enough DataTable to need another, much smaller DataTable that just tracked pertinent indexes of the large DT. Think simplified lookup table.