I did what you suggested: I take the whole array, copy it into a “temp” array of size 50, and send that to a replicated variable in chunks until the “big” array is empty. It works to some extent.
If my big array has 3,449 elements, it works. But the moment I use a bigger array, I get this error:
LogNetPartialBunch: Error: Attempted to send bunch exceeding max allowed size. BunchSize=346302, MaximumSize=65536 Channel: [UActorChannel]
Can I change this maximum size? And if not, do you know any workarounds? I’ve looked into Fast TArray Replication but I’m not sure if that’s the way to go.
Honestly, I’m not super adept with it either. I do know that individual items serialize much smaller in C++; the overhead of Blueprint means a lot of extra data being replicated. I’d go the C++ route if possible, plus you can use array slicing, which reduces server memory usage.
Fast TArray Replication is fantastic, but it’s just a way to speed up replication and has nothing to do with size constraints. If you can’t disseminate the CSV itself to the clients and let them build the struct data locally, then your only options are to trim as much data from the struct as you can, replicate in chunks (see the sketch below), and/or rewrite your replicated actor (or create an actor component) in C++.
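For the chunked route, here’s a minimal C++ sketch of what that could look like; everything named here (AMyDataActor, FMyRow, SendAllChunks, ReceiveChunk) is a placeholder, and the chunk size of 50 just mirrors what you described:

```cpp
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "MyDataActor.generated.h"

// Sketch only: the server slices a large array and pushes each slice to
// clients through a reliable RPC, so no single bunch has to carry the
// whole payload. Assumes FMyRow is a USTRUCT and the actor replicates
// (bReplicates = true in the constructor).
UCLASS()
class AMyDataActor : public AActor
{
	GENERATED_BODY()

public:
	// Server-side: split the big array and send each slice in order.
	void SendAllChunks(const TArray<FMyRow>& BigArray)
	{
		const int32 ChunkSize = 50; // tune so each RPC stays well under the 64 KB bunch limit
		for (int32 Start = 0; Start < BigArray.Num(); Start += ChunkSize)
		{
			const int32 Count = FMath::Min(ChunkSize, BigArray.Num() - Start);
			ReceiveChunk(TArray<FMyRow>(BigArray.GetData() + Start, Count), Start, BigArray.Num());
		}
	}

	// Reliable multicast: every client receives every chunk, in order.
	UFUNCTION(NetMulticast, Reliable)
	void ReceiveChunk(const TArray<FMyRow>& Chunk, int32 Offset, int32 TotalNum);

private:
	TArray<FMyRow> AssembledArray; // client-side reassembly buffer
};

void AMyDataActor::ReceiveChunk_Implementation(const TArray<FMyRow>& Chunk, int32 Offset, int32 TotalNum)
{
	AssembledArray.SetNum(TotalNum);
	for (int32 Index = 0; Index < Chunk.Num(); ++Index)
	{
		AssembledArray[Offset + Index] = Chunk[Index];
	}
}
```

Keep in mind each reliable RPC still travels as its own bunch, so it’s the per-chunk payload that has to respect the limit, not the total array size; that’s why the chunk size is the knob that matters.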
There used to be “MaxArraySize” and “MaxArrayMemory” or something to that effect in the project settings, but I can’t seem to find them now. They would allow you to set the max chunk size. If all else fails, you can take an extremely unsafe route: subclass UNetConnection in C++ and call SetUnlimitedBunchSizeAllowed. But you open yourself up to attack by doing so, and possibly self-sabotage, especially on end-user machines with limited throughput.
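If it helps, I believe the settings I was thinking of are actually console variables, net.MaxRepArraySize and net.MaxRepArrayMemory; I’m going from memory here, so check whether your engine version still exposes them. A quick sketch of raising them from C++:

```cpp
#include "HAL/IConsoleManager.h"

// Assumption: these CVar names are from memory (UE4-era); verify they
// still exist in your engine version before relying on them.
void RaiseRepArrayLimits()
{
	if (IConsoleVariable* MaxSize =
			IConsoleManager::Get().FindConsoleVariable(TEXT("net.MaxRepArraySize")))
	{
		MaxSize->Set(65535); // max element count for a single replicated array
	}
	if (IConsoleVariable* MaxMem =
			IConsoleManager::Get().FindConsoleVariable(TEXT("net.MaxRepArrayMemory")))
	{
		MaxMem->Set(65535); // max serialized bytes for a single replicated array
	}
}
```

Note that these cap replicated array size and memory, not the bunch size itself, so chunking remains the safer fix.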
I was trying to replicate variables, which is why I was getting this error.
Now that I’m using RPCs, it seems to be solved. I’m still doing everything else, like splitting into smaller arrays and so on. Thank you Jared, and again, congrats on your plugin. I’ve been using it for about 2 years now.
Do you offer a product that supports runtime CSV import/export for UE’s built-in DataTables?
My situation is that the Ability System’s InitStats() function requires a DataTable input. I’m updating a local CSV from Google Sheets and need to update the UE DataTable prior to calling InitStats(). I could use the feature in other situations as well.
@TechLord thank you for your question! At present the plugin does not actually work with Data Tables as they exist in the engine; it is a custom solution instead. This is functionality I would like to add in time as my own need for it grows, but I don’t have the time right now. When I do add it in the future, it will be included in the base plugin for everyone who owns a copy.
You can still pull in data from Google Sheets or another CSV source and add it row by row to an existing data table using a combination of RDT and a plugin like this!
Fill struct array with RDT → ForEach on struct array → per struct instance, add row to DataTable with a plugin like the one linked → use the DataTable in the Ability System
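If you ever touch C++ for this, the engine can also populate a UDataTable directly from a CSV string at runtime, which would slot in right before InitStats(). A minimal sketch, assuming your CSV columns match the FAttributeMetaData row layout that InitStats() expects (the function name and how you obtain CsvString are up to you):

```cpp
#include "AbilitySystemComponent.h"
#include "AttributeSet.h"
#include "Engine/DataTable.h"

// Sketch only: build a transient DataTable from CSV text fetched at
// runtime (e.g. from Google Sheets), then hand it to the Ability System.
void InitStatsFromCsv(UAbilitySystemComponent* ASC,
                      TSubclassOf<UAttributeSet> AttributeSetClass,
                      const FString& CsvString)
{
	UDataTable* Table = NewObject<UDataTable>(GetTransientPackage());
	Table->RowStruct = FAttributeMetaData::StaticStruct(); // the layout InitStats expects

	// Returns a list of per-row problems; empty means the import was clean.
	const TArray<FString> Problems = Table->CreateTableFromCSVString(CsvString);
	if (Problems.Num() == 0)
	{
		ASC->InitStats(AttributeSetClass, Table);
	}
}
```

The Blueprint pipeline above does the same thing without any C++, though, so this is only worth it if you’re already in native code for the Ability System.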
I apologize! I was on mobile and didn’t check the supported versions. Indeed, there are a few options for that on Fab; I’m glad you found one that works! Let me know if you need help interfacing the two.
For users looking for a 5.5 version, it was approved over a week ago but has not appeared in the launcher yet for some reason. I have contacted Epic about this. If you need a 5.5 version right now, please contact me and I’ll send the source over, but you will need to build binaries on your own.