I have an FMemoryReader that deserializes a packet of network data into two integers and a TArray, like so:
FMemoryReader reader(CommandBytes);
reader << ModValue; // an int32
reader << IntValue; // another int32
reader << Metadata; // Metadata is a TArray<uint8>
On occasion, this packet may be corrupted (strictly speaking, decryption fails), leaving bogus data in CommandBytes.
Unfortunately, this means that upon reaching the reader << Metadata;
line, the serialized array-length integer may be complete garbage, and the deserializer ends up trying to allocate an unreasonably large TArray.
I’ve tried setting limits like so:
reader.SetLimitSize(CommandBytes.Num()); // Accident forgiveness
reader.ArMaxSerializeSize = CommandBytes.Num();
But neither seems to have any effect.
I already know the maximum number of bytes the array should contain (it comes straight from the source TArray; it's not compressed in any way), so I just need to stop the reader from allocating a massive TArray. Validation logic elsewhere already handles the result, and the bad packet would simply be discarded.
How can I stop this insanity?