In Game Money - Representing Numbers > Int32 (2,147,483,647)?

For the current project I’m working on, it’s entirely possible for the player to have wealth (ignoring cents) in the several hundred billions, as well as to go into debt. The int32 type has a maximum value of ±2,147,483,647, which is of course too small; uint32 lacks negatives and still only reaches 4,294,967,295. Furthermore, Blueprints don’t support unsigned integers, and I like to keep as much as possible exposed to Blueprints for ease of use.

What is the best way to get around this? What I’m thinking of right now is having two values, int32 Money and int32 Billions, where Money holds the lower nine digits (for example, 999,999,999) and Billions holds the tenth digit and beyond. This could also be useful in another way: after a while it might be nicer to show the player funds as, say, 1.2bn rather than 1,219,805,641.
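Roughly what I have in mind, as a minimal sketch (plain C++; the SplitFunds, Total, and PrintShort names are just illustrative):

```cpp
#include <cstdint>
#include <cstdio>
#include <cstdlib>

// Sketch of the split representation: Money holds the lower nine digits,
// Billions holds everything from the tenth digit up. Both are signed so the
// total can go negative (debt); the two fields share the same sign.
struct SplitFunds
{
    int32_t Billions = 0; // units of 1,000,000,000
    int32_t Money = 0;    // -999,999,999 .. 999,999,999

    // Combine into a 64-bit total for math or full display.
    int64_t Total() const
    {
        return static_cast<int64_t>(Billions) * 1000000000LL + Money;
    }

    // Short display form, e.g. Billions = 1, Money = 219,805,641 -> "1.2bn".
    void PrintShort() const
    {
        if (Billions != 0)
        {
            // One decimal place of billions.
            std::printf("%d.%dbn\n", Billions, std::abs(Money) / 100000000);
        }
        else
        {
            std::printf("%d\n", Money);
        }
    }
};
```

The fiddly part would be carrying between the two fields whenever an addition or subtraction pushes Money past ±999,999,999.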

Thoughts?

int64 / uint64 :wink:

You’ll have to write your own functions to perform math operations on them, though, or to convert them to text, etc. The FMath library only really works with 32-bit and smaller types, for the most part.
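For the text side, a comma-grouping helper might look something like this (plain C++ sketch; FormatMoney is a made-up name, and in Unreal you’d presumably wrap something like it in a function exposed to Blueprints):

```cpp
#include <cstdint>
#include <string>

// Minimal sketch: format an int64 with comma grouping, e.g.
// FormatMoney(1219805641) -> "1,219,805,641".
// (Ignores the INT64_MIN edge case, where negation overflows.)
std::string FormatMoney(int64_t Value)
{
    std::string Digits = std::to_string(Value < 0 ? -Value : Value);
    std::string Out;
    int Count = 0;
    // Walk the digits from least to most significant, inserting a comma
    // before every group of three.
    for (auto It = Digits.rbegin(); It != Digits.rend(); ++It)
    {
        if (Count != 0 && Count % 3 == 0)
        {
            Out.insert(Out.begin(), ',');
        }
        Out.insert(Out.begin(), *It);
        ++Count;
    }
    if (Value < 0)
    {
        Out.insert(Out.begin(), '-');
    }
    return Out;
}
```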

Make your own money struct.
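Something along these lines, for instance (a minimal sketch; the FMoney name and the int64 backing field are my own assumptions):

```cpp
#include <cstdint>

// Hypothetical money struct backed by a single signed 64-bit count.
// Routing all arithmetic through one type makes it easy to add overflow
// checks later, or to swap out the representation entirely.
struct FMoney
{
    int64_t Amount; // whole currency units (ignoring cents, as in the OP)

    explicit FMoney(int64_t InAmount = 0) : Amount(InAmount) {}

    FMoney operator+(const FMoney& Other) const { return FMoney(Amount + Other.Amount); }
    FMoney operator-(const FMoney& Other) const { return FMoney(Amount - Other.Amount); }
    bool CanAfford(const FMoney& Cost) const { return Amount >= Cost.Amount; }
};
```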

I would not have negative money. It’s unsigned. Debt would also be a separate, unsigned value. As anyone with a credit card knows, it’s very possible to have money in your account, but debt on your card.
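As a sketch of that split (the field and method names here are hypothetical):

```cpp
#include <cstdint>

// Sketch of the credit-card model: funds and debt are tracked separately,
// both unsigned, so neither can ever "go negative" on its own.
struct FPlayerFinances
{
    uint64_t Money = 0; // cash on hand
    uint64_t Debt = 0;  // owed to creditors

    // Net worth can still be negative, so compute it as signed
    // (assumes both values fit in an int64).
    int64_t NetWorth() const
    {
        return static_cast<int64_t>(Money) - static_cast<int64_t>(Debt);
    }

    // Paying down debt never drives either value below zero: the applied
    // amount is capped by both available cash and outstanding debt.
    void PayDebt(uint64_t Payment)
    {
        uint64_t Applied = Payment < Money ? Payment : Money;
        Applied = Applied < Debt ? Applied : Debt;
        Money -= Applied;
        Debt -= Applied;
    }
};
```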