Hello,
I wrote this code:
UENUM(BlueprintType, meta = (Bitflags))
enum class EMyEnum : uint8
{ // as bitmask
    A, // 1
    B, // 2
    C, // 4
    D, // 8
    E, // 16
    F, // 32
    G, // 64
    ERROR, // -128 ???
    I, // 256
};
ENUM_CLASS_FLAGS(EMyEnum);
Now I don’t really get why all the values come out correct except ERROR, which is negative in my case.
All the values convert fine (including I), and in hex G is correct and equals 0x00000040 and I equals 0x00000100, but ERROR equals 0xFFFFFF80…
Do you have any idea why that is?
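To illustrate what the bit pattern suggests to me: 0xFFFFFF80 is exactly what you get when bit 7 of an 8-bit value is treated as signed and then sign-extended to 32 bits. Here is a minimal standalone sketch (plain C++, no Unreal types, names are made up) that reproduces the same pattern; I am only guessing that something like this happens under the hood:

#include <cstdint>
#include <cstdio>

int main()
{
    // Bit 7 of an 8-bit value, i.e. the mask I would expect for ERROR (128 / 0x80).
    const uint8_t RawFlag = 1u << 7;

    // If the value passes through a signed 8-bit type somewhere,
    // 0x80 is read back as -128 and then sign-extended when widened to 32 bits.
    const int8_t  Signed8 = static_cast<int8_t>(RawFlag); // -128
    const int32_t Widened = Signed8;                       // sign extension

    std::printf("unsigned: 0x%08X\n", static_cast<uint32_t>(RawFlag)); // 0x00000080
    std::printf("widened : 0x%08X\n", static_cast<uint32_t>(Widened)); // 0xFFFFFF80
    return 0;
}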
To add some context: I use a DataAsset in which I defined a UPROPERTY like this:
UPROPERTY(EditAnywhere, meta=(Bitmask, BitmaskEnum = "EMyEnum"))
int id;
I created a few DA instances, and in every one of them I set this id to only one enum value (bit flag), and I ran into this issue.
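In case it is relevant, this is roughly how I would read a single flag back from the int property, assuming (as far as I understand the Bitflags meta without UseEnumValuesAsMaskValuesInEditor) that the editor treats each enum entry as a bit index (A = bit 0, …, ERROR = bit 7, I = bit 8). The helper below is just an illustration, not code from my project:

// Hypothetical helper, assuming enum entries are bit indices rather than mask values.
static bool HasFlag(int32 Mask, EMyEnum Flag)
{
    return (Mask & (1 << static_cast<int32>(Flag))) != 0;
}

// e.g. when reading the DataAsset (MyDataAsset is a placeholder variable):
// const bool bHasError = HasFlag(MyDataAsset->id, EMyEnum::ERROR);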