Problem with binary literals in enums


UENUM(BlueprintType)
enum class CategoryFlags : uint8 {
	NONE =        0b00000000  UMETA(DisplayName = "None"),
	RANGED =      0b00000001  UMETA(DisplayName = "Ranged"),
	MELEE =       0b00000010  UMETA(DisplayName = "Melee"),
	PRIMARY =     0b00000100  UMETA(DisplayName = "Primary"),
	PRIMARY_R =   0b00000101  UMETA(DisplayName = "Primary Ranged"),
	PRIMARY_M =   0b00000110  UMETA(DisplayName = "Primary Melee"),
	SECONDARY =   0b00001000  UMETA(DisplayName = "Secondary"),
	SECONDARY_R = 0b00001001  UMETA(DisplayName = "Secondary Ranged"),
	SECONDARY_M = 0b00001010  UMETA(DisplayName = "Secondary Melee"),
	ARMOUR =      0b00010000  UMETA(DisplayName = "Armour"),
	MISC =        0b00100000  UMETA(DisplayName = "Misc")
};

This code snippet doesn't seem to compile. It works fine as a regular non-UENUM enum with the UMETA tags removed.

Is there some way to get these working, are binary literals simply not usable with UENUMs, or is this a bug?

The errors it gives are:
"Missing '}' in 'Enum'" on the NONE line,
and, with the first literal replaced with 0,
"Explicitly specified enum values must be greater than any previous value and less than 256" on the next line.

0b00000001?
Is that a binary literal? First time I've seen this.
Maybe UENUM is also not aware this is possible?

Try



UENUM(BlueprintType)
enum class CategoryFlags : uint8 {
	NONE =        0x00  UMETA(DisplayName = "None"),
	RANGED =      0x01  UMETA(DisplayName = "Ranged"),
	MELEE =       0x02  UMETA(DisplayName = "Melee"),
	PRIMARY =     0x04  UMETA(DisplayName = "Primary"),
	PRIMARY_R =   0x05  UMETA(DisplayName = "Primary Ranged"),
	PRIMARY_M =   0x06  UMETA(DisplayName = "Primary Melee"),
	SECONDARY =   0x08  UMETA(DisplayName = "Secondary"),
	SECONDARY_R = 0x09  UMETA(DisplayName = "Secondary Ranged"),
	SECONDARY_M = 0x0A  UMETA(DisplayName = "Secondary Melee"),
	ARMOUR =      0x10  UMETA(DisplayName = "Armour"),
	MISC =        0x20  UMETA(DisplayName = "Misc")
};


Yeah, binary literals are a part of C++14. Not surprising you haven't seen 'em before.

Hex literals seem to work fine, but if I’m gonna use those, I might as well just use regular numbers instead.

Does UE support C++14?
If it works with hex/decimal values but not binary values while C++14 is supported, then it might be a bug :slight_smile: