I created the following enum for use as a bitmask.
UENUM(BlueprintType, meta = (Bitflags))
enum class EMovementTrackingFlags : uint8
{
    None = 0x00,
    X = 0x01,
    Y = 0x02,
    Z = 0x04
};
ENUM_CLASS_FLAGS(EMovementTrackingFlags);
I then implement it as a public variable as follows.
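Roughly like this (the exact specifiers may differ; the Bitmask / BitmaskEnum meta is the same pattern used later in the thread):

// Illustrative property declaration exposing the enum as a bitmask in the editor
UPROPERTY(EditAnywhere, BlueprintReadWrite, meta = (Bitmask, BitmaskEnum = "EMovementTrackingFlags"))
uint8 MovementTrackingFlags;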
In the Details panel, the property appears as a set of checkboxes, and mixed flags can be ticked as intended. However, when I evaluate the flags I get erroneous results. Outputting the value of Movement Tracking Flags as an integer shows that “None” equals 1, “X” equals 2, “Y” equals 4, and “Z” equals 16, despite their values being set explicitly to 0, 1, 2, and 4, respectively.
Please note that a “None” value of 0 is designated as a requirement in [the Unreal 4 Coding Standard page][2].
I’ve found a work-around by eliminating all the explicit values and removing the “None” enumerator. X, Y, and Z take the expected values when this is the case.
UENUM(BlueprintType)
enum class EMovementTrackingFlags : uint8
{
    X,
    Y,
    Z
};
ENUM_CLASS_FLAGS(EMovementTrackingFlags);
This necessitates work-arounds in order to evaluate the flags.
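For example, testing a flag then means shifting by the enum’s index rather than using the enum value directly – roughly:

// The editor stores 1 << EnumValue, so tests have to shift by the enum value too
uint8 Flags = 0; // value assigned through the Details panel
const bool bTracksY = (Flags & (1 << static_cast<uint8>(EMovementTrackingFlags::Y))) != 0;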
I have a theory that this is an issue with how the editor sets the value of the bitflag, rather than the C++ declaration itself being wrong. I’m going to dig in a bit more today.
This observation appears to be correct. While digging in, I noticed that values set at the C++ level evaluated correctly, but values set through the editor were offset as described above. The only way to fix them was to set the integer values explicitly – not with hex or bit-shift operators, but with plain integer constants: 1, 2, 4. Once I did that, everything behaved consistently.
OK – after talking to Epic staff, this appears to be the correct solution. Posting it here so others can make use of this awesome feature!
There does appear to be an inconsistency between the Coding Standards and how the Bitflag enum should be declared. See the Properties Documentation for the proper syntax.
Here is an example of how to declare a Bitflag enum:
UENUM(Blueprintable, Meta = (Bitflags))
enum class EHeroActionTypeFlags
{
    Movement,
    Attack,
    Dodge,
    Climb,
    CustomReaction,
};
Note that I’m not using the ENUM_CLASS_FLAGS macro any more. Instead I’ve created three macros that handle the testing for me:
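They boil down to shifting by the enum value before testing, setting, or clearing – roughly the following (exact definitions may differ; the names match the TEST_BIT mentioned at the end of the thread):

// Treat the enum values as flag indices: the editor stores (1 << EnumValue),
// so every test, set, and clear shifts by the enum value as well.
#define TEST_BIT(Bitmask, Bit)  (((Bitmask) & (1 << static_cast<uint32>(Bit))) > 0)
#define SET_BIT(Bitmask, Bit)   ((Bitmask) |= (1 << static_cast<uint32>(Bit)))
#define CLEAR_BIT(Bitmask, Bit) ((Bitmask) &= ~(1 << static_cast<uint32>(Bit)))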
UE-32816 is tracking the issue, but as noted this is actually the designed behavior. You need to shift by your enum value to do all of your testing, setting, and clearing.
Does that mean the enum needs to be declared in code as uint32? I’m referring to the macros above. I’d like to use them in code as well, but I’m not sure yet how to declare the enums properly. It’s a mess.
This is actually as-designed behavior – the enum values are currently assumed to be flag indices rather than actual flag mask values. That is, the editor will compute (1 << N), where N is the value of the enum entry. This is consistent with how user-defined enums work on the editor side. I can see how it would be useful to be able to designate native C++ enums as literal mask values, though, especially when used alongside ENUM_CLASS_FLAGS().
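Applied to the enum from the original post, that mapping produces exactly the values reported above:

// The editor computes 1 << Value for each entry of the original enum:
static_assert((1 << 0x00) == 1,  "None shows up as 1");
static_assert((1 << 0x01) == 2,  "X shows up as 2");
static_assert((1 << 0x02) == 4,  "Y shows up as 4");
static_assert((1 << 0x04) == 16, "Z shows up as 16");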
Fortunately, to get the OP’s desired behaviour, all you have to do since UE-32816 is add the following to the top of your enum declaration:
UENUM(BlueprintType, meta = (Bitflags, UseEnumValuesAsMaskValuesInEditor="true"))
For those that need to expose the UENUM as a BlueprintType and need to get beyond the uint8 limit, you can define the UENUM using the namespace method:
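Roughly like this (a sketch – the enum name, entry names, and property name here are placeholders):

// A namespaced UENUM is not forced to be uint8-backed the way a
// Blueprint-exposed enum class is, so the bitmask property can be an int32
// and hold more than 8 flag indices.
UENUM(BlueprintType, Meta = (Bitflags))
namespace EWideTrackingFlags
{
    enum Type
    {
        Flag0,
        Flag1,
        Flag2,
        Flag3
        // ... more entries are fine; up to 32 flag indices fit an int32 property
    };
}

UPROPERTY(EditAnywhere, Meta = (Bitmask, BitmaskEnum = "EWideTrackingFlags"))
int32 WideTrackingFlags;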
Thank you for this addition! I think it was the missing piece preventing me from using bitflags correctly. Either I had all the bitflag values but each was mapped to the previous one, or the last one was missing and the first one couldn’t be ticked. So, for reference, here’s a code sample that works as expected on my machine (UE 4.17):
UENUM(BlueprintType, meta = (Bitflags, UseEnumValuesAsMaskValuesInEditor = "true"))
enum class EPointSequenceType : uint8
{
    Linear = 1,
    HorizontalSineWave = 2,
    VerticalSineWave = 4,
    Whirl = 8
};
UPROPERTY(EditAnywhere, Category = FLZSplinePCG, meta = (Bitmask, BitmaskEnum = "EPointSequenceType"))
uint8 SequenceType;
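With this declaration the stored byte holds the literal mask values (thanks to UseEnumValuesAsMaskValuesInEditor), so a flag can be tested with a plain bitwise AND, no shifting needed – something like:

const bool bWhirl = (SequenceType & static_cast<uint8>(EPointSequenceType::Whirl)) != 0;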
Is this still working for you guys? I’m trying to use this for online and it works, but I think my “TEST_BIT” isn’t correct and it’s causing my if statement to fail.