While writing a C++ bitmask, I was confounded by my inability to correctly handle more than five flag values. After an hour and a half of staring at the code and doubting myself, I re-declared the bitmask's enum using bit-shifting syntax (1 << 0) instead of hex (0x1), and suddenly it worked as intended.
Preferred Syntax - Fails with flags 0x16 or greater
UENUM(Meta = (Bitflags))
enum class ERichBlock : uint8 {
    None    = 0x0,
    Under   = 0x1,
    Strike  = 0x2,
    Justify = 0x4,
    Indent  = 0x8,
    Super   = 0x16,
    High    = 0x32
};
ENUM_CLASS_FLAGS(ERichBlock);
// Testing assignment of the mask value
ERichBlock mask = (ERichBlock::Under | ERichBlock::Strike); //Value: 3
/* VS2022 Debugger output */
mask & (ERichBlock::Under) //Value: 1
mask & (ERichBlock::Strike) //Value: 2
mask & (ERichBlock::Justify) //Value: 0
mask & (ERichBlock::Indent) //Value: 0
mask & (ERichBlock::Super) //Value: 2 <-- INCORRECT!
mask & (ERichBlock::High) //Value: 2 <-- INCORRECT!
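For readers without a UE project handy, here is a minimal standalone repro of the hex version (plain C++, no UE headers; the operator overloads are my stand-ins for what ENUM_CLASS_FLAGS generates, and the printf calls are purely illustrative):
#include <cstdint>
#include <cstdio>

enum class ERichBlock : uint8_t {
    None    = 0x0,
    Under   = 0x1,
    Strike  = 0x2,
    Justify = 0x4,
    Indent  = 0x8,
    Super   = 0x16,
    High    = 0x32
};

// Stand-ins for the operators ENUM_CLASS_FLAGS(ERichBlock) would generate.
constexpr ERichBlock operator|(ERichBlock a, ERichBlock b) {
    return static_cast<ERichBlock>(static_cast<uint8_t>(a) | static_cast<uint8_t>(b));
}
constexpr ERichBlock operator&(ERichBlock a, ERichBlock b) {
    return static_cast<ERichBlock>(static_cast<uint8_t>(a) & static_cast<uint8_t>(b));
}

int main() {
    ERichBlock mask = ERichBlock::Under | ERichBlock::Strike;
    printf("Super: %u\n", static_cast<unsigned>(mask & ERichBlock::Super)); // prints 2
    printf("High:  %u\n", static_cast<unsigned>(mask & ERichBlock::High));  // prints 2
}
Any C++11 compiler reproduces the debugger output above: both the Super and High tests yield 2.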
Bit-shifting Syntax - Works as intended
UENUM(Meta = (Bitflags))
enum class ERichBlock : uint8 {
    None    = 0,
    Under   = 1 << 0,
    Strike  = 1 << 1,
    Justify = 1 << 2,
    Indent  = 1 << 3,
    Super   = 1 << 4,
    High    = 1 << 5
};
ENUM_CLASS_FLAGS(ERichBlock);
// Testing assignment of the mask value
ERichBlock mask = (ERichBlock::Under | ERichBlock::Strike); //Value: 3
/* VS2022 Debugger output */
mask & (ERichBlock::Under) //Value: 1
mask & (ERichBlock::Strike) //Value: 2
mask & (ERichBlock::Justify) //Value: 0
mask & (ERichBlock::Indent) //Value: 0
mask & (ERichBlock::Super) //Value: 0
mask & (ERichBlock::High) //Value: 0
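As an aside, instead of inspecting raw & results in the debugger, UE also ships EnumHasAnyFlags / EnumHasAllFlags helpers in Misc/EnumClassFlags.h (the same header that provides ENUM_CLASS_FLAGS). A sketch of the same checks using those helpers, assuming the working bit-shift declaration above:
#include "Misc/EnumClassFlags.h"

ERichBlock mask = ERichBlock::Under | ERichBlock::Strike;

// True: Under is set in mask.
bool bHasUnder = EnumHasAnyFlags(mask, ERichBlock::Under);
// True: both Under and Strike are set in mask.
bool bHasBoth  = EnumHasAllFlags(mask, ERichBlock::Under | ERichBlock::Strike);
// False: Super (1 << 4) is not set in mask.
bool bHasSuper = EnumHasAnyFlags(mask, ERichBlock::Super);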
Q1
Is there a specific reason why declaring the enum's values using 0x (hex) literals results in erroneous handling?
Q2
Are there other syntax gotchas one should be aware of when working with UE’s C++?