Referring to this (unanswered) question, I'm escalating the issue because I believe it is a bug: bitmasks created in C++ and via Blueprint return different values, and the Blueprint-created bitmask is incorrect.
Steps to reproduce:
In a C++ actor, define a bitmask enum and property as follows:
UENUM( BlueprintType, meta=(Bitflags) )
enum ETest
{
    BITFLAG1 = 1 UMETA( DisplayName = "BitFlag 1" ),
    BITFLAG2 = 2 UMETA( DisplayName = "BitFlag 2" ),
    BITFLAG3 = 4 UMETA( DisplayName = "BitFlag 3" )
};
ENUM_CLASS_FLAGS( ETest )
UCLASS()
class TEST_API ATestActor : public AActor
{
    GENERATED_BODY()

public:
    ATestActor();

    UPROPERTY( BlueprintReadWrite, Category = "Test", meta=(Bitmask, BitmaskEnum="ETest") )
    int32 TestBitmask;
};
In the constructor, set the first and third flags of the mask. With those two bits set, the integer value of the mask should be 5 (1 | 4):
ATestActor::ATestActor()
{
    TestBitmask = 0;
    TestBitmask |= ETest::BITFLAG1;
    TestBitmask |= ETest::BITFLAG3;
}
Next, create a new Blueprint derived from this actor. On the BeginPlay event, get the value of TestBitmask and print it to the screen; it displays 5, as expected.
Now add a Make Bitmask node, set its Bitmask Enum to ETest, and check BitFlag 1 and BitFlag 3. Since this replicates the bitmask created in C++, I expect it to return 5 as well. However, printing the return value gives 18.
Any bitwise logic operations performed on this incorrect value will yield unexpected results.