I have plenty of experience with regular bitmasks/bitflags, but when trying to expose some of that logic directly to blueprint, I realized blueprint handles them extremely weirdly.
Not only are they represented as int32s rather than uint8s (or even uint32s), but they increment in powers of powers of 2 instead of powers of 2. Passing this into C++ then results in completely wrong flags.
As you know, normal bitmasks literally take up a single bit per flag, so they end up evaluating to powers of 2 when converted to integers.
Normal bitmask: 0, 2^0 (1), 2^1 (2), 2^2 (4), 2^3 (8), 2^4 (16), 2^5 (32), 2^6 (64)
Blueprint bitmask: 0, 2^2^0 (2), 2^2^1 (4), 2^2^2 (16), 2^2^3 (256), 2^2^4 (65536), …
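For context, this is roughly what such a flag enum looks like on the C++ side; the enum and flag names here are placeholders, but the values are plain powers of two:

```cpp
// Placeholder enum for illustration only; the names are made up.
// Each flag occupies a single bit, so the values are plain powers of two.
UENUM(BlueprintType, meta = (Bitflags))
enum class EExampleFlags : uint8
{
    None  = 0,
    FlagA = 1 << 0, // 1
    FlagB = 1 << 1, // 2
    FlagC = 1 << 2, // 4
    FlagD = 1 << 3  // 8
};
ENUM_CLASS_FLAGS(EExampleFlags);
```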
Converting this wouldn't be too hard, but it would require quite a few calls to Log2 and a dedicated blueprint function, which is something I always try to avoid when possible.
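For what it's worth, here is a rough sketch of what that manual conversion would look like, assuming an enum with plain power-of-two values like the one above (the function name is a placeholder):

```cpp
// Rough sketch only. Walks the bits of the mask Blueprint produced and rebuilds the
// ordinary power-of-two mask: if bit i is set, then i itself is the original enum
// value (i.e. i is the Log2 of that bit of the Blueprint mask).
int32 ConvertBlueprintMaskToNormalMask(int32 BlueprintMask)
{
    const uint32 Mask = static_cast<uint32>(BlueprintMask);
    int32 NormalMask = 0;
    for (uint32 BitIndex = 0; BitIndex < 32; ++BitIndex)
    {
        if (Mask & (1u << BitIndex))
        {
            NormalMask |= static_cast<int32>(BitIndex); // BitIndex is the original flag value (1, 2, 4, 8, ...)
        }
    }
    return NormalMask;
}
```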
It also seems like such a massive difference that surely there’s some proper conversion I’m missing.
Solved in post 2, but there's more information on passing them to native C++ functions in posts 4 and 5.
Ah, thank you, I was missing the ="true" part of UseEnumValuesAsMaskValuesInEditor = "true".
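For anyone else hitting this, the only change needed is on the enum declaration itself (same placeholder enum as in the first post); as far as I can tell, without the meta the editor shifts by the enum values, which is what produces the 2^2^x numbers:

```cpp
// With UseEnumValuesAsMaskValuesInEditor = "true" the bitmask editor uses the enum
// values (1, 2, 4, 8, ...) directly as mask values instead of shifting by them.
UENUM(BlueprintType, meta = (Bitflags, UseEnumValuesAsMaskValuesInEditor = "true"))
enum class EExampleFlags : uint8
{
    None  = 0,
    FlagA = 1 << 0,
    FlagB = 1 << 1,
    FlagC = 1 << 2,
    FlagD = 1 << 3
};
ENUM_CLASS_FLAGS(EExampleFlags);
```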
But there's an unfixed problem I found while manually converting those values to be 2^x: converting the result back into the enum doesn't work (→ byte → enum).
For example, in this setup:
The resulting function parameter is just Ignore State and Manual Possess, ignoring the others. Passing in the int32 and then static-casting it to the flag enum works.
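In case it's useful, the workaround looks roughly like this (class, function, and enum names are placeholders; the UFUNCTION is declared in some AActor subclass header):

```cpp
// Placeholder names. In the header of the (placeholder) AMyActor class:
//     UFUNCTION(BlueprintCallable)
//     void ApplyFlags(int32 InFlags);
//
// The int32 coming from Blueprint is simply cast back to the flag enum:
void AMyActor::ApplyFlags(int32 InFlags)
{
    const EExampleFlags Flags = static_cast<EExampleFlags>(InFlags);

    if (EnumHasAnyFlags(Flags, EExampleFlags::FlagA))
    {
        // react to FlagA being set...
    }
}
```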
Ah, so that wasn't quite what I wanted: I would still need a dedicated function, since I'm not passing around int32s or uint8s directly in the native C++ code.
But using the Bitmask + BitmaskEnum specifiers did exactly what I wanted (rough sketch at the end of this post). Now the parameter accepts bitmasks rather than the enum in blueprint, so it actually works with the native C++ function.
To be noted: the editor will complain in the logs if you don't use the fully specified path and just use the enum name. It should be "/Script/{ProjectName/ModuleName}.{EnumName}".
Though unfortunately there is no automatic conversion from BitmaskInteger to BitmaskByte, so you have to manually convert to byte.
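For reference, the parameter declaration I mean looks roughly like this; the function name, the module name "MyGame", and EExampleFlags are all placeholders:

```cpp
// Placeholder names. Bitmask + BitmaskEnum on the parameter makes Blueprint show a
// multi-select bitmask dropdown for this int32, built from the named enum's entries.
// Note the fully specified "/Script/{ProjectName/ModuleName}.{EnumName}" path.
UFUNCTION(BlueprintCallable)
void SetExampleFlags(UPARAM(meta = (Bitmask, BitmaskEnum = "/Script/MyGame.EExampleFlags")) int32 NewFlags);
```

Inside the function the int32 can then be static_cast back to the enum, same as in the earlier snippet.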
Yeah, I tried it with the long script path, but I had the singular name of the enum instead of the plural version, thought it was broken, and went back to the simpler version.
Didn't go back to testing it through the script path, though that might be the better way if it all compiles OK.