C++ Bitflags in Unreal Engine 4

An overview of the different implementation methods for bitflags that are supported by the editor UI.
Published on 10/31/2021

The Basics

Bitflags are a common way to encode a set of boolean variables inside a single integer. They're not unique to Unreal Engine or C++, but a widely used technique across all kinds of programming languages. The implementation details vary a bit, and as we will see below, there are two main ways to implement bitflag enums in Unreal C++.

Of course there are additional ways to implement and use bitflags, but we will concentrate on those that allow exposing the resulting bitmask integer variables as Unreal properties, so they can be viewed and edited (as bitmasks!) inside the UE4 editor.

For both implementations we need a reflected UEnum as foundation:

UENUM(BlueprintType, Meta = (Bitflags))
enum class EGreekLetters : uint8
{
    // Flags go here
};

The BlueprintType specifier makes the enum available for Blueprint enum properties.

Note that the Bitflags meta specifier does not change anything about the values of entries! It merely tells the UE4 editor to make this enum available for integer bitmask properties created inside Blueprint assets.

We also need an int32 property that holds the actual data, the bitmask:

UPROPERTY(EditAnywhere, Meta = (Bitmask, BitmaskEnum = "EGreekLetters"))
int32 MyBitmask;

You can also use other integer types such as uint8 or int64, but not all of them are fully usable in Blueprint graphs.
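For instance, a uint8-backed mask property would be declared the same way (MyByteBitmask is just an illustrative name):

UPROPERTY(EditAnywhere, Meta = (Bitmask, BitmaskEnum = "EGreekLetters"))
uint8 MyByteBitmask;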

Implementation #1: Natural Numbers

Declaration

This is the preferred implementation for bitflags that are used solely in UE4, and for any bitflags that are supposed to be used with Blueprints. With this implementation, enum entries keep the natural numbers that are assigned by default, e.g.

UENUM(BlueprintType, Meta = (Bitflags))
enum class EGreekLetters : uint8
{
    Alpha /* = 0 */,
    Beta  /* = 1 */,
    Gamma /* = 2 */,
    Delta /* = 3 */
};

I would recommend not assigning any explicit values here: it has no benefit and adds the risk of accidentally messing up entries.
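As a hypothetical illustration of that danger, a single copy-paste slip with explicit values can silently map two entries to the same bit:

enum class EGreekLetters : uint8
{
    Alpha = 0,
    Beta  = 1,
    Gamma = 1, // oops: duplicate of Beta, both flags now toggle the same bit
    Delta = 3
};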

Usage

These bitflags are always used to shift a literal 1, and the shifted bit is then combined into the bitmask:

// Empty bitmask
int32 MyBitmask = 0;

// Add Alpha to it:
MyBitmask |= 1 << StaticCast<int32>(EGreekLetters::Alpha);
// Value of MyBitmask is 1 (0b0001)

// Add Beta to it:
MyBitmask |= 1 << StaticCast<int32>(EGreekLetters::Beta);
// Value of MyBitmask is 3 (0b0011)

// Check if Gamma is set
bool bHasGamma = (MyBitmask & (1 << StaticCast<int32>(EGreekLetters::Gamma))) != 0;
// Value of bHasGamma is false
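
Clearing a flag is not covered above, but follows the same pattern by masking with the complement of the shifted bit:

// Remove Beta from it:
MyBitmask &= ~(1 << StaticCast<int32>(EGreekLetters::Beta));
// Value of MyBitmask is 1 (0b0001) again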

Advantages

The main advantage of this system is that you can fit a lot more possible entries into a uint8-based enum (which Blueprint-exposed UENUMs require). However, you likely won't be able to use all of them, because an int32 bitmask can only fit 31 entries (the sign bit is not usable for editor-exposed bitmasks). Also, this way of declaring entries is a bit less error-prone, because you don't need to explicitly assign any values.

Implementation #2: Powers of Two

Declaration

The alternative is to assign explicit power-of-two enum values:

UENUM(BlueprintType, Meta = (Bitflags, UseEnumValuesAsMaskValuesInEditor = "true"))
enum class EGreekLetters : uint8
{
    Alpha = 0x1,
    Beta  = 0x2,
    Gamma = 0x4,
    Delta = 0x8
};

Depending on your preferences, you can also use binary literals for the assignment:

UENUM(BlueprintType, Meta = (Bitflags, UseEnumValuesAsMaskValuesInEditor = "true"))
enum class EGreekLetters : uint8
{
    Alpha = 0b0001,
    Beta  = 0b0010,
    Gamma = 0b0100,
    Delta = 0b1000
};

Usage

These bitflags can be used as-is in bitwise operations, without any shifting:

// Empty bitmask
int32 MyBitmask = 0;

// Add Alpha to it:
MyBitmask |= StaticCast<int32>(EGreekLetters::Alpha);
// Value of MyBitmask is 1 (0b0001)

// Add Beta to it:
MyBitmask |= StaticCast<int32>(EGreekLetters::Beta);
// Value of MyBitmask is 3 (0b0011)

// Check if Gamma is set
bool bHasGamma = (MyBitmask & StaticCast<int32>(EGreekLetters::Gamma)) != 0;
// Value of bHasGamma is false
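
Likewise, removing a flag works directly with the complement of the enum value:

// Remove Beta from it:
MyBitmask &= ~StaticCast<int32>(EGreekLetters::Beta);
// Value of MyBitmask is 1 (0b0001) again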

OpenUnrealUtilities Bitmask Helpers

To make bitmask manipulation a bit easier, I implemented some utility functions as part of my OpenUnrealUtilities plugin in BitmaskUtils.h.

Taking the example from above, this is what it would look like with the utility functions:

// Empty bitmask
int32 MyBitmask = 0;

// Add Alpha to it:
BitmaskUtils::SetBit(MyBitmask, EGreekLetters::Alpha);
// Value of MyBitmask is 1 (0b0001)

// Add Beta to it:
BitmaskUtils::SetBit(MyBitmask, EGreekLetters::Beta);
// Value of MyBitmask is 3 (0b0011)

// Check if Gamma is set
bool bHasGamma = BitmaskUtils::TestBit(MyBitmask, EGreekLetters::Gamma);
// Value of bHasGamma is false

Please note that the calls only look like this if you statically declare the sequence via a trait macro:

DECLARE_ENUM_SEQUENCE(EGreekLetters, EEnumSequenceType::Pow2);

This is the recommended approach for custom enums, because it makes it impossible for the sequence type of an enum to differ between contexts. With this you get compile-time checks that ensure you're always using the correct bit operations!

If you don’t declare this trait information, you must explicitly pass template parameters for each function call, like so:

// For enums using natural number values
BitmaskUtils::SetBit<EGreekLetters, EEnumSequenceType::Natural>(MyBitmask, EGreekLetters::Alpha);
// For enums using power-of-two values
BitmaskUtils::SetBit<EGreekLetters, EEnumSequenceType::Pow2>(MyBitmask, EGreekLetters::Alpha);
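
For illustration, here is a minimal sketch of how such a trait-based default could be wired up. The names mirror the calls above, but the implementation is a hypothetical reconstruction, not the actual OpenUnrealUtilities source:

// Hypothetical sketch, not the actual plugin implementation.
enum class EEnumSequenceType { Natural, Pow2 };

// The primary template is intentionally left undefined: without a
// DECLARE_ENUM_SEQUENCE declaration for the enum, the defaulted
// template parameter below fails to compile.
template <typename EnumType>
struct TEnumSequenceTrait;

#define DECLARE_ENUM_SEQUENCE(EnumType, SequenceType)            \
    template <>                                                  \
    struct TEnumSequenceTrait<EnumType>                          \
    {                                                            \
        static constexpr EEnumSequenceType Value = SequenceType; \
    };

namespace BitmaskUtils
{
    // The sequence type defaults to the declared trait value, but can
    // still be passed explicitly for enums without a declaration.
    template <typename EnumType,
              EEnumSequenceType SequenceType = TEnumSequenceTrait<EnumType>::Value>
    void SetBit(int32& Bitmask, EnumType Flag)
    {
        if (SequenceType == EEnumSequenceType::Natural)
        {
            Bitmask |= 1 << static_cast<int32>(Flag);
        }
        else // EEnumSequenceType::Pow2
        {
            Bitmask |= static_cast<int32>(Flag);
        }
    }
}

With a setup like this, BitmaskUtils::SetBit(MyBitmask, EGreekLetters::Alpha) resolves the sequence type automatically, while the explicit calls shown above bypass the trait entirely.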