Comprehensive GameplayAbilities Analysis Series

Hey folks!

Introduction

Recently I’ve taken a deep dive into the GameplayAbility system, in part due to the pioneering efforts of @anonymous_user_843b99c6 in his incredible forum thread. However, my circumstances are a bit different, so I thought it’d be worth chronicling them for future reference in case someone finds it useful. The GameplayAbility system is extremely vast, obtuse, difficult to wrap your head around and insanely amazing. We know that Epic uses it internally on both Fortnite and Paragon, so it’s already been battle-worn and proven to work. Unfortunately, as there was virtually no information on it some 10-12 months ago, I rolled out my own custom system. Just recently, having inspected some of the GA system’s internals, I realized that I had inadvertently stumbled onto some similar design paradigms and solutions in my own system. So, since I was headed for a major refactoring anyway (I'd half-finished some new features due to deadlines), I thought to myself: what the hoo-ha, might as well take a dive off the deep end. Long story short, this means that this forum thread will hopefully see me rip out the guts of my original combat system and replace them with GA, hopefully giving you guys a better understanding of the system in the process.

Also, thanks to everyone in the Discord chat for helping me in my research.

IMPORTANT: This will not be a well-structured tutorial series. It is, for all intents and purposes, a “brain dump”. It is intended as an information archive of my research results, to help other nutjobs who, like me, are trying to unravel this beautiful engine subsystem. There’ll be useful information, but I can’t vouch for any formatting, proofreading or user-friendliness.

Prerequisites

These instructions are not an introductory C++ course. The GA system is obtuse and confusing even for the most decorated code warriors, so these posts will not be good material for your 101 course. With that being said, I will not go through how to enable the GA system, as it really boils down to a) enabling the plugin and b) adding the GameplayAbilities module to your PublicDependencyModuleNames list. I guess I did just go over how to enable the system…

Attribute Sets

When thinking about how to best approach the transition, I figured it’s probably best to port my data holders to GA first, i.e. my combat stats. Once I have those, I’ll be able to rebuild some of my combat functionality and combat abilities. Thus, my first stop was Attribute Sets. While at first they seem to be simple data holders, they have proven to be deceptively complex. In my old system, stats were simple gameplay tags tied to a float value. If a designer wanted to add a new stat, he would simply add a new Stat.Something gameplay tag and that’s it - it could be assigned to any actor owning my CombatComponent and be used as a resource. Compared to that, Attribute Sets are much more rigid. While at first that made me frown quite a bit, I realized that in reality you define your game’s stats once and are done with it; very rarely, if ever, will you be adding fancy arbitrary stats like that. Thus, it was an acceptable tradeoff for me, but keep this limitation in mind when considering this system.

Before creating the attribute sets, though, we’ll first need a UAbilitySystemComponent. Since I will be extending the functionality, I have created my own subclass of the component; let’s call it UBlaAbilitySystemComponent for the purpose of this document. Furthermore, chances are that my player and my AI characters will subclass it further, so I need to allow them to change the class of the parent’s component. To clarify - I have an ABlaCharacter whose parent is ACharacter and which branches out further into AAICharacter and APlayerCharacterBase. This means that in ABlaCharacter I have the following code:


UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = Abilities, meta = (AllowPrivateAccess = "true"))
class UBlaAbilitySystemComponent* AbilitySystem;

But, equally important, below that in the same class there’s this line:


static FName AbilitySystemName;

If we head over to the matching .cpp file there’s these two lines:


//This is outside of any function, just below my includes
FName ABlaCharacter::AbilitySystemName(TEXT("AbilitySystem"));

//In the constructor
AbilitySystem = CreateDefaultSubobject<UBlaAbilitySystemComponent>(ABlaCharacter::AbilitySystemName);

One might wonder why jump through all these hoops when I could’ve just punched in “AbilitySystem” in the CreateDefaultSubobject argument and be done with it. The reason is that this now allows me to do something fancy, namely this:


AAICharacter::AAICharacter(const FObjectInitializer& OI)
	: Super(OI.SetDefaultSubobjectClass<UAIMovementComponent>(ACharacter::CharacterMovementComponentName)
		.SetDefaultSubobjectClass<UAIAbilitySystemComponent>(ABlaCharacter::AbilitySystemName))  { ..... }

See what happened there? My AAICharacter, which subclasses ABlaCharacter, has changed the class of a component in its parent - or rather, two components (the movement component from ACharacter and the ability system component from ABlaCharacter). This allows me to subclass any parent component without adding a new one.

Well, this is a good first step - we have the UAbilitySystemComponent (ASC) set up and subclassed. Now it’s time to define some attributes. It is important to note that an ASC can hold multiple attribute sets, and there are several ways that you can split attributes between them. I personally had a UCoreAttributeSet which only defined health-based attributes and a UCombatAttributeSet which defined ye ol’ traditional RPG stats like strength and such. The reason for this peculiar split was that my game might have critters which can’t fight back, but still require a health property.

So, to start off, I’ve set up the following attributes:

Health - Current health.
MaxHealth - The “fixed” max health. Most of the max health will come from Vitality; this is just a way to set up an arbitrary base.
HealthRegenPerSecond - Self-explanatory.
Vitality - Self-explanatory.
VitalityHealthBonus - “How much health am I getting from my current Vitality amount?”
HealthRegenPerVitality - How much does Vitality affect health regeneration?

The code for these looks like this:



	//The current health of the attribute set owner
	UPROPERTY()
	FGameplayAttributeData Health;
	//The maximum health
	UPROPERTY()
	FGameplayAttributeData MaxHealth;
	//How much HP is restored per second
	UPROPERTY()
	FGameplayAttributeData HealthRegenPerSecond;
	//Vitality increases health
	UPROPERTY()
	FGameplayAttributeData Vitality;
	//How much health the owner ASC gains per point of vitality
	UPROPERTY()
	FGameplayAttributeData VitalityHealthBonus;
	//How much health regen the owner ASC gains per point of vitality
	UPROPERTY()
	FGameplayAttributeData HealthRegenPerVitality;

You’ll notice that all the attributes use FGameplayAttributeData. This is basically just a wrapper struct that holds two floats - one for the current value and one for the base value of a stat. This is used to facilitate things like temporary buffs, which modify the current value of a stat while leaving its base value intact.
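The base/current split can be modeled in a few lines of plain C++ to show the intent (an illustrative toy, NOT the engine struct):

```cpp
#include <cassert>

// Toy model of the two-float idea behind FGameplayAttributeData (illustrative
// only): BaseValue is what permanent "execute" changes modify, CurrentValue is
// what temporary buffs push around on top of it.
struct ToyAttributeData
{
	float BaseValue = 0.f;
	float CurrentValue = 0.f;

	// A permanent change rebases both values (a simplification for illustration).
	void SetBaseValue(float NewBase) { BaseValue = NewBase; CurrentValue = NewBase; }

	// A temporary buff only touches the current value; the base stays intact,
	// so removing the buff restores the original stat.
	void ApplyTemporaryBuff(float Magnitude) { CurrentValue += Magnitude; }
	void RemoveTemporaryBuff(float Magnitude) { CurrentValue -= Magnitude; }
};
```

Apply a +25 buff and the current value moves while the base does not; remove it and the stat is back where it started.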

A UAttributeSet has several useful functions, but four are the most important. Below I am pasting their declarations and comments from the AttributeSet.h header file directly:


	/**
	 *	Called just before modifying the value of an attribute. AttributeSet can make additional modifications here. Return true to continue, or false to throw out the modification.
	 *	Note this is only called during an 'execute'. E.g., a modification to the 'base value' of an attribute. It is not called during an application of a GameplayEffect, such as a 5 second +10 movement speed buff.
	 */	
	virtual bool PreGameplayEffectExecute(struct FGameplayEffectModCallbackData &Data) { return true; }
	
	
	/**
	 *	Called just before a GameplayEffect is executed to modify the base value of an attribute. No more changes can be made.
	 *	Note this is only called during an 'execute'. E.g., a modification to the 'base value' of an attribute. It is not called during an application of a GameplayEffect, such as a 5 second +10 movement speed buff.
	 */
	virtual void PostGameplayEffectExecute(const struct FGameplayEffectModCallbackData &Data) { }


	/**
	 *	Called just before any modification happens to an attribute. This is lower level than PreAttributeModify/PostAttribute modify.
	 *	There is no additional context provided here since anything can trigger this. Executed effects, duration based effects, effects being removed, immunity being applied, stacking rules changing, etc.
	 *	This function is meant to enforce things like "Health = Clamp(Health, 0, MaxHealth)" and NOT things like "trigger this extra thing if damage is applied, etc".
	 *	
	 *	NewValue is a mutable reference so you are able to clamp the newly applied value as well.
	 */
	virtual void PreAttributeChange(const FGameplayAttribute& Attribute, float& NewValue) { }

	/**
	 *	This is called just before any modification happens to an attribute's base value when an attribute aggregator exists.
	 *	This function should enforce clamping (presuming you wish to clamp the base value along with the final value in PreAttributeChange)
	 *	This function should NOT invoke gameplay related events or callbacks. Do those in PreAttributeChange() which will be called prior to the
	 *	final value of the attribute actually changing.
	 */
	virtual void PreAttributeBaseChange(const FGameplayAttribute& Attribute, float& NewValue) const { }

In short, PreAttributeChange and PreAttributeBaseChange can be called whenever an attribute changes no matter what changed it or why. PreGameplayEffectExecute and PostGameplayEffectExecute are called when attributes have been modified via a gameplay effect (more on that later, just pretend it means “a buff or debuff” for now). The simplest use case for these (the PreAttributeBaseChange in particular) is clamping. This essentially looks like this:


void UCoreAttributeSet::PreAttributeBaseChange(const FGameplayAttribute& Attribute, float& NewValue) const
{
	if (Attribute == HealthAttribute())
	{
		NewValue = FMath::Clamp(NewValue, 0.f, MaxHealth.GetCurrentValue());
	}
}


Important: The above code is incorrect. Health should be clamped in PreAttributeChange, not in PreAttributeBaseChange. The latter is called only when the base health value changes, while the former fires on every health modification (like temporary health buffs etc.)

Notice that the function takes a FGameplayAttribute& (NOT a FGameplayAttributeData) and that it’s being compared to a HealthAttribute() function. This function looks like this:


//.h
static FGameplayAttribute HealthAttribute();

//.cpp
FGameplayAttribute UCoreAttributeSet::HealthAttribute()
{
	static UProperty* Property = FindFieldChecked<UProperty>(UCoreAttributeSet::StaticClass(), GET_MEMBER_NAME_CHECKED(UCoreAttributeSet, Health));
	return FGameplayAttribute(Property);
}

This is nice and all, but you will have to write this for every single attribute that you want to compare or modify in any shape or form. Obviously this gets cumbersome very fast, and the code is 99% identical save for the names. That makes it a perfect scenario for a custom macro - or rather, two macros in my case: one for the header and one for the .cpp. The DECLARE_ macro takes an attribute and creates the header declaration for it, and the DEFINE_ macro creates the implementation. Source and sample usage are as follows:


//Macro source, I've put it in a separate AttributeMacros.h file
#define DECLARE_ATTRIBUTE_FUNCTION(PropertyName) static FGameplayAttribute PropertyName##Attribute();

//Note: no ## before :: or ( - token pasting must produce a single valid token
#define DEFINE_ATTRIBUTE_FUNCTION(PropertyName, ClassName) 																							\
FGameplayAttribute ClassName::PropertyName##Attribute()																									\
{																																						\
	static UProperty* Property = FindFieldChecked<UProperty>(ClassName::StaticClass(), GET_MEMBER_NAME_CHECKED(ClassName, PropertyName));				\
	return FGameplayAttribute(Property);																												\
}

//Usage

//.h
DECLARE_ATTRIBUTE_FUNCTION(Health);

//.cpp (Anywhere outside a function)
DEFINE_ATTRIBUTE_FUNCTION(Health, UCoreAttributeSet); //<----UCoreAttributeSet needs to be the name of your actual attribute set class

This will save you quite a bit of boilerplate for setting up all attributes.
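If you want to sanity-check the token pasting outside of Unreal, here is a self-contained toy version of the same DECLARE_/DEFINE_ pattern (plain C++, no engine types). Note that ## must always produce a single valid token, which is why it can glue PropertyName and Attribute together but must not appear before :: or a parenthesis:

```cpp
#include <cassert>
#include <string>

// Toy stand-in for the attribute-function macro pair. Instead of returning a
// FGameplayAttribute, the generated function just returns "ClassName.PropertyName".
#define TOY_DECLARE_ATTRIBUTE_FUNCTION(PropertyName) static std::string PropertyName##Attribute();

#define TOY_DEFINE_ATTRIBUTE_FUNCTION(PropertyName, ClassName)	\
std::string ClassName::PropertyName##Attribute()				\
{																\
	return #ClassName "." #PropertyName;						\
}

struct FToySet
{
	TOY_DECLARE_ATTRIBUTE_FUNCTION(Health)
};

TOY_DEFINE_ATTRIBUTE_FUNCTION(Health, FToySet)
```

Here FToySet::HealthAttribute() is generated entirely by the macros, just like HealthAttribute() in the real attribute set.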

One thing to correct from earlier in the document: clamping Health between 0 and MaxHealth is incorrect for my setup. Since I also have the VitalityHealthBonus attribute, Health should be clamped between 0 and MaxHealth + VitalityHealthBonus. This is a common scenario for me, as I have many “2-part” attributes like this. To speed up access to these I’ve created a third macro that declares a getter function such as GetMaxHealthIncludingVitalityBonus(). It looks like this:


//The macro
#define DECLARE_NAMED_COMBINED_STAT_GETTER(BaseProperty, BonusProperty, FunctionName)	\
__forceinline float FunctionName() const										\
{																				\
	return BaseProperty.GetCurrentValue() + BonusProperty.GetCurrentValue();	\
}

//Example in header file (the function is __forceinline)
DECLARE_NAMED_COMBINED_STAT_GETTER(MaxStamina, MaxStaminaPerEndurance, GetMaxStaminaIncludingEnduranceBonus);

This is of course completely optional and won’t make a difference for the core setup.

Now that we actually have some attributes, let’s tell our ASC that we have them. The ASC itself has a DefaultStartingData TArray, but that seems to be deprecated, as it depends on you providing a specific attribute together with a non-optional float curve to initialize it from. We’ll see in a bit why that approach isn’t the best idea when we get to proper ways of initializing attributes. I found that this approach works just fine:


//In BlaCharacter.h
UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Bla|Character")
TArray<TSubclassOf<class UAttributeSet>> AttributeSets;

//In ABlaCharacter::BeginPlay()
if (AbilitySystem != nullptr)
	{
		AbilitySystem->InitAbilityActorInfo(this, this);
		for (TSubclassOf<UAttributeSet>& Set : AttributeSets)
		{
			AbilitySystem->InitStats(Set, nullptr);
		}

		UAbilitySystemGlobals* ASG = IGameplayAbilitiesModule::Get().GetAbilitySystemGlobals();
		FAttributeSetInitter* ASI = ASG->GetAttributeSetInitter();
		ASI->InitAttributeSetDefaults(AbilitySystem, UBlaGameplayStatics::GetTagLeafName(AbilitySystem->ClassTag), 1, true);
	}

Disregard the last 3 lines for now. We’ll build our way up to those as we explore attribute initialization. Just remember that this is where you actually fire off the entire initialization chain eventually.

You probably noticed that some of the core attributes listed above depend on each other (both HealthRegenPerVitality and VitalityHealthBonus depend on Vitality). We’ll eventually get to setting that up, but for now let’s take a look at how attributes are populated at all.

Digging through AttributeSet.h you can find the FAttributeSetInitter struct and the following comment above it:


/**
 *	Helper struct that facilitates initializing attribute set default values from spread sheets (UCurveTable).
 *	Projects are free to initialize their attribute sets however they want. This is just one example that is 
 *	useful in some cases.
 *	
 *	Basic idea is to have a spreadsheet in this form: 
 *	
 *									1	2	3	4	5	6	7	8	9	10	11	12	13	14	15	16	17	18	19	20
 *
 *	Default.Health.MaxHealth		100	200	300	400	500	600	700	800	900	999	999	999	999	999	999	999	999	999	999	999
 *	Default.Health.HealthRegenRate	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1
 *	Default.Health.AttackRating		10	10	10	10	10	10	10	10	10	10	10	10	10	10	10	10	10	10	10	10
 *	Default.Move.MaxMoveSpeed		500	500	500	500	500	500	500	500	500	500	500	500	500	500	500	500	500	500	500	500
 *	Hero1.Health.MaxHealth			100	100	100	100	100	100	100	100	100	100	100	100	100	100	100	100	100	100	100	100
 *	Hero1.Health.HealthRegenRate	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1	1 	1	1	1	1
 *	Hero1.Health.AttackRating		10	10	10	10	10	10	10	10	10	10	10	10	10	10	10	10	10	10	10	10
 *	Hero1.Move.MaxMoveSpeed			500	500	500	500	500	500	500	500	500	500	500	500	500	500	500	500	500	500	500	500
 *	
 *	
 *	Where rows are in the form: [GroupName].[AttributeSetName].[Attribute]
 *	GroupName			- arbitrary name to identify the "group"
 *	AttributeSetName	- what UAttributeSet the attributes belong to. (Note that this is a simple partial match on the UClass name. "Health" matches "UMyGameHealthSet").
 *	Attribute			- the name of the actual attribute property (matches full name).
 *		
 *	Columns represent "Level". 
 *	
 *	FAttributeSetInitter::PreloadAttributeSetData(UCurveTable*)
 *	This transforms the CurveTable into a more efficient format to read in at run time. Should be called from UAbilitySystemGlobals for example.
 *
 *	FAttributeSetInitter::InitAttributeSetDefaults(UAbilitySystemComponent* AbilitySystemComponent, FName GroupName, int32 Level) const;
 *	This initializes the given AbilitySystemComponent's attribute sets with the specified GroupName and Level. Game code would be expected to call
 *	this when spawning a new Actor, or leveling up an actor, etc.
 *	
 *	Example Game code usage:
 *	
 *	IGameplayAbilitiesModule::Get().GetAbilitySystemGlobals()->GetAttributeSetInitter()->InitAttributeSetDefaults(MyCharacter->AbilitySystemComponent, "Hero1", MyLevel);
 *	
 *	Notes:
 *	-This lets system designers specify arbitrary values for attributes. They can be based on any formula they want.
 *	-Projects with very large level caps may wish to take a simpler "Attributes gained per level" approach.
 *	-Anything initialized in this method should not be directly modified by gameplay effects. E.g., if MaxMoveSpeed scales with level, anything else that 
 *		modifies MaxMoveSpeed should do so with a non-instant GameplayEffect.
 *	-"Default" is currently the hardcoded, fallback GroupName. If InitAttributeSetDefaults is called without a valid GroupName, we will fallback to default.
 *
 */

That is certainly a useful comment, but also quite a bit to digest. In short: set up a curve table and read it into your attributes. Simple enough… except it’s not in my case, because my game has, for all intents and purposes, infinite levels. But let’s step back for a second. The FAttributeSetInitter declares four functions:


	virtual void PreloadAttributeSetData(const TArray<UCurveTable*>& CurveData) = 0;
	virtual void InitAttributeSetDefaults(UAbilitySystemComponent* AbilitySystemComponent, FName GroupName, int32 Level, bool bInitialInit) const = 0;
	virtual void ApplyAttributeDefault(UAbilitySystemComponent* AbilitySystemComponent, FGameplayAttribute& InAttribute, FName GroupName, int32 Level) const = 0;
	virtual TArray<float> GetAttributeSetValues(UClass* AttributeSetClass, UProperty* AttributeProperty, FName GroupName) const { return TArray<float>(); } 

…but three of them are pure virtual and the fourth is a stub, so the base struct does nothing by itself. So let’s take a look at the “example game code usage” line from the instruction comment:


IGameplayAbilitiesModule::Get().GetAbilitySystemGlobals()->GetAttributeSetInitter()->InitAttributeSetDefaults(MyCharacter->AbilitySystemComponent, "Hero1", MyLevel);

Simple enough… let’s see what’s inside UAbilitySystemGlobals::GetAttributeSetInitter():


FAttributeSetInitter* UAbilitySystemGlobals::GetAttributeSetInitter() const
{
	check(GlobalAttributeSetInitter.IsValid());
	return GlobalAttributeSetInitter.Get();
}

Alright… so where and how is GlobalAttributeSetInitter created? Just above there is:



/** Initialize FAttributeSetInitter. This is virtual so projects can override what class they use */
void UAbilitySystemGlobals::AllocAttributeSetInitter()
{
	GlobalAttributeSetInitter = TSharedPtr<FAttributeSetInitter>(new FAttributeSetInitterDiscreteLevels());
}

Well, that’s a jackpot alright. We see that the default implementation uses FAttributeSetInitterDiscreteLevels, which is a subclass of the abstract FAttributeSetInitter. Unfortunately it does (and is limited to) exactly what the name implies - discrete levels. This is evident from its PreloadAttributeSetData function. I am not going to paste it, but you can find it in AttributeSet.cpp. In short, it takes all the rich curves from the UCurveTable array that it’s given and just saves out the level values that are specifically defined. This means bye-bye curve data and no infinite levels. But the solution is obvious - create a custom subclass of FAttributeSetInitter, override UAbilitySystemGlobals::AllocAttributeSetInitter() and provide the aforementioned custom initter subclass.
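To make the difference concrete, here is a toy model in plain C++ (not engine code) of what keeping the raw curve buys you: evaluating the curve yields a value for any level, not just the authored columns. I'm assuming simple linear interpolation between keys, which is roughly what a linear FRichCurve does:

```cpp
#include <cassert>
#include <iterator>
#include <map>

// Toy curve: authored keys map a level to a value; Eval interpolates linearly
// between keys and clamps outside the authored range (illustrative only).
struct ToyCurve
{
	std::map<float, float> Keys; // level -> value, like authored curve keys

	float Eval(float Time) const
	{
		if (Keys.empty()) return 0.f;                       // assumes at least one key in real use
		auto Hi = Keys.lower_bound(Time);
		if (Hi == Keys.begin()) return Hi->second;          // clamp before first key
		if (Hi == Keys.end()) return std::prev(Hi)->second; // clamp past last key
		auto Lo = std::prev(Hi);
		const float Alpha = (Time - Lo->first) / (Hi->first - Lo->first);
		return Lo->second + Alpha * (Hi->second - Lo->second);
	}
};
```

A discrete-levels approach would only be able to answer for levels 1 and 20 if those were the authored columns; evaluating the curve answers for level 10.5 or anything in between just as well.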

Of course, this means that we need to subclass UAbilitySystemGlobals too. This bit is a little trickier, as we need to tell the system to use our own UAbilitySystemGlobals subclass. This is done via the DefaultGame.ini config file, namely so:


[/Script/GameplayAbilities.AbilitySystemGlobals]
+AbilitySystemGlobalsClassName=/Script/Bla.BlaAbilitySystemGlobals

Where *Bla* is replaced by your game module name and *BlaAbilitySystemGlobals* by the name of your subclass. Now that we have that, it’s trivial to override AllocAttributeSetInitter and provide a custom FAttributeSetInitter struct. One important thing to note - the ability system globals need to be manually initialized. This is best done via a game instance, so you will have to subclass UGameInstance and override its Init() function like so:


void UBlaGameInstance::Init()
{
	Super::Init();
	UAbilitySystemGlobals& ASG = UAbilitySystemGlobals::Get();
	if (!ASG.IsAbilitySystemGlobalsInitialized())
	{
		ASG.InitGlobalData();
	}
}

It is important to keep that IsAbilitySystemGlobalsInitialized() check, since the ASG object persists across PIE runs and you only want to initialize it once.
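As for the AllocAttributeSetInitter override itself, the pattern is just a virtual factory swap. Here it is modeled in self-contained plain C++ (the names mirror the engine's for readability, but this is an illustrative sketch, not engine code):

```cpp
#include <cassert>
#include <memory>
#include <string>

// Toy initter hierarchy: the base type stands in for FAttributeSetInitter's
// default implementation, the subclass for our custom curve-eval initter.
struct FToyInitter
{
	virtual ~FToyInitter() = default;
	virtual std::string Name() const { return "DiscreteLevels"; }
};

struct FToyCurveEvalInitter : FToyInitter
{
	std::string Name() const override { return "CurveEval"; }
};

// Toy globals: the base allocates the default initter; a project subclass
// overrides the virtual alloc to swap in its own initter type.
struct FToyGlobals
{
	std::shared_ptr<FToyInitter> GlobalAttributeSetInitter;
	virtual ~FToyGlobals() = default;

	virtual void AllocAttributeSetInitter()
	{
		GlobalAttributeSetInitter = std::make_shared<FToyInitter>();
	}
};

struct FToyBlaGlobals : FToyGlobals
{
	void AllocAttributeSetInitter() override
	{
		GlobalAttributeSetInitter = std::make_shared<FToyCurveEvalInitter>();
	}
};
```

Because the alloc function is virtual, everything else in the globals object keeps working unchanged; only the initter type the system hands back is different.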

So far so good… so let’s take a look at the custom attribute initter.


struct BLA_API FAttributeSetInitterCurveEval : public FAttributeSetInitter


I named it “CurveEval” to clarify that, instead of storing each individual discrete value, it’s actually storing the raw curve data.

Before we can preload any of the data, we need to figure out how to store it. Following the attribute initter instruction comment, we’ll structure the data as follows:


		struct FPropertyCurvePair
		{
			FPropertyCurvePair(UProperty* InProperty, FRichCurve* InCurve)
				: Property(InProperty), Curve(InCurve)
			{
			}

			UProperty*	Property;
			FRichCurve*	Curve;
		};


This is the lowest building block - a single attribute (UProperty) tied to some raw curve data. This is further contained in an FAttributeDefaultCurveList, which has a TArray and a utility function:



	struct FAttributeDefaultCurveList
	{

		struct FPropertyCurvePair
		{
			FPropertyCurvePair(UProperty* InProperty, FRichCurve* InCurve)
				: Property(InProperty), Curve(InCurve)
			{
			}

			UProperty*	Property;
			FRichCurve*	Curve;
		};

		void AddPair(UProperty* InProperty, FRichCurve* InValue)
		{
			List.Add(FPropertyCurvePair(InProperty, InValue));
		}

		TArray<FPropertyCurvePair>	List;
	};

But knowing the attribute is not enough, since technically two attribute sets can have attributes with identical names. Thus, we need to map the attributes to an attribute set, like this:


	struct FAttributeSetDefaultsCurveCollection
	{
		TMap<TSubclassOf<UAttributeSet>, FAttributeDefaultCurveList> DataMap;
	};

Last but not least, all of this is then mapped to a specific “group” which, as discussed above, can be Default, Hero1, Hero2 etc. The group is simply the first part of the Hero1.Health.MaxHealth table row identifier:


TMap<FName, FAttributeSetDefaultsCurveCollection>	Defaults;

Now that all the supporting data structures are in place (they are all members of our FAttributeSetInitterCurveEval), we can finally preload some data… almost. We need one more support function. Remember, the initter instruction comment said that the set name (the middle tag in the table ID string) is a “partial match” of the attribute set class name. To check for this partial match, we write the following function:


TSubclassOf<UAttributeSet> BlaFindBestAttributeClass(TArray<TSubclassOf<UAttributeSet> >& ClassList, FString PartialName)
{
	for (auto Class : ClassList)
	{
		if (Class->GetName().Contains(PartialName))
		{
			return Class;
		}
	}

	return nullptr;
}

Notice that the function is not a member of any class - it’s written directly in the .cpp file. AttributeSet.cpp already contains an identical FindBestAttributeClass function, but since it’s a free function local to that file, it isn’t accessible from the outside and we need to duplicate it in our own code.

Now, on to the preloading. Most of the code has been copied from the DiscreteLevels initter but let’s take it step by step, starting with the PreloadAttributeSetData function:


        if (!ensure(CurveData.Num() > 0))
	{
		return;
	}

	/**
	*	Get list of AttributeSet classes loaded
	*/

	TArray<TSubclassOf<UAttributeSet> >	ClassList;
	for (TObjectIterator<UClass> ClassIt; ClassIt; ++ClassIt)
	{
		UClass* TestClass = *ClassIt;
		if (TestClass->IsChildOf(UAttributeSet::StaticClass()))
		{
			ClassList.Add(TestClass);
			/*#if !(UE_BUILD_SHIPPING || UE_BUILD_TEST)
						// This can only work right now on POD attribute sets. If we ever support FStrings or TArrays in AttributeSets
						// we will need to update this code to not use memcpy etc.
						for (TFieldIterator<UProperty> PropIt(TestClass, EFieldIteratorFlags::IncludeSuper); PropIt; ++PropIt)
						{
							if (!PropIt->HasAllPropertyFlags(CPF_IsPlainOldData))
							{
								ABILITY_LOG(Error, TEXT("FAttributeSetInitterDiscreteLevels::PreloadAttributeSetData Unable to Handle AttributeClass %s because it has a non POD property: %s"),
									*TestClass->GetName(), *PropIt->GetName());
								return;
							}
						}
			#endif*/
		}
	}

The first ensure is there just to prevent you from passing in empty data. After that, we gather ALL UAttributeSet subclasses so that we can check against all their attributes. Note the part that is commented out: it effectively checks whether a property is something other than plain-old-data and just drops everything if it runs into one. It seems that the FGameplayAttributeData approach is newer and that in the past attribute sets only held raw floats. This means that the DiscreteLevels initter code WILL NOT WORK if you are using anything but float attributes (and you should be, since FGameplayAttributeData is the more robust system), so you’ll have to subclass an initter no matter what.

Once all the attribute set classes are gathered, let’s loop through the curve table array… (I’ve added comments in the code)



        //Iterate over the table array...
        for (const UCurveTable* CurTable : CurveData)
	{
                //Iterate over the individual rows in a table...
		for (auto It = CurTable->RowMap.CreateConstIterator(); It; ++It)
		{
                        //The entire row name, i.e. Class.Player.MaxHealth
			FString RowName = It.Key().ToString();
			FString ClassName;
			FString SetName;
			FString AttributeName;
			FString Temp;

                        //Split the RowName into ClassName (Class) and the put the rest in Temp (Player.MaxHealth)
			RowName.Split(TEXT("."), &ClassName, &Temp);
                        //Split the remainder into the SetName (Player) and the AttributeName (MaxHealth)
			Temp.Split(TEXT("."), &SetName, &AttributeName);

                        //If some of these ended up unpopulated just disregard this row...
			if (!ensure(!ClassName.IsEmpty() && !SetName.IsEmpty() && !AttributeName.IsEmpty()))
			{
				ABILITY_LOG(Verbose, TEXT("FAttributeSetInitterDiscreteLevels::PreloadAttributeSetData Unable to parse row %s in %s"), *RowName, *CurTable->GetName());
				continue;
			}

			// Find the AttributeSet
			TSubclassOf<UAttributeSet> Set = BlaFindBestAttributeClass(ClassList, SetName);
			if (!Set)
			{
				// This is ok, we may have rows in here that don't correspond directly to attributes
				ABILITY_LOG(Verbose, TEXT("FAttributeSetInitterDiscreteLevels::PreloadAttributeSetData Unable to match AttributeSet from %s (row: %s)"), *SetName, *RowName);
				continue;
			}

			// Find the UProperty
			UProperty* Property = FindField<UProperty>(*Set, *AttributeName);
                        //The IsSupportedProperty() just does: return (Property && (Cast<UNumericProperty>(Property) || FGameplayAttribute::IsGameplayAttributeDataProperty(Property)));
                        //meaning "is this a number or a FGameplayAttribute?"
			if (!IsSupportedProperty(Property))
			{
				ABILITY_LOG(Verbose, TEXT("FAttributeSetInitterDiscreteLevels::PreloadAttributeSetData Unable to match Attribute from %s (row: %s)"), *AttributeName, *RowName);
				continue;
			}

			FRichCurve* Curve = It.Value();
			FName ClassFName = FName(*ClassName);
                        //Get the rich curve collection corresponding to our ClassName (or create it)
			FAttributeSetDefaultsCurveCollection& DefaultCollection = Defaults.FindOrAdd(ClassFName);
			//Find the attribute list matching the current  UAttributeSet
                        FAttributeDefaultCurveList* DefaultDataList = DefaultCollection.DataMap.Find(Set);
			if (DefaultDataList == nullptr)
			{
                                 //If there is no list matching this attribute set... create it.
				ABILITY_LOG(Verbose, TEXT("Initializing new default set for %s. PropertySize: %d.. DefaultSize: %d"), *Set->GetName(), Set->GetPropertiesSize(), UAttributeSet::StaticClass()->GetPropertiesSize());

				DefaultDataList = &DefaultCollection.DataMap.Add(Set);
			}

			// Import curve value into default data
                        //Just add the current property  together with its matching curve to the attribute list.
			check(DefaultDataList);
			DefaultDataList->AddPair(Property, Curve);
		}
	}

Well, that wasn’t so bad - just some nested iteration. The InitAttributeSetDefaults function isn’t that complex either. It exists to initialize all the attributes in a set. Remember, this is the function that we call in our character code, once we supply our ASC with all the attribute sets that it’ll hold. Again, the comments are in the code:


	
          //The profiler counter is commented out here, even though it exists in the Discrete Levels version. It seems that, despite these stats existing in AbilitySystemStats.h, 
          //they're not exported out of the module so they can't be used in your game module. You'll have to make your own stats if you want to profile this. I haven't gotten around to doing that yet.
         //SCOPE_CYCLE_COUNTER(STAT_InitAttributeSetDefaults);
	check(AbilitySystemComponent != nullptr);

        //This whole block looks up whether the provided group exists in the preloaded data. If it doesn't, it checks for the Default group. If that isn't there either, the whole operation is stopped.
	const FAttributeSetDefaultsCurveCollection* Collection = Defaults.Find(GroupName);
	if (!Collection)
	{
		ABILITY_LOG(Warning, TEXT("Unable to find DefaultAttributeSet Group %s. Failing back to Defaults"), *GroupName.ToString());
		Collection = Defaults.Find(FName(TEXT("Default")));
		if (!Collection)
		{
			ABILITY_LOG(Error, TEXT("FAttributeSetInitterDiscreteLevels::InitAttributeSetDefaults Default DefaultAttributeSet not found! Skipping Initialization"));
			return;
		}
	}

        //Iterate over all the spawned attribute sets of the provided ASC
	for (const UAttributeSet* Set : AbilitySystemComponent->SpawnedAttributes)
	{
                //Check our preloaded data to see if we have any curves for the given attribute set...
		const FAttributeDefaultCurveList* DefaultDataList = Collection->DataMap.Find(Set->GetClass());
		if (DefaultDataList)
		{
			ABILITY_LOG(Log, TEXT("Initializing Set %s"), *Set->GetName());
                        //We found data for the given attribute set. Iterate over it and populate the data
			for (auto& DataPair : DefaultDataList->List)
			{
				check(DataPair.Property);

				if (Set->ShouldInitProperty(bInitialInit, DataPair.Property))
				{
					FGameplayAttribute AttributeToModify(DataPair.Property);
					AbilitySystemComponent->SetNumericAttributeBase(AttributeToModify, DataPair.Curve->Eval(Level));
				}
			}
		}
	}

	AbilitySystemComponent->ForceReplication();

The interesting bit here is the ShouldInitProperty() function. The default implementation in AttributeSet.h just returns true, but we want to extend it a bit. Namely, any of my attributes that depend on other attributes (*VitalityHealthBonus* etc.) can’t be initialized from the character level. Even more so since it can happen that *VitalityHealthBonus* is assigned before Vitality. Thus, for the UCoreAttributeSet, the ShouldInitProperty() function looks like this:


bool UCoreAttributeSet::ShouldInitProperty(bool FirstInit, UProperty* PropertyToInit) const
{
	if (FirstInit)
	{
		return PropertyToInit != VitalityHealthBonusAttribute().GetUProperty() &&
			PropertyToInit != HealthRegenPerVitalityAttribute().GetUProperty();
	}

	return true;
}

Finally, our initter has one final function of note: ApplyAttributeDefault. This is identical to the previous function, except that it takes an FGameplayAttribute& parameter, i.e. it’s only for a single attribute. So, instead of the ShouldInitProperty() check near the end of the function, there’s a simple if (DataPair.Property == InAttribute.GetUProperty()) check.

Phew… so far so good. We have our initter so now we can finally initialize our attributes… but… initialize them with what? Creating the CSV is simple enough. I’ve attached some sample data (CT_GlobalAttributes.zip). But how do we tell the system to use it?
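For reference, the row-naming convention the initter expects (as parsed in PreloadAttributeSetData) is GroupName.AttributeSetName.AttributeName, with one column per discrete level. A minimal illustrative table might look like this (the group, set and attribute names here are placeholders, not the contents of the attached file):

```csv
Name,1,2,3
Default.CoreAttributeSet.Health,100,120,140
Default.CoreAttributeSet.Vitality,10,12,14
Hero1.CoreAttributeSet.Health,150,180,210
```

Rows whose group doesn't match fall back to the "Default" group, as seen in the warning branch of InitAttributeSetDefaults above.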

If we dig around in the Ability System Globals, we’ll see the following:


void UAbilitySystemGlobals::InitAttributeDefaults()
{
 	bool bLoadedAnyDefaults = false;
 
	// Handle deprecated, single global table name
	if (GlobalAttributeSetDefaultsTableName.IsValid())
	{
		UCurveTable* AttribTable = Cast<UCurveTable>(GlobalAttributeSetDefaultsTableName.TryLoad());
		if (AttribTable)
		{
			GlobalAttributeDefaultsTables.Add(AttribTable);
			bLoadedAnyDefaults = true;
		}
	}

	// Handle array of global curve tables for attribute defaults
 	for (const FStringAssetReference& AttribDefaultTableName : GlobalAttributeSetDefaultsTableNames)
 	{
		if (AttribDefaultTableName.IsValid())
		{
			UCurveTable* AttribTable = Cast<UCurveTable>(AttribDefaultTableName.TryLoad());
			if (AttribTable)
			{
				GlobalAttributeDefaultsTables.Add(AttribTable);
				bLoadedAnyDefaults = true;
			}
		}
 	}
	
	if (bLoadedAnyDefaults)
	{
		// Subscribe for reimports if in the editor
#if WITH_EDITOR
		if (GIsEditor && !RegisteredReimportCallback)
		{
			GEditor->OnObjectReimported().AddUObject(this, &UAbilitySystemGlobals::OnTableReimported);
			RegisteredReimportCallback = true;
		}
#endif


		ReloadAttributeDefaults();
	}
}

void UAbilitySystemGlobals::ReloadAttributeDefaults()
{
	AllocAttributeSetInitter();
	GlobalAttributeSetInitter->PreloadAttributeSetData(GlobalAttributeDefaultsTables);
}

Great! That’s where our initter’s PreloadAttributeSetData is called. It gets handed a curve table array called GlobalAttributeDefaultsTables, which is populated from a TArray of asset references called GlobalAttributeSetDefaultsTableNames. That one’s another config variable and is set in a similar fashion as the class above, in the DefaultGame.ini file:


GlobalAttributeSetDefaultsTableNames=/Game/Path/To/Your/Imported/Curve/Table/CT_GlobalAttributes.CT_GlobalAttributes

If you want multiple tables (since it’s an array property) you’d do:


GlobalAttributeSetDefaultsTableNames=/Game/Path/To/Your/Imported/Curve/Table/CT_GlobalAttributes.CT_GlobalAttributes
+GlobalAttributeSetDefaultsTableNames=/Game/Path/To/Your/Imported/Curve/Table/OtherTable.OtherTable


Important note: It’s NOT CurveTable’/Game/Path/To/Your/Imported/Curve/Table/CT_GlobalAttributes.CT_GlobalAttributes’; all of the fancy notation normally used in references is omitted and just the path is used.

Finally, with ALL THAT behind us, it’s time to address attributes that depend on other attributes. Thankfully, this ended up being quite simple. As seen earlier, our attribute initter has a function that takes an attribute and a level, and assigns that attribute the curve value for the given level. This is great news, as it means that whenever Vitality changes, I can “level up” my VitalityHealthBonus using the new Vitality value as the level. This is exactly what I do:


void UCoreAttributeSet::PreAttributeChange(const FGameplayAttribute& Attribute, float& NewValue)
{
	UBlaAbilitySystemComponent* Source = Cast<UBlaAbilitySystemComponent>(GetOwningAbilitySystemComponent());
	if (Source == nullptr) //Guard the cast - Source is dereferenced below
	{
		return;
	}

	if (VitalityAttribute() == Attribute)
	{
		float HealthPercent = UBlaMath::Clamp01(Health.GetBaseValue() / GetMaxHealthIncludingVitalityBonus());
		FName ClassFName = Source->ClassTag.IsValid() ? UBlaGameplayStatics::GetTagLeafName(Source->ClassTag) : FName(TEXT("Default"));

		FGameplayAttribute VHBAttribute = VitalityHealthBonusAttribute();
		FGameplayAttribute VitalityHealthRegenAttribute = HealthRegenPerVitalityAttribute();

		FAttributeSetInitter* ASI = IGameplayAbilitiesModule::Get().GetAbilitySystemGlobals()->GetAttributeSetInitter();

		ASI->ApplyAttributeDefault(Source, VHBAttribute, ClassFName, NewValue);
		ASI->ApplyAttributeDefault(Source, VitalityHealthRegenAttribute, ClassFName, NewValue);
		Health.SetBaseValue(GetMaxHealthIncludingVitalityBonus() * HealthPercent);
	}
}

There is only one new thing here and that is the FName ClassFName = Source->ClassTag.IsValid() ? UBlaGameplayStatics::GetTagLeafName(Source->ClassTag) : FName(TEXT("Default")); line. Namely, instead of manually typing in those “Hero1”, “Hero2” etc. class identifiers, I keep them as an FGameplayTag ClassTag in my custom ASC subclass. Then I just get the leaf portion of the tag (e.g. Hero1 out of Class.Hero1) via a custom gameplay statics function:



FName UBlaGameplayStatics::GetTagLeafName(const FGameplayTag& Tag)
{
	FString TagNameAsString = Tag.ToString();

	FString Left;
	FString Right;

	if (TagNameAsString.Split(FString(TEXT(".")), &Left, &Right, ESearchCase::IgnoreCase, ESearchDir::FromEnd))
	{
		return FName(*Right);
	}
	else
	{
		return Tag.GetTagName();
	}
}

Almost done… the dependent attributes are all properly set by our Vitality attribute… But how to store that attribute? To be more precise - if I put a point into Vitality in my game, that means the data in the curve table is no longer correct. I need a way to override it. This is a player-only thing, as my enemies will always get their data out of tables. As you might have noticed by now, I don’t have “player level”, but rather just abilities I put points into.

The first step there is to define which attributes should be saved. For that I have a small struct, like so:


USTRUCT(BlueprintType)
struct BLA_API FSavedAttribute
{
	GENERATED_BODY()
	
public:
	UPROPERTY(EditAnywhere, BlueprintReadOnly, Category = "Bla")
	FGameplayAttribute Attribute;
	UPROPERTY(BlueprintReadOnly, Category = "Bla")
	float SavedValue;
};

The FGameplayAttribute struct gives you an attribute picker in the details panel, which makes this approach very convenient. My APlayerCharacterBase class now simply has a TArray of these (the AttributesToSave member below) in which I specify which attributes should be saved. Furthermore, my character has two more functions for reading and writing the values of those attributes. I won’t go into how to set up the whole save game side of things as this is already way too long, but this is the attribute save / load part:


//Called when saving
bool APlayerCharacterBase::GetSavedAttributesCurrentValues(TArray<FSavedAttribute>& OutAttributes)
{
	if (AbilitySystem == nullptr)
	{
		return false;
	}

	for (FSavedAttribute& SA : AttributesToSave)
	{
		if (AbilitySystem->HasAttributeSetForAttribute(SA.Attribute))
		{
			SA.SavedValue = AbilitySystem->GetNumericAttributeBase(SA.Attribute);
		}
	}
	
	OutAttributes = AttributesToSave;
	return OutAttributes.Num() > 0;
}

//Called when reading loaded data
void APlayerCharacterBase::PopulateSavedAttributes(const TArray<FSavedAttribute>& Attributes)
{
	if (AbilitySystem == nullptr)
	{
		return;
	}

	for (const FSavedAttribute& Attr : Attributes)
	{
		AbilitySystem->SetNumericAttributeBase(Attr.Attribute, Attr.SavedValue);
	}
}

The saving part is pretty straightforward… we query our ASC to check if it even has the attribute set for a given attribute, and if it does, just fetch the current value.

The loading part is also quite simple. Just iterate over the given attributes and set them in the ASC. This will call our PreAttributeBaseChange and PreAttributeChange and make sure all the attributes are “leveled up”.

Conclusion

As I said, attribute sets can become deceptively difficult. However, through them we’ve explored the FAttributeSetInitter and half a dozen other things. There is some more work to be done on this. Namely, I need to make sure that the constructor sets Health, Mana and Stamina to -1.f initially, so that I can initialize them properly the first time I set MaxHealth / MaxMana / MaxStamina. But this should be more than enough for now. After all this, you should be able to create your own attribute sets and your own attribute set initters to initialize them with any data you might need, as well as apply all the necessary math to your attributes within the attribute sets themselves. Hope it was useful!

I’ll update this thread once I finish more coherent pieces of the system. If you have any questions let me know.


This is a great addition. Thanks for taking the time to share!

Good stuff. Great stuff, even, especially since I didn’t bother with the entirety of the AttributeSet-initializing part of the system myself, so it’s nice that you cover some of what I left untouched. I may actually link this thread at the end of my thread if you don’t mind; the things you discuss here seem more specific and advanced than mine, which is meant more as an entry-level overview to get people started with and talking about the system.

Although, I do want to ask: what are the concrete benefits of FGameplayAttributeData over a simple float? Float attributes support base values and temporary modifiers either way without this struct. Don’t get me wrong, this is an easy thing to change one way or the other, and I likely will do it if only because it looks tidier and the float approach may well be the “old” way to do it (I mean, I wouldn’t know, but upgrading from a float to a struct makes sense to me). But are there any pitfalls or limitations with floats that are not present with the structs, or are there helpful functions available which you can only use with the FGameplayAttributeData struct? This didn’t quite become clear to me, I think.

Hey guys, thanks for the support, glad you find the info useful. Before I answer the questions, let me just correct a mistake I made in the OP - you should NOT call ShutdownModule() in your GameInstance’s ShutDown() function, it creates nasty side-effects (Thank you Discord folks for pointing this out). However, the UAbilitySystemGlobals object still persists across PIE runs, so you must ensure that you only call the init function on it once. Thankfully, there’s a function for that which I’ve missed earlier. The new code to put in your GameInstance looks like this:


void UBlaGameInstance::Init()
{
	Super::Init();
	UAbilitySystemGlobals& ASG = UAbilitySystemGlobals::Get();
	if (!ASG.IsAbilitySystemGlobalsInitialized()) //This is the function I've missed
	{
		ASG.InitGlobalData();
	}
}

I’ve updated the code in the OP to address this. As for your question @KZJ - a ton of the code written regarding attributes mentions that the FGameplayAttributeData stuff is newer, and comments say that Epic might deprecate float attributes entirely in favor of it. It seems to be one of those “big changes” that they keep saying will be made to the system in the future. :smiley: I’d personally just stick with FGameplayAttributeData for all future work, just in case I wake up one day to see a deprecation warning on the floats.

As for whether there are any pitfalls… I wouldn’t really know. I believe the original reason for this was due to the fact that with a float, you can only store either the base or the current value, so if you have a bunch of modifiers on top of an attribute, getting the base value underneath all that might be a) costly (have to reverse all the modifiers) or b) impossible (?). Having access to the base at all times is useful.

Also - yes, of course you can link this in the original thread. I already posted it there too. :smiley:

Ah I see, so a plain old “it’s just the newer, tidier implementation”. Gotcha. You actually can get both regular value and base value of a float attribute relatively easily if the helper functions you can call within blueprints are any indication, but I actually don’t know if that is particularly expensive. I’d have to check.

Either way, as I said, it shouldn’t really make that big of a difference one way or the other, switching from float to structs should be no big rewrite. UE4’s macros thankfully have the courtesy to work pretty much identically on single variables and structs alike for most purposes such as replication and general UPROPERTY tags.

I guess I might change/mention that in my guide as well then.

Well honestly, regardless of the real performance overhead, having to “undo” a bunch of GEs to get the base value is always more error-prone than to just have it lying around there at O(1) access time and at the cost of a few bits more. :stuck_out_tongue: Tomorrow I’ll probably be expanding this thread with a few new findings regarding “non-hardcoded” attributes that I have to deal with (and use it as a nice segue into GEs).

So I am trying something new, I have a few sets of Attributes that behave in similar ways. For our purposes let’s talk about Health and Energy. I would like to reuse the common functionality, and I tried to do so by inheriting from a common parent. When I did this via inheritance, the editor could only see the values in the parent class, and not the child classes. And since, as far as I can tell, you cannot use multiple instances of an attribute set in a class, I don’t know how to do it. I have instead built one class and then copied all the code over and over, and replace the names.

Is there another way to do this?

Regards,

giffen

Hi all,
when I use config to set attribute default values, this engine check fails:


#if !(UE_BUILD_SHIPPING || UE_BUILD_TEST)
	// This can only work right now on POD attribute sets. If we ever support FStrings or TArrays in AttributeSets
	// we will need to update this code to not use memcpy etc.
	for (TFieldIterator<UProperty> PropIt(TestClass, EFieldIteratorFlags::IncludeSuper); PropIt; ++PropIt)
	{
		if (!PropIt->HasAllPropertyFlags(CPF_IsPlainOldData))
		{
			ABILITY_LOG(Error, TEXT("FAttributeSetInitterDiscreteLevels::PreloadAttributeSetData Unable to Handle AttributeClass %s because it has a non POD property: %s"),
				*TestClass->GetName(), *PropIt->GetName());
			return;
		}
	}
#endif

But I checked my curve table struct and there is no error, and if I delete the check (the whole #if !(UE_BUILD_SHIPPING || UE_BUILD_TEST) block), the build succeeds. So what is the issue?

Hey,

The issue is that the default FAttributeSetInitter relies on your attribute sets to use floats instead of FAttributes. You will have to implement your own initter and allocate that instead of the default one in your AbilitySystemGlobals.

Not that I know of, but I don’t think these were ever meant to be inherited. What is your common stuff? If it’s only functions, then you could move those into some function library.

I also wanted to take this chance to post a small(ish) update to this thread. By now everyone interested in GA has seen the ActionRPG sample and since my own needs have evolved over time I’ve adopted a similar system for applying entire sets of GameplayEffects via a TMap (just how it’s done in the ActionRPG sample). However I’ve run into a snag and the solution to that snag involves some digging around so here’s what I found. This assumes that you’ve dug through the ActionRPG sample as I’ll be referencing code from there.

The problem: Even though my Gameplay Effects have a gameplay cue defined, and it works great when I hit enemies with said gameplay effects… it doesn’t help me if I hit something that doesn’t have a UAbilitySystemComponent, e.g. a wall, a rock or a destructible mesh.

The solution (short version): If something is hit that is not an enemy with an ASC (Ability System Component), play the hit cue manually on the OWNING ASC instead of the target.

The solution (long version):

The problem essentially stems from the fact that when applying a GE to a target, it will just short out and fail if that target doesn’t have an ASC on it. The first step of the solution is to cache away those actors. I did that in my gameplay ability, where I (and the ActionRPG sample) build up the target data:


if (Container.TargetType.Get() != nullptr)
{
    TArray<FHitResult> HitResults;
    TArray<AActor*> TargetActors;
    const UWaTargetType* TargetTypeCDO = Container.TargetType.GetDefaultObject();
    TargetTypeCDO->GetTargets(OwningWaCharacter, EventData, HitResults, TargetActors);
    ReturnSpec.AddTargets(HitResults, TargetActors);

    //This is new
    for (const FHitResult& Result : HitResults)
    {
        if (UAbilitySystemGlobals::GetAbilitySystemComponentFromActor(Result.GetActor()) == nullptr)
        {
            NonAbilityTargets.Emplace(FNonAbilityTarget(CueTags, Result));
        }
    }

    for (AActor* Act : TargetActors)
    {
        if (UAbilitySystemGlobals::GetAbilitySystemComponentFromActor(Act) == nullptr)
        {
            NonAbilityTargets.Emplace(FNonAbilityTarget(CueTags, Act));
        }
    }
    //New ends here
}

Basically I am taking any actors I have, checking if they have an ASC and, if not, storing them in my own NonAbilityTargets array. That looks like this:


//A struct for temporary holding of actors (and transforms) of actors that we hit
//that don't have an ASC. Used for environment impact GameplayCues.
struct FNonAbilityTarget
{
    FGameplayTagContainer CueContainer;
    TWeakObjectPtr<AActor> TargetActor;
    FHitResult TargetHitResult;
    bool bHasHitResult;

public:
    FNonAbilityTarget()
        : CueContainer(FGameplayTagContainer())
        , TargetActor(nullptr)
        , TargetHitResult(FHitResult(ENoInit::NoInit))
        , bHasHitResult(false)
    {
    }

    FNonAbilityTarget(const FGameplayTagContainer& InCueTags, const FHitResult& InResult)
        : CueContainer(InCueTags)
        , TargetActor(TWeakObjectPtr<AActor>(InResult.GetActor()))
        , TargetHitResult(InResult)
        , bHasHitResult(true)
    {
    }

    FNonAbilityTarget(const FGameplayTagContainer& InCueTags, AActor* InActor)
        : CueContainer(InCueTags)
        , TargetActor(TWeakObjectPtr<AActor>(InActor))
        , TargetHitResult(FHitResult(ENoInit::NoInit))
        , bHasHitResult(false)
    {
    }
};

//Note the inline allocator. This will put 1 element on the stack and the rest (if any) on the heap. Since you'll be hitting a single target 90% of the time this is a big performance win
TArray<FNonAbilityTarget, TInlineAllocator<1>> NonAbilityTargets;

Last but not least, where the gameplay effects actually get applied (the ApplyEffectContainerSpec function in the ActionRPG sample), I have this additional code:


check(CurrentActorInfo);
for (const FNonAbilityTarget& NAT : NonAbilityTargets)
{
    FGameplayCueParameters GCParams;
    UMyAbilitySystemGlobals& WASG = static_cast<UMyAbilitySystemGlobals&>(UMyAbilitySystemGlobals::Get());

    if (NAT.bHasHitResult)
    {
        WASG.InitGameplayCueParameters_HitResult(GCParams, this, NAT.TargetHitResult);
    }
    else
    {
        WASG.InitGameplayCueParameters_Actor(GCParams, this, NAT.TargetActor.Get());
    }

    for (auto It = NAT.CueContainer.CreateConstIterator(); It; ++It)
    {
        const FGameplayTag& Tag = *It;
        GCParams.OriginalTag = Tag;
        CurrentActorInfo->AbilitySystemComponent->ExecuteGameplayCue(Tag, GCParams);
    }
}

This basically just creates a FGameplayCueParameters struct, populates all the required info and fires off ExecuteGameplayCue to the owner ASC, since the target doesn’t have one. Now the magic happens in the population of the FGameplayCueParameters struct. If we look at the DEFAULT UAbilitySystemGlobals we find this:


    /** Initialize GameplayCue Parameters */
    virtual void InitGameplayCueParameters(FGameplayCueParameters& CueParameters, const FGameplayEffectSpecForRPC &Spec);
    virtual void InitGameplayCueParameters_GESpec(FGameplayCueParameters& CueParameters, const FGameplayEffectSpec &Spec);
    virtual void InitGameplayCueParameters(FGameplayCueParameters& CueParameters, const FGameplayEffectContextHandle& EffectContext);

So this is a design paradigm already present in the system. Their own default implementation just takes variables from the function parameters and puts them into the outgoing CueParameters struct, like so:


void UAbilitySystemGlobals::InitGameplayCueParameters(FGameplayCueParameters& CueParameters, const FGameplayEffectContextHandle& EffectContext)
{
    if (EffectContext.IsValid())
    {
        // Copy Context over wholesale. Projects may want to override this and not copy over all data
        CueParameters.EffectContext = EffectContext;
    }
}

Having seen this it was a simple matter of creating my own variations that extract data from actors, hit results etc.


virtual void InitGameplayCueParameters_Transform(FGameplayCueParameters& CueParameters, UGameplayAbility* Ability,  const FTransform& DestinationTransform);
virtual void InitGameplayCueParameters_HitResult(FGameplayCueParameters& CueParameters, UGameplayAbility* Ability, const FHitResult& HitResult);
virtual void InitGameplayCueParameters_Actor(FGameplayCueParameters& CueParameters, UGameplayAbility* Ability, const AActor* InTargetActor);


That’s all there is to it… almost. Some special attention needs to be placed on the HitResult and Actor versions of said functions (Relevant comments in the code):


void UMyAbilitySystemGlobals::InitGameplayCueParameters_HitResult(FGameplayCueParameters& CueParameters,
    UGameplayAbility* Ability, const FHitResult& HitResult)
{
    if (Ability == nullptr)
    {
        return;
    }

    FGameplayAbilityActorInfo CurrentActorInfo = Ability->GetActorInfo();
    check(CurrentActorInfo.AbilitySystemComponent.IsValid());

    CueParameters.AbilityLevel = Ability->GetAbilityLevel();
    CueParameters.EffectCauser = CurrentActorInfo.AvatarActor;
    CueParameters.EffectContext = CurrentActorInfo.AbilitySystemComponent->MakeEffectContext();
    CueParameters.Instigator = CurrentActorInfo.OwnerActor;
    CueParameters.SourceObject = Ability;

    //My gameplay cues, namely hit impacts, depend on the location and hit normal. In this case, we just extract this stuff from the hit result, as seen below.
    CueParameters.Location = HitResult.Location;
    CueParameters.Normal = HitResult.ImpactNormal;
    CueParameters.TargetAttachComponent = HitResult.GetComponent();
    CueParameters.PhysicalMaterial = HitResult.PhysMaterial;

    //Important: Even though this does not come from a GameplayEffect, we create a context a couple of lines above, and here we add the hit result
    //to the context, mainly because gameplay cue notifies look for the hit result inside the effect context to know where they should spawn particles etc.
    CueParameters.EffectContext.AddHitResult(HitResult);
}

void UMyAbilitySystemGlobals::InitGameplayCueParameters_Actor(FGameplayCueParameters& CueParameters, 
    UGameplayAbility* Ability, const AActor* InTargetActor)
{
    if (Ability == nullptr)
    {
        return;
    }

    FGameplayAbilityActorInfo CurrentActorInfo = Ability->GetActorInfo();
    check(CurrentActorInfo.AbilitySystemComponent.IsValid());

    CueParameters.AbilityLevel = Ability->GetAbilityLevel();
    CueParameters.EffectCauser = CurrentActorInfo.AvatarActor;
    CueParameters.EffectContext = CurrentActorInfo.AbilitySystemComponent->MakeEffectContext();
    CueParameters.Instigator = CurrentActorInfo.OwnerActor;
    CueParameters.SourceObject = Ability;

     //Since we don't have a hit result in the actor version, we do our best estimates by...

    //...calculating the location and direction with simple center-to-center math...
    CueParameters.Location = InTargetActor->GetActorLocation();
    CueParameters.Normal =
        (CurrentActorInfo.AvatarActor->GetActorLocation() - CueParameters.Location).GetSafeNormal();

    //...use the target's root as an attachment point
    CueParameters.TargetAttachComponent = InTargetActor->GetRootComponent();

    //...and if the root is an actual primitive (it will be in most cases unless you FUBAR'd something)...
    if (UPrimitiveComponent* TargetPrimitive = Cast<UPrimitiveComponent>(CueParameters.TargetAttachComponent))
    {
        //...just get the first material off of it...
        UMaterialInterface* MInt = (TargetPrimitive->GetNumMaterials() > 0) ?
            TargetPrimitive->GetMaterial(0) : nullptr;

        //...and get its physical material...
        if (MInt != nullptr)
        {
            CueParameters.PhysicalMaterial = TWeakObjectPtr<UPhysicalMaterial>(MInt->GetPhysicalMaterial());
        }
    }

    //...unless there are no materials or physical materials defined, in which case just get the engine default.
    if (!CueParameters.PhysicalMaterial.IsValid())
    {
        CueParameters.PhysicalMaterial = GEngine->DefaultPhysMaterial;
    }
}

Hope this helps someone, as it is one area where I feel you kinda have to fight the GA system to get proper cues on “plain old objects”.

Hi, if you could please help me I have been having a truly awful time trying to implement this attributes system to use the defaults that I want it to, I’ve spent days on this tearing my hair out trying different tutorials and at this point I can’t take it anymore.

I am not trying to do anything insane, I literally just want to initialize a base set of values for my attributes. Not one that varies by level or curve or any of that. I just want to set MoveSpeed to be 1, TimeScale to be 1, etc for ALL characters. I already set up a GameplayEffect to set specific values like Max Health based on values on my character blueprints. I tried to use a DataTable before but that wasn’t working so I figured I’d do it the way they want and use a CurveTable, but that also isn’t working.

The problem is that my game has a very very large amount of attributes, and I don’t want to set all of them on a GameplayEffect because it could load a hundred attributes for every monster, have to replicate those, etc. and I’m not super familiar with how costly all of that is. I figured that it would be infinitely cheaper to have them just use a basic defaults set that would all get done as a batch initially and not have to be replicated/other issues.

What’s happened:
I’ve followed your tutorial as well as I can, though I am not changing the curve type for the table. I’ve extended the GameInstance, the AbilitySystemGlobals, the FAttributeSetInitter, all of it. I modified virtually no functionality from the FAttributeSetInitterDiscreteLevels, though I still set it up so that I could change it because I read what you wrote that their version is deprecated and only works with the old float values system.

I’m stuck right now. Two issues.
1 - When you said “The issue is that the default FAttributeSetInitter relies on your attribute sets to use floats instead of FAttributes. You will have to implement your own initter and allocate that instead of the default one in your AbilitySystemGlobals.” ---- I have implemented my own initter and allocated it. But I don’t know what code I should change to fix this problem? Do I just remove the POD check, or will that cause an issue down the line? I am not advanced with C++ and really can’t tell just from looking at it why that check is even there, so I am not sure what it affects.
If you could please just let me know specifically how I could get around that problem, or if it is even something I need to worry about, I would greatly appreciate it.

2 - I am currently, after following through with all of the changes to .ini files and extensions of classes and everything, able to compile in VS. However, when I click to compile in UE4, it gives me an error. I have attached a picture of this error, but below is the longest part:

MLSTAbilitySystemGlobals.cpp.obj : error LNK2001: unresolved external symbol “struct FThreadSafeStaticStat<struct FStat_STAT_InitAttributeSetDefaults> StatPtr_STAT_InitAttributeSetDefaults” (?StatPtr_STAT_InitAttributeSetDefaults@@3U?$FThreadSafeStaticStat@UFStat_STAT_InitAttributeSetDefaults@@@@A)

I really have no idea where this is coming from. Like I said, no error in VS. I’ve got all of the includes and everything. This is my class that extends the AbilitySystemGlobals, and it’s where I’ve copy-pasted (and renamed) all of the functions from FAttributeSetInitterDiscreteLevels. I’ve attached all of the code in the file below, but there’s no real need to peruse it because, like I said, I only changed the names. The header file is also super simple and just overrides. I tried deleting my binaries folder but couldn’t rebuild my project after that (even though I could get it to compile in VS), so I restored it, and now I’m at a major loss as to what to do.

Any advice or anything on this to help? I would really appreciate it. I appreciate so much the work that you have already done with this tutorial.



#include "MLSTAbilitySystemGlobals.h"
#include "AttributeSet.h"
#include "Stats/StatsMisc.h"
#include "EngineDefines.h"
#include "Engine/Blueprint.h"
#include "AssetData.h"
#include "Engine/ObjectLibrary.h"
#include "VisualLogger/VisualLogger.h"
#include "AbilitySystemLog.h"
#include "GameplayEffectAggregator.h"
#include "AbilitySystemStats.h"
#include "UObject/UObjectHash.h"
#include "UObject/UObjectIterator.h"
#include "AbilitySystemGlobals.h"
#include "AbilitySystemComponent.h"
#include "AbilitySystemTestAttributeSet.h"




/** Initialize FAttributeSetInitter. This is virtual so projects can override what class they use */
void UMLSTAbilitySystemGlobals::AllocAttributeSetInitter()
{
    GlobalAttributeSetInitter = TSharedPtr<FAttributeSetInitter>(new FAttributeSetInitterBasic());
}











// FAttributeSetInitter Implementation

// ------------------------------------------------------------------------------------
//
// ------------------------------------------------------------------------------------
TSubclassOf<UAttributeSet> FAttributeSetInitterBasic::FindBestAttributeClassBasic(TArray<TSubclassOf<UAttributeSet> >& ClassList, FString PartialName)
{
    for (auto Class : ClassList)
    {
        if (Class->GetName().Contains(PartialName))
        {
            return Class;
        }
    }

    return nullptr;
}



void FAttributeSetInitterBasic::PreloadAttributeSetData(const TArray<UCurveTable*>& CurveData)
{
    if (!ensure(CurveData.Num() > 0))
    {
        return;
    }

    /**
     *    Get list of AttributeSet classes loaded
     */

    TArray<TSubclassOf<UAttributeSet> >    ClassList;
    for (TObjectIterator<UClass> ClassIt; ClassIt; ++ClassIt)
    {
        UClass* TestClass = *ClassIt;
        if (TestClass->IsChildOf(UAttributeSet::StaticClass()))
        {
            ClassList.Add(TestClass);
#if !(UE_BUILD_SHIPPING || UE_BUILD_TEST)
            // This can only work right now on POD attribute sets. If we ever support FStrings or TArrays in AttributeSets
            // we will need to update this code to not use memcpy etc.
            for (TFieldIterator<UProperty> PropIt(TestClass, EFieldIteratorFlags::IncludeSuper); PropIt; ++PropIt)
            {
                if (!PropIt->HasAllPropertyFlags(CPF_IsPlainOldData))
                {
                    ABILITY_LOG(Error, TEXT("FAttributeSetInitterBasic::PreloadAttributeSetData Unable to Handle AttributeClass %s because it has a non POD property: %s"),
                        *TestClass->GetName(), *PropIt->GetName());
                    return;
                }
            }
#endif
        }
    }

    /**
     *    Loop through the CurveData tables and build sets of defaults keyed off of Name + Level
     */
    for (const UCurveTable* CurTable : CurveData)
    {
        for (auto It = CurTable->RowMap.CreateConstIterator(); It; ++It)
        {
            FString RowName = It.Key().ToString();
            FString ClassName;
            FString SetName;
            FString AttributeName;
            FString Temp;

            RowName.Split(TEXT("."), &ClassName, &Temp);
            Temp.Split(TEXT("."), &SetName, &AttributeName);

            if (!ensure(!ClassName.IsEmpty() && !SetName.IsEmpty() && !AttributeName.IsEmpty()))
            {
                ABILITY_LOG(Verbose, TEXT("FAttributeSetInitterBasic::PreloadAttributeSetData Unable to parse row %s in %s"), *RowName, *CurTable->GetName());
                continue;
            }

            // Find the AttributeSet

            TSubclassOf<UAttributeSet> Set = FindBestAttributeClassBasic(ClassList, SetName);
            if (!Set)
            {
                // This is ok, we may have rows in here that don't correspond directly to attributes
                ABILITY_LOG(Verbose, TEXT("FAttributeSetInitterBasic::PreloadAttributeSetData Unable to match AttributeSet from %s (row: %s)"), *SetName, *RowName);
                continue;
            }

            // Find the UProperty
            UProperty* Property = FindField<UProperty>(*Set, *AttributeName);
            if (!IsSupportedProperty(Property))
            {
                ABILITY_LOG(Verbose, TEXT("FAttributeSetInitterBasic::PreloadAttributeSetData Unable to match Attribute from %s (row: %s)"), *AttributeName, *RowName);
                continue;
            }

            FRichCurve* Curve = It.Value();
            FName ClassFName = FName(*ClassName);
            FAttributeSetDefaultsCollection& DefaultCollection = Defaults.FindOrAdd(ClassFName);

            int32 LastLevel = Curve->GetLastKey().Time;
            DefaultCollection.LevelData.SetNum(FMath::Max(LastLevel, DefaultCollection.LevelData.Num()));

            //At this point we know the Name of this "class"/"group", the AttributeSet, and the Property Name. Now loop through the values on the curve to get the attribute default value at each level.
            for (auto KeyIter = Curve->GetKeyIterator(); KeyIter; ++KeyIter)
            {
                const FRichCurveKey& CurveKey = *KeyIter;

                int32 Level = CurveKey.Time;
                float Value = CurveKey.Value;

                FAttributeSetDefaults& SetDefaults = DefaultCollection.LevelData[Level - 1];

                FAttributeDefaultValueList* DefaultDataList = SetDefaults.DataMap.Find(Set);
                if (DefaultDataList == nullptr)
                {
                    ABILITY_LOG(Verbose, TEXT("Initializing new default set for %s[%d]. PropertySize: %d. DefaultSize: %d"), *Set->GetName(), Level, Set->GetPropertiesSize(), UAttributeSet::StaticClass()->GetPropertiesSize());

                    DefaultDataList = &SetDefaults.DataMap.Add(Set);
                }

                // Import curve value into default data

                check(DefaultDataList);
                DefaultDataList->AddPair(Property, Value);
            }
        }
    }
}

void FAttributeSetInitterBasic::InitAttributeSetDefaults(UAbilitySystemComponent* AbilitySystemComponent, FName GroupName, int32 Level, bool bInitialInit) const
{
    SCOPE_CYCLE_COUNTER(STAT_InitAttributeSetDefaults);
    check(AbilitySystemComponent != nullptr);

    const FAttributeSetDefaultsCollection* Collection = Defaults.Find(GroupName);
    if (!Collection)
    {
        ABILITY_LOG(Warning, TEXT("Unable to find DefaultAttributeSet Group %s. Falling back to Defaults"), *GroupName.ToString());
        Collection = Defaults.Find(FName(TEXT("Default")));
        if (!Collection)
        {
            ABILITY_LOG(Error, TEXT("FAttributeSetInitterBasic::InitAttributeSetDefaults Default DefaultAttributeSet not found! Skipping Initialization"));
            return;
        }
    }

    if (!Collection->LevelData.IsValidIndex(Level - 1))
    {
        // We could eventually extrapolate values outside of the max defined levels
        ABILITY_LOG(Warning, TEXT("Attribute defaults for Level %d are not defined! Skipping"), Level);
        return;
    }

    const FAttributeSetDefaults& SetDefaults = Collection->LevelData[Level - 1];
    for (const UAttributeSet* Set : AbilitySystemComponent->SpawnedAttributes)
    {
        const FAttributeDefaultValueList* DefaultDataList = SetDefaults.DataMap.Find(Set->GetClass());
        if (DefaultDataList)
        {
            ABILITY_LOG(Log, TEXT("Initializing Set %s"), *Set->GetName());

            for (auto& DataPair : DefaultDataList->List)
            {
                check(DataPair.Property);

                if (Set->ShouldInitProperty(bInitialInit, DataPair.Property))
                {
                    FGameplayAttribute AttributeToModify(DataPair.Property);
                    AbilitySystemComponent->SetNumericAttributeBase(AttributeToModify, DataPair.Value);
                }
            }
        }
    }

    AbilitySystemComponent->ForceReplication();
}

void FAttributeSetInitterBasic::ApplyAttributeDefault(UAbilitySystemComponent* AbilitySystemComponent, FGameplayAttribute& InAttribute, FName GroupName, int32 Level) const
{
    SCOPE_CYCLE_COUNTER(STAT_InitAttributeSetDefaults);

    const FAttributeSetDefaultsCollection* Collection = Defaults.Find(GroupName);
    if (!Collection)
    {
        ABILITY_LOG(Warning, TEXT("Unable to find DefaultAttributeSet Group %s. Falling back to Defaults"), *GroupName.ToString());
        Collection = Defaults.Find(FName(TEXT("Default")));
        if (!Collection)
        {
            ABILITY_LOG(Error, TEXT("FAttributeSetInitterBasic::ApplyAttributeDefault Default DefaultAttributeSet not found! Skipping Initialization"));
            return;
        }
    }

    if (!Collection->LevelData.IsValidIndex(Level - 1))
    {
        // We could eventually extrapolate values outside of the max defined levels
        ABILITY_LOG(Warning, TEXT("Attribute defaults for Level %d are not defined! Skipping"), Level);
        return;
    }

    const FAttributeSetDefaults& SetDefaults = Collection->LevelData[Level - 1];
    for (const UAttributeSet* Set : AbilitySystemComponent->SpawnedAttributes)
    {
        const FAttributeDefaultValueList* DefaultDataList = SetDefaults.DataMap.Find(Set->GetClass());
        if (DefaultDataList)
        {
            ABILITY_LOG(Log, TEXT("Initializing Set %s"), *Set->GetName());

            for (auto& DataPair : DefaultDataList->List)
            {
                check(DataPair.Property);

                if (DataPair.Property == InAttribute.GetUProperty())
                {
                    FGameplayAttribute AttributeToModify(DataPair.Property);
                    AbilitySystemComponent->SetNumericAttributeBase(AttributeToModify, DataPair.Value);
                }
            }
        }
    }

    AbilitySystemComponent->ForceReplication();
}

TArray<float> FAttributeSetInitterBasic::GetAttributeSetValues(UClass* AttributeSetClass, UProperty* AttributeProperty, FName GroupName) const
{
    TArray<float> AttributeSetValues;
    const FAttributeSetDefaultsCollection* Collection = Defaults.Find(GroupName);
    if (!Collection)
    {
        ABILITY_LOG(Error, TEXT("FAttributeSetInitterBasic::GetAttributeSetValues DefaultAttributeSet Group %s not found! Returning no values"), *GroupName.ToString());
        return TArray<float>();
    }

    for (const FAttributeSetDefaults& SetDefaults : Collection->LevelData)
    {
        const FAttributeDefaultValueList* DefaultDataList = SetDefaults.DataMap.Find(AttributeSetClass);
        if (DefaultDataList)
        {
            for (auto& DataPair : DefaultDataList->List)
            {
                check(DataPair.Property);
                if (DataPair.Property == AttributeProperty)
                {
                    AttributeSetValues.Add(DataPair.Value);
                }
            }
        }
    }
    return AttributeSetValues;
}


bool FAttributeSetInitterBasic::IsSupportedProperty(UProperty* Property) const
{
    return (Property && (Cast<UNumericProperty>(Property) || FGameplayAttribute::IsGameplayAttributeDataProperty(Property)));
}

Hi, follow-up to my last post.

I figured out my prior problem by removing the “SCOPE_CYCLE_COUNTER” line from all functions in my subclass of the AbilitySystemGlobals.

Since then there have been countless additional problems, which I've worked through one by one to isolate the current one:

My Curve Tables are not loading. (These are just discrete-level curve tables, nothing exciting or strange about them.) I detected this by overriding my InitAttributeDefaults() function and sprinkling logs throughout it to see where the problem occurs.

My DefaultGame.ini has this section:



[/Script/GameplayAbilities.AbilitySystemGlobals]
GlobalAttributeSetDefaultsTableNames=/Maelstrom/Content/GameplayAbilities/AttributeTables/BAS_Curve.BAS_Curve
+GlobalAttributeSetDefaultsTableNames=/Maelstrom/Content/GameplayAbilities/AttributeTables/TestCurve.TestCurve
+AbilitySystemGlobalsClassName=/Script/Zangies.MLSTAbilitySystemGlobals

I also tried it before with “/Game/” instead of “/Maelstrom/” and there was no change.

My logs show that the two paths here are found, but that they fail to load anything meaningful from them. This is an example of one of my curve tables:
Row Name               1
Default.Health         100
Default.HealthMax      100
Default.HealthRegen    0.01
Default.MoveSpeed      1
Default.StealthLevel   1
Default.Timescale      1



    // Handle array of global curve tables for attribute defaults
    for (const FSoftObjectPath& AttribDefaultTableName : GlobalAttributeSetDefaultsTableNames)
    {
        UE_LOG(LogTemp, Warning, TEXT("INIT: Starting loop for global curve tables..."));        // currently triggers, twice
        if (AttribDefaultTableName.IsValid())
        {
            UE_LOG(LogTemp, Warning, TEXT("INIT: Found valid table!"));        // currently triggers, twice

            // Load once and reuse the result instead of calling TryLoad() twice
            UObject* LoadedObject = AttribDefaultTableName.TryLoad();
            if (LoadedObject)
            {
                UE_LOG(LogTemp, Warning, TEXT("INIT: Loaded something valid."));        // NOT triggering
            }

            UCurveTable* AttribTable = Cast<UCurveTable>(LoadedObject);
            if (AttribTable)
            {
                UE_LOG(LogTemp, Warning, TEXT("INIT: AttribTable found."));       // NOT triggering
                GlobalAttributeDefaultsTables.Add(AttribTable);
                bLoadedAnyDefaults = true;
            }
            else
            {
                UE_LOG(LogTemp, Warning, TEXT("INIT: Valid table failed to provide valid AttribTable..."));    // currently triggers, twice
            }
        }
    }

    if (bLoadedAnyDefaults)
    {
        UE_LOG(LogTemp, Warning, TEXT("INIT: Loaded any defaults!"));
        // Subscribe for reimports if in the editor
#if WITH_EDITOR
        if (GIsEditor && !RegisteredReimportCallback)
        {
//            GEditor->OnObjectReimported().AddUObject(this, &UAbilitySystemGlobals::OnTableReimported);
            RegisteredReimportCallback = true;
        }
#endif

        ReloadAttributeDefaults();
    }
    else
    {
        UE_LOG(LogTemp, Warning, TEXT("INIT: Failed to load any defaults..."));            // currently triggers
    }
}

With that code, when I turn on Play in Editor, my log has these messages:
LogTemp: Warning: INIT: Starting loop for global curve tables…
LogTemp: Warning: INIT: Found valid table!
LogTemp: Warning: INIT: Valid table failed to provide valid AttribTable…
LogTemp: Warning: INIT: Starting loop for global curve tables…
LogTemp: Warning: INIT: Found valid table!
LogTemp: Warning: INIT: Valid table failed to provide valid AttribTable…
LogTemp: Warning: INIT: Failed to load any defaults…

I am really stuck here. I’ve never messed around with loading files from ini files in UE4 before, and I’m really out of ideas on how to troubleshoot this and fix it.

Any help would be immensely appreciated!!

One more update…

So, I solved that problem: the config path I needed was "/Game/GameplayAbilities/AttributeTables/BAS_Curve.BAS_Curve".
I also updated my table rows so that instead of Default.Health they read Default.Basic.Health (the set name comes from my attribute set, AttributeSet_Basic).

And I’ve messed around a thousand times more, and finally there is just one thing that is stopping me. It’s got the data on the table and everything, but for some reason I have a compile error with the AllocAttributeSetInitter() override.


void UMLSTAbilitySystemGlobals::AllocAttributeSetInitter()
{
    Super::AllocAttributeSetInitter();
}

If I call the Super, everything works, but then it's using FAttributeSetInitterDiscreteLevels instead of my own initter (which is the whole point of the change!).

However, when I take the exact same code from AbilitySystemGlobals.cpp and put it into mine, as follows:


{
    GlobalAttributeSetInitter = TSharedPtr<FAttributeSetInitter>(new FAttributeSetInitterDiscreteLevels FAttributeSetInitterDiscreteLevels());
}

I get a compile error:
D:\Unreal Projects\Maelstrom\Source\Zangies\Abilities\MLSTAbilitySystemGlobals.cpp(342) : error C2146: syntax error: missing ')' before identifier 'FAttributeSetInitterDiscreteLevels'
D:\Unreal Projects\Maelstrom\Source\Zangies\Abilities\MLSTAbilitySystemGlobals.cpp(342) : error C2146: syntax error: missing ';' before identifier 'FAttributeSetInitterDiscreteLevels'
D:\Unreal Projects\Maelstrom\Source\Zangies\Abilities\MLSTAbilitySystemGlobals.cpp(342) : error C2059: syntax error: ')'

This happens whether I'm using the regular DiscreteLevels one or my own initter. I've added every #include I can think of, and I've tried forward declaring them (e.g. `new struct FAttributeSet…`), but nothing changes.

If I change it to this:


{
    GlobalAttributeSetInitter = TSharedPtr<FAttributeSetInitter>(new FAttributeSetInitterDiscreteLevels());
}

it compiles, but then the game crashes instantly on play.

Literally every inch of this process has not worked as it’s supposed to. >.<

Hey @SamPanda - I somehow managed to miss these posts. I know this is quite late, but if you haven't solved your issue yet let me know and we can take a look at it. In the meantime, I figured I'd share another useful function that we've been using internally on our project. The context is as follows:

  1. We have stat bars on our HUD that represent our player attributes
  2. We want these to interpolate smoothly
  3. This is difficult because there can be any number of gameplay effects modifying the attribute in question (I’ll use stamina as an example)
  4. If Stamina recovery ticks every 2s and gives the player 20 stamina, we need to interpolate at a speed of 10 stamina per second in order to stay in sync with the actual stamina value
  5. This is problematic of course because there can be a debuff that drains stamina or anything really that modifies the attribute

In order to solve this problem we’ve implemented a function to figure out when the next stat update is going to occur and make sure our stat bar value interpolates at a speed that will ensure it matches the real attribute value at the time when the next update hits. The function to get the next update time of an attribute looks like this:



float UMyAbilitySystemComponent::K2_GetNextAttributeChangeTime(const FGameplayAttribute Attribute) const
{
    return GetNextAttributeChangeTime(Attribute);
}

float UMyAbilitySystemComponent::GetNextAttributeChangeTime(const FGameplayAttribute& Attribute) const
{
    float NextPeriod, Duration;
    FGameplayEffectQuery Query;
    Query.ModifyingAttribute = Attribute;

    if (GetActiveEffectsNextPeriodAndDuration(Query, NextPeriod, Duration))
    {
        return NextPeriod;
    }

    return -1.f;
}

bool UMyAbilitySystemComponent::GetActiveEffectsNextPeriodAndDuration(const FGameplayEffectQuery& Query, 
    float& NextPeriod, float& Duration) const
{
    const TArray<FActiveGameplayEffectHandle> ActiveEffects = GetActiveEffects(Query);

    bool bFoundSomething = false;
    float MinPeriod = TNumericLimits<float>::Max();
    float MaxEndTime = -1.f;

    UWorld* World = GetWorld();
    if (World == nullptr)
    {
        return false;
    }

    FTimerManager& WTM = World->GetTimerManager();

    for (const FActiveGameplayEffectHandle& Handle : ActiveEffects)
    {
        // GetActiveGameplayEffect can return nullptr for a stale handle, so guard before dereferencing
        const FActiveGameplayEffect* EffectPtr = ActiveGameplayEffects.GetActiveGameplayEffect(Handle);
        if (EffectPtr == nullptr || !Query.Matches(*EffectPtr))
        {
            continue;
        }
        const FActiveGameplayEffect& Effect = *EffectPtr;

        float ThisEndTime = Effect.GetEndTime();

        float ThisPeriod = WTM.GetTimerRemaining(Effect.PeriodHandle);
        if (ThisPeriod <= UGameplayEffect::INFINITE_DURATION)
        {
            // This effect has no period, check how long it has remaining
            float ThisTimeRemaining = Effect.GetTimeRemaining(World->GetTimeSeconds());
            if (ThisTimeRemaining <= UGameplayEffect::INFINITE_DURATION)
            {
                //It's neither period nor has a duration, not interested.
                continue;
            }

            bFoundSomething = true;
            MinPeriod = FMath::Min(ThisTimeRemaining, MinPeriod);
        }
        else
        {
            bFoundSomething = true;
            MinPeriod = FMath::Min(ThisPeriod, MinPeriod);
        }

        if (ThisEndTime > MaxEndTime)
        {
            MaxEndTime = ThisEndTime;
            Duration = Effect.GetDuration();
        }

    }

    NextPeriod = MinPeriod;

    return bFoundSomething;
}



Hope this helps someone!

Thanks for this guide, very helpful. I recently inherited a code base using this GameplayAbilities system so your analysis has been critical to my understanding. I recently updated to 4.22 and have run into an issue because the CurveTable API was updated to include FRealCurve and FSimpleCurve in addition to FRichCurve.

The engine is failing this assertion:


const TMap<FName, FRichCurve*>& GetRichCurveRowMap() const { check(CurveTableMode != ECurveTableMode::SimpleCurves); return *reinterpret_cast<const TMap<FName, FRichCurve*>*>(&RowMap); }

It seems my CurveTable assets generated from .csv are of ECurveTableMode Simple Curves, so the assertion fails. The game crashes in PIE because GetRichCurveRowMap() is being called somewhere in the engine code for GameplayAbilities. Any ideas how I might fix this?

Hey @kigbariom - glad that this helped. Yes, with the 4.22 update they have changed the API for querying curves. I just went back to their FAttributeSetInitterDiscreteLevels and airlifted the code from there. The updated code is below. I am going on memory here so if I miss some line that changed just take a look at the default initter or ask here.


//Iterating over all the curves and getting the row names
for (const TPair<FName, FRealCurve*>& CurveRow : CurTable->GetRowMap())
{
    FString RowName = CurveRow.Key.ToString();
    ...

    //Getting the last level, i.e. the maximum X axis value of the curve
    int32 LastLevel = Curve->GetKeyTime(Curve->GetLastKeyHandle());

    //Iterating over the keys of the curve
    for (auto KeyIter = Curve->GetKeyHandleIterator(); KeyIter; ++KeyIter)
    {
        const FKeyHandle& KeyHandle = *KeyIter;
        TPair<float, float> LevelValuePair = Curve->GetKeyTimeValuePair(KeyHandle);
        int32 Level = LevelValuePair.Key;
        float Value = LevelValuePair.Value;
        ...
    }
}


I think those are all of the changes they made.

Hello @DamirH !
First of all, thanks a lot for your great posts, I will probably use GameplayAbilities soon in my project. Before that, though, I had a very important question, and judging by your replies in the original thread you seem to be the best person to answer it: is this system suitable for BTA games like Bayonetta/Devil May Cry and the like?

I planned to do a combo system which heavily relies on Animation Notifications, kinda like what Platinum Games did for Nier Automata, using some flags and such, and I was wondering if this approach was compatible with GameplayAbilities.

Well, considering we're doing something not unlike that with our own game: yes, it can be adapted to this. The system is very generic and you will have to define A LOT of game-specific stuff, but it can work.

Thanks a lot! Good luck with your project :slight_smile:

Our project requires each FGameplayAttributeData within an AttributeSet to update its current value based on a point within a CurveFloat asset. This is so we can have custom curves for every attribute on our characters; the InTime for the CurveFloat lookup will be the character's "level".

It seems we need to create our own version of FGameplayAttributeData that returns CurrentValue based on a lookup into a float curve defined on the character, likely in a datatable, and possibly create our own FAttributeSetInitter as well for the default value lookup. Has anyone attempted something like this before, or have any advice? I'm a bit concerned about the performance of this, since the "level" will be changing constantly, and I noticed the CurveTable is preloaded in the default system.