Comprehensive GameplayAbilities Analysis Series

Ah I see, so a plain old “it’s just the newer, tidier implementation”. Gotcha. You actually can get both regular value and base value of a float attribute relatively easily if the helper functions you can call within blueprints are any indication, but I actually don’t know if that is particularly expensive. I’d have to check.

Either way, as I said, it shouldn’t really make that big of a difference one way or the other, switching from float to structs should be no big rewrite. UE4’s macros thankfully have the courtesy to work pretty much identically on single variables and structs alike for most purposes such as replication and general UPROPERTY tags.
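As a non-authoritative sketch of what that switch looks like (UMyAttributeSet and the OnRep function are placeholder names; ATTRIBUTE_ACCESSORS is the helper macro defined in the ActionRPG sample, not in the engine itself):

```cpp
// Sketch: a float attribute migrated to the FGameplayAttributeData struct.
// The UPROPERTY and replication markup stay essentially the same.
UPROPERTY(BlueprintReadOnly, Category = "Attributes", ReplicatedUsing = OnRep_Health)
FGameplayAttributeData Health;

// Generates GetHealth()/SetHealth()/InitHealth() plus a static
// GetHealthAttribute() property getter (macro from the ActionRPG sample).
ATTRIBUTE_ACCESSORS(UMyAttributeSet, Health)
```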

I guess I might change/mention that in my guide as well then.

Well honestly, regardless of the real performance overhead, having to “undo” a bunch of GEs to get the base value is always more error-prone than to just have it lying around there at O(1) access time and at the cost of a few bits more. :stuck_out_tongue: Tomorrow I’ll probably be expanding this thread with a few new findings regarding “non-hardcoded” attributes that I have to deal with (and use it as a nice segue into GEs).
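For the record, both values can be read in O(1) straight off the ASC; a sketch, assuming accessor macros like the ActionRPG sample's that generate GetHealthAttribute():

```cpp
// Current value = base value with all active GameplayEffect modifiers
// applied; the base value is stored separately, so no GEs are "undone".
UAbilitySystemComponent* ASC = /* your component */;
const float Current = ASC->GetNumericAttribute(UMyAttributeSet::GetHealthAttribute());
const float Base    = ASC->GetNumericAttributeBase(UMyAttributeSet::GetHealthAttribute());
```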

So I am trying something new. I have a few sets of attributes that behave in similar ways; for our purposes let’s talk about Health and Energy. I would like to reuse the common functionality, and I tried to do so by inheriting from a common parent. When I did this via inheritance, the editor could only see the values in the parent class, not the child classes. And since, as far as I can tell, you cannot use multiple instances of an attribute set in a class, I don’t know how to do it. Instead I have built one class, then copied all the code over and over and replaced the names.

Is there another way to do this?

Regards,

giffen

Hi all,
when I use the config to set attribute default values, I get an error like:

#if !(UE_BUILD_SHIPPING || UE_BUILD_TEST)
    // This can only work right now on POD attribute sets. If we ever support FStrings or TArrays in AttributeSets
    // we will need to update this code to not use memcpy etc.
    for (TFieldIterator<UProperty> PropIt(TestClass, EFieldIteratorFlags::IncludeSuper); PropIt; ++PropIt)
    {
        if (!PropIt->HasAllPropertyFlags(CPF_IsPlainOldData))
        {
            ABILITY_LOG(Error, TEXT("FAttributeSetInitterDiscreteLevels::PreloadAttributeSetData Unable to Handle AttributeClass %s because it has a non POD property: %s"),
                *TestClass->GetName(), *PropIt->GetName());
            return;
        }
    }
#endif

But I checked my CurveTable struct and found no error, and if I delete the check (the whole #if !(UE_BUILD_SHIPPING || UE_BUILD_TEST) block), the build succeeds. What is the issue?

Hey,

The issue is that the default FAttributeSetInitter relies on your attribute sets using plain floats instead of FGameplayAttributeData structs. You will have to implement your own initter and allocate that instead of the default one in your AbilitySystemGlobals.

Not that I know of, but I don’t think these were ever meant to be inherited. What is your common stuff? If it’s only functions, then you could delegate those to some function library.
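To illustrate the function-library idea (all names here are hypothetical): keep a separate UPROPERTY-declared set per resource so the editor sees every attribute, and push only the shared math into static helpers, e.g.:

```cpp
// Hypothetical shared helpers; each concrete set (UHealthSet, UEnergySet)
// declares its own attributes but forwards common logic here.
struct FResourceAttributeHelpers
{
    // Clamp a resource-style attribute into [0, Max] after a change.
    static void ClampResource(FGameplayAttributeData& Value, const FGameplayAttributeData& Max)
    {
        Value.SetBaseValue(FMath::Clamp(Value.GetBaseValue(), 0.f, Max.GetCurrentValue()));
        Value.SetCurrentValue(FMath::Clamp(Value.GetCurrentValue(), 0.f, Max.GetCurrentValue()));
    }
};

// e.g. inside UHealthSet::PostGameplayEffectExecute:
//     FResourceAttributeHelpers::ClampResource(Health, HealthMax);
```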

I also wanted to take this chance to post a small(ish) update to this thread. By now everyone interested in GA has seen the ActionRPG sample and since my own needs have evolved over time I’ve adopted a similar system for applying entire sets of GameplayEffects via a TMap (just how it’s done in the ActionRPG sample). However I’ve run into a snag and the solution to that snag involves some digging around so here’s what I found. This assumes that you’ve dug through the ActionRPG sample as I’ll be referencing code from there.

The problem: Even though my Gameplay Effects have a gameplay cue defined, and it works great when I hit enemies with said gameplay effects… it doesn’t help me if I hit something that doesn’t have a UAbilitySystemComponent, e.g. a wall, a rock, or a destructible mesh.

The solution (short version): If something is hit that is not an enemy with an ASC (Ability System Component), play the hit cue manually on the OWNING ASC instead of the target.

The solution (long version):

The problem essentially stems from the fact that when applying a GE to a target, it will just short out and fail if that target doesn’t have an ASC on it. The first step of the solution is to cache away those actors. I did that in my gameplay ability, where I (like the ActionRPG sample) build up the target data:


if (Container.TargetType.Get() != nullptr)
{
    TArray<FHitResult> HitResults;
    TArray<AActor*> TargetActors;
    const UWaTargetType* TargetTypeCDO = Container.TargetType.GetDefaultObject();
    TargetTypeCDO->GetTargets(OwningWaCharacter, EventData, HitResults, TargetActors);
    ReturnSpec.AddTargets(HitResults, TargetActors);

    //This is new
    for (const FHitResult& Result : HitResults)
    {
        if (UAbilitySystemGlobals::GetAbilitySystemComponentFromActor(Result.GetActor()) == nullptr)
        {
            NonAbilityTargets.Emplace(FNonAbilityTarget(CueTags, Result));
        }
    }

    for (AActor* Act : TargetActors)
    {
        if (UAbilitySystemGlobals::GetAbilitySystemComponentFromActor(Act) == nullptr)
        {
            NonAbilityTargets.Emplace(FNonAbilityTarget(CueTags, Act));
        }
    }
    //New ends here
}

Basically I am taking any actors I have, checking whether they have an ASC, and if not, storing them in my own NonAbilityTargets array, which looks like this:


//A struct for temporary holding of actors (and transforms) of actors that we hit
//that don't have an ASC. Used for environment impact GameplayCues.
struct FNonAbilityTarget
{
    FGameplayTagContainer CueContainer;
    TWeakObjectPtr<AActor> TargetActor;
    FHitResult TargetHitResult;
    bool bHasHitResult;

public:
    FNonAbilityTarget()
        : CueContainer(FGameplayTagContainer())
        , TargetActor(nullptr)
        , TargetHitResult(FHitResult(ENoInit::NoInit))
        , bHasHitResult(false)
    {
    }

    FNonAbilityTarget(const FGameplayTagContainer& InCueTags, const FHitResult& InResult)
        : CueContainer(InCueTags)
        , TargetActor(TWeakObjectPtr<AActor>(InResult.GetActor()))
        , TargetHitResult(InResult)
        , bHasHitResult(true)
    {
    }

    FNonAbilityTarget(const FGameplayTagContainer& InCueTags, AActor* InActor)
        : CueContainer(InCueTags)
        , TargetActor(TWeakObjectPtr<AActor>(InActor))
        , TargetHitResult(FHitResult(ENoInit::NoInit))
        , bHasHitResult(false)
    {
    }
};

//Note the inline allocator. This will put 1 element on the stack and the rest (if any) on the heap. Since you'll be hitting a single target 90% of the time this is a big performance win
TArray<FNonAbilityTarget, TInlineAllocator<1>> NonAbilityTargets;

Last but not least, where the gameplay effects actually get applied (the ApplyEffectContainerSpec function in the ActionRPG sample), I have this additional code:


check(CurrentActorInfo);
for (const FNonAbilityTarget& NAT : NonAbilityTargets)
{
    FGameplayCueParameters GCParams;
    UMyAbilitySystemGlobals& WASG = static_cast<UMyAbilitySystemGlobals&>(UMyAbilitySystemGlobals::Get());

    if (NAT.bHasHitResult)
    {
        WASG.InitGameplayCueParameters_HitResult(GCParams, this, NAT.TargetHitResult);
    }
    else
    {
        WASG.InitGameplayCueParameters_Actor(GCParams, this, NAT.TargetActor.Get());
    }

    for (auto It = NAT.CueContainer.CreateConstIterator(); It; ++It)
    {
        const FGameplayTag& Tag = *It;
        GCParams.OriginalTag = Tag;
        CurrentActorInfo->AbilitySystemComponent->ExecuteGameplayCue(Tag, GCParams);
    }
}

This basically just creates an FGameplayCueParameters struct, populates all the required info, and fires off ExecuteGameplayCue on the owner ASC, since the target doesn’t have one. The magic happens in the population of the FGameplayCueParameters struct. If we look at the DEFAULT UAbilitySystemGlobals we find this:


    /** Initialize GameplayCue Parameters */
    virtual void InitGameplayCueParameters(FGameplayCueParameters& CueParameters, const FGameplayEffectSpecForRPC &Spec);
    virtual void InitGameplayCueParameters_GESpec(FGameplayCueParameters& CueParameters, const FGameplayEffectSpec &Spec);
    virtual void InitGameplayCueParameters(FGameplayCueParameters& CueParameters, const FGameplayEffectContextHandle& EffectContext);

So this is a design paradigm already present in the system. Their own default implementation just takes variables from the function parameters and puts them into the outgoing CueParameters struct, like so:


void UAbilitySystemGlobals::InitGameplayCueParameters(FGameplayCueParameters& CueParameters, const FGameplayEffectContextHandle& EffectContext)
{
    if (EffectContext.IsValid())
    {
        // Copy Context over wholesale. Projects may want to override this and not copy over all data
        CueParameters.EffectContext = EffectContext;
    }
}

Having seen this, it was a simple matter of creating my own variations that extract data from actors, hit results, etc.:


virtual void InitGameplayCueParameters_Transform(FGameplayCueParameters& CueParameters, UGameplayAbility* Ability,  const FTransform& DestinationTransform);
virtual void InitGameplayCueParameters_HitResult(FGameplayCueParameters& CueParameters, UGameplayAbility* Ability, const FHitResult& HitResult);
virtual void InitGameplayCueParameters_Actor(FGameplayCueParameters& CueParameters, UGameplayAbility* Ability, const AActor* InTargetActor);


That’s all there is to it… almost. Some special attention needs to be paid to the HitResult and Actor versions of said functions (relevant comments in the code):


void UMyAbilitySystemGlobals::InitGameplayCueParameters_HitResult(FGameplayCueParameters& CueParameters,
    UGameplayAbility* Ability, const FHitResult& HitResult)
{
    if (Ability == nullptr)
    {
        return;
    }

    FGameplayAbilityActorInfo CurrentActorInfo = Ability->GetActorInfo();
    check(CurrentActorInfo.AbilitySystemComponent.IsValid());

    CueParameters.AbilityLevel = Ability->GetAbilityLevel();
    CueParameters.EffectCauser = CurrentActorInfo.AvatarActor;
    CueParameters.EffectContext = CurrentActorInfo.AbilitySystemComponent->MakeEffectContext();
    CueParameters.Instigator = CurrentActorInfo.OwnerActor;
    CueParameters.SourceObject = Ability;

    //My gameplay cues, namely hit impacts, depend on the location and hit normal. In this case, we just extract this stuff from the hit result, as seen below.
    CueParameters.Location = HitResult.Location;
    CueParameters.Normal = HitResult.ImpactNormal;
    CueParameters.TargetAttachComponent = HitResult.GetComponent();
    CueParameters.PhysicalMaterial = HitResult.PhysMaterial;

    //Important: Even though this does not come from a GameplayEffect, we create a context a couple of lines above and here we add the hit result
    //to the context, mainly because gameplay cue notifies look for the hit result inside the effect context to know where they should spawn particles etc.
    CueParameters.EffectContext.AddHitResult(HitResult);
}

void UMyAbilitySystemGlobals::InitGameplayCueParameters_Actor(FGameplayCueParameters& CueParameters, 
    UGameplayAbility* Ability, const AActor* InTargetActor)
{
    if (Ability == nullptr)
    {
        return;
    }

    FGameplayAbilityActorInfo CurrentActorInfo = Ability->GetActorInfo();
    check(CurrentActorInfo.AbilitySystemComponent.IsValid());

    CueParameters.AbilityLevel = Ability->GetAbilityLevel();
    CueParameters.EffectCauser = CurrentActorInfo.AvatarActor;
    CueParameters.EffectContext = CurrentActorInfo.AbilitySystemComponent->MakeEffectContext();
    CueParameters.Instigator = CurrentActorInfo.OwnerActor;
    CueParameters.SourceObject = Ability;

     //Since we don't have a hit result in the actor version, we do our best estimates by...

    //...calculating the location and direction with simple center-to-center math...
    CueParameters.Location = InTargetActor->GetActorLocation();
    CueParameters.Normal =
        (CurrentActorInfo.AvatarActor->GetActorLocation() - CueParameters.Location).GetSafeNormal();

    //...use the target's root as an attachment point
    CueParameters.TargetAttachComponent = InTargetActor->GetRootComponent();

    //...and if the root is an actual primitive (it will be in most cases unless you FUBAR'd something)...
    if (UPrimitiveComponent* TargetPrimitive = Cast<UPrimitiveComponent>(CueParameters.TargetAttachComponent))
    {
        //...just get the first material off of it...
        UMaterialInterface* MInt = (TargetPrimitive->GetNumMaterials() > 0) ?
            TargetPrimitive->GetMaterial(0) : nullptr;

        //...and get its physical material...
        if (MInt != nullptr)
        {
            CueParameters.PhysicalMaterial = TWeakObjectPtr<UPhysicalMaterial>(MInt->GetPhysicalMaterial());
        }
    }

    //...unless there are no materials or physical materials defined, in which case just get the engine default.
    if (!CueParameters.PhysicalMaterial.IsValid())
    {
        CueParameters.PhysicalMaterial = GEngine->DefaultPhysMaterial;
    }
}

Hope this helps someone, as it’s one area where I feel you kinda have to fight the GA system to get proper cues on “plain old objects”.

Hi, if you could please help me I would appreciate it. I have been having a truly awful time trying to get this attribute system to use the defaults that I want it to. I’ve spent days on this, tearing my hair out trying different tutorials, and at this point I can’t take it anymore.

I am not trying to do anything insane. I literally just want to initialize a base set of values for my attributes, not one that varies by level or curve or any of that. I just want to set MoveSpeed to 1, TimeScale to 1, etc. for ALL characters. I already set up a GameplayEffect to set specific values like Max Health based on values on my character blueprints. I tried to use a DataTable before, but that wasn’t working, so I figured I’d do it the way they want and use a CurveTable, but that also isn’t working.

The problem is that my game has a very very large amount of attributes, and I don’t want to set all of them on a GameplayEffect because it could load a hundred attributes for every monster, have to replicate those, etc. and I’m not super familiar with how costly all of that is. I figured that it would be infinitely cheaper to have them just use a basic defaults set that would all get done as a batch initially and not have to be replicated/other issues.

What’s happened:
I’ve followed your tutorial as well as I can, though I am not changing the curve type for the table. I’ve extended the GameInstance, the AbilitySystemGlobals, the FAttributeSetInitter, all of it. I modified virtually no functionality from the FAttributeSetInitterDiscreteLevels, though I still set it up so that I could change it because I read what you wrote that their version is deprecated and only works with the old float values system.

I’m stuck right now. Two issues.
1 - When you said “The issue is that the default FAttributeSetInitter relies on your attribute sets to use floats instead of FAttributes. You will have to implement your own initter and allocate that instead of the default one in your AbilitySystemGlobals.” ---- I have implemented my own initter and allocated it, but I don’t know what code I should change to fix this problem. Do I just remove the POD check, or will that cause an issue down the line? I am not advanced with C++ and really can’t tell just from looking at it why that check is even there, so I am not sure what it affects.
If you could please just let me know specifically how I could get around that problem, or if it is even something I need to worry about, I would greatly appreciate it.

2 - I am currently, after following through with all of the changes to .ini files and extensions of classes and everything, able to compile in VS. However, when I click to compile in UE4, it gives me an error. I have attached a picture of this error, but below is the longest part:

MLSTAbilitySystemGlobals.cpp.obj : error LNK2001: unresolved external symbol “struct FThreadSafeStaticStat<struct FStat_STAT_InitAttributeSetDefaults> StatPtr_STAT_InitAttributeSetDefaults” (?StatPtr_STAT_InitAttributeSetDefaults@@3U?$FThreadSafeStaticStat@UFStat_STAT_InitAttributeSetDefaults@@@@A)

I really have no idea where this is coming from. Like I said, no error in VS. I’ve got all of the includes and everything. This is my class that extends the AbilitySystemGlobals, and it’s where I’ve copy-pasted (and renamed) all of the functions from FAttributeSetInitterDiscreteLevels. I’ve attached all of the code in the file below, but there’s no real need to peruse it because, like I said, I only changed the names. The header file is also super simple and just contains overrides. I tried deleting my Binaries folder but couldn’t rebuild my project after that (even though I could get it to compile in VS), so I restored it, and now I’m at a major loss as to what to do.

Any advice or anything on this to help? I would really appreciate it. I appreciate so much the work that you have already done with this tutorial.



#include "MLSTAbilitySystemGlobals.h"
#include "AttributeSet.h"
#include "Stats/StatsMisc.h"
#include "EngineDefines.h"
#include "Engine/Blueprint.h"
#include "AssetData.h"
#include "Engine/ObjectLibrary.h"
#include "VisualLogger/VisualLogger.h"
#include "AbilitySystemLog.h"
#include "GameplayEffectAggregator.h"
#include "AbilitySystemStats.h"
#include "UObject/UObjectHash.h"
#include "UObject/UObjectIterator.h"
#include "AbilitySystemGlobals.h"
#include "AbilitySystemComponent.h"
#include "AbilitySystemTestAttributeSet.h"




/** Initialize FAttributeSetInitter. This is virtual so projects can override what class they use */
void UMLSTAbilitySystemGlobals::AllocAttributeSetInitter()
{
    GlobalAttributeSetInitter = TSharedPtr<FAttributeSetInitter>(new FAttributeSetInitterBasic());
}

// FAttributeSetInitter Implementation

// ------------------------------------------------------------------------------------
//
// ------------------------------------------------------------------------------------
TSubclassOf<UAttributeSet> FAttributeSetInitterBasic::FindBestAttributeClassBasic(TArray<TSubclassOf<UAttributeSet> >& ClassList, FString PartialName)
{
    for (auto Class : ClassList)
    {
        if (Class->GetName().Contains(PartialName))
        {
            return Class;
        }
    }

    return nullptr;
}



void FAttributeSetInitterBasic::PreloadAttributeSetData(const TArray<UCurveTable*>& CurveData)
{
    if (!ensure(CurveData.Num() > 0))
    {
        return;
    }

    /**
     *    Get list of AttributeSet classes loaded
     */

    TArray<TSubclassOf<UAttributeSet> >    ClassList;
    for (TObjectIterator<UClass> ClassIt; ClassIt; ++ClassIt)
    {
        UClass* TestClass = *ClassIt;
        if (TestClass->IsChildOf(UAttributeSet::StaticClass()))
        {
            ClassList.Add(TestClass);
#if !(UE_BUILD_SHIPPING || UE_BUILD_TEST)
            // This can only work right now on POD attribute sets. If we ever support FStrings or TArrays in AttributeSets
            // we will need to update this code to not use memcpy etc.
            for (TFieldIterator<UProperty> PropIt(TestClass, EFieldIteratorFlags::IncludeSuper); PropIt; ++PropIt)
            {
                if (!PropIt->HasAllPropertyFlags(CPF_IsPlainOldData))
                {
                    ABILITY_LOG(Error, TEXT("FAttributeSetInitterBasic::PreloadAttributeSetData Unable to Handle AttributeClass %s because it has a non POD property: %s"),
                        *TestClass->GetName(), *PropIt->GetName());
                    return;
                }
            }
#endif
        }
    }

    /**
     *    Loop through CurveData table and build sets of Defaults that keyed off of Name + Level
     */
    for (const UCurveTable* CurTable : CurveData)
    {
        for (auto It = CurTable->RowMap.CreateConstIterator(); It; ++It)
        {
            FString RowName = It.Key().ToString();
            FString ClassName;
            FString SetName;
            FString AttributeName;
            FString Temp;

            RowName.Split(TEXT("."), &ClassName, &Temp);
            Temp.Split(TEXT("."), &SetName, &AttributeName);

            if (!ensure(!ClassName.IsEmpty() && !SetName.IsEmpty() && !AttributeName.IsEmpty()))
            {
                ABILITY_LOG(Verbose, TEXT("FAttributeSetInitterBasic::PreloadAttributeSetData Unable to parse row %s in %s"), *RowName, *CurTable->GetName());
                continue;
            }

            // Find the AttributeSet

            TSubclassOf<UAttributeSet> Set = FindBestAttributeClassBasic(ClassList, SetName);
            if (!Set)
            {
                // This is ok, we may have rows in here that don't correspond directly to attributes
                ABILITY_LOG(Verbose, TEXT("FAttributeSetInitterBasic::PreloadAttributeSetData Unable to match AttributeSet from %s (row: %s)"), *SetName, *RowName);
                continue;
            }

            // Find the UProperty
            UProperty* Property = FindField<UProperty>(*Set, *AttributeName);
            if (!IsSupportedProperty(Property))
            {
                ABILITY_LOG(Verbose, TEXT("FAttributeSetInitterBasic::PreloadAttributeSetData Unable to match Attribute from %s (row: %s)"), *AttributeName, *RowName);
                continue;
            }

            FRichCurve* Curve = It.Value();
            FName ClassFName = FName(*ClassName);
            FAttributeSetDefaultsCollection& DefaultCollection = Defaults.FindOrAdd(ClassFName);

            int32 LastLevel = Curve->GetLastKey().Time;
            DefaultCollection.LevelData.SetNum(FMath::Max(LastLevel, DefaultCollection.LevelData.Num()));

            //At this point we know the Name of this "class"/"group", the AttributeSet, and the Property Name. Now loop through the values on the curve to get the attribute default value at each level.
            for (auto KeyIter = Curve->GetKeyIterator(); KeyIter; ++KeyIter)
            {
                const FRichCurveKey& CurveKey = *KeyIter;

                int32 Level = CurveKey.Time;
                float Value = CurveKey.Value;

                FAttributeSetDefaults& SetDefaults = DefaultCollection.LevelData[Level - 1];

                FAttributeDefaultValueList* DefaultDataList = SetDefaults.DataMap.Find(Set);
                if (DefaultDataList == nullptr)
                {
                    ABILITY_LOG(Verbose, TEXT("Initializing new default set for %s[%d]. PropertySize: %d.. DefaultSize: %d"), *Set->GetName(), Level, Set->GetPropertiesSize(), UAttributeSet::StaticClass()->GetPropertiesSize());

                    DefaultDataList = &SetDefaults.DataMap.Add(Set);
                }

                // Import curve value into default data

                check(DefaultDataList);
                DefaultDataList->AddPair(Property, Value);
            }
        }
    }
}

void FAttributeSetInitterBasic::InitAttributeSetDefaults(UAbilitySystemComponent* AbilitySystemComponent, FName GroupName, int32 Level, bool bInitialInit) const
{
    SCOPE_CYCLE_COUNTER(STAT_InitAttributeSetDefaults);
    check(AbilitySystemComponent != nullptr);

    const FAttributeSetDefaultsCollection* Collection = Defaults.Find(GroupName);
    if (!Collection)
    {
        ABILITY_LOG(Warning, TEXT("Unable to find DefaultAttributeSet Group %s. Failing back to Defaults"), *GroupName.ToString());
        Collection = Defaults.Find(FName(TEXT("Default")));
        if (!Collection)
        {
            ABILITY_LOG(Error, TEXT("FAttributeSetInitterBasic::InitAttributeSetDefaults Default DefaultAttributeSet not found! Skipping Initialization"));
            return;
        }
    }

    if (!Collection->LevelData.IsValidIndex(Level - 1))
    {
        // We could eventually extrapolate values outside of the max defined levels
        ABILITY_LOG(Warning, TEXT("Attribute defaults for Level %d are not defined! Skipping"), Level);
        return;
    }

    const FAttributeSetDefaults& SetDefaults = Collection->LevelData[Level - 1];
    for (const UAttributeSet* Set : AbilitySystemComponent->SpawnedAttributes)
    {
        const FAttributeDefaultValueList* DefaultDataList = SetDefaults.DataMap.Find(Set->GetClass());
        if (DefaultDataList)
        {
            ABILITY_LOG(Log, TEXT("Initializing Set %s"), *Set->GetName());

            for (auto& DataPair : DefaultDataList->List)
            {
                check(DataPair.Property);

                if (Set->ShouldInitProperty(bInitialInit, DataPair.Property))
                {
                    FGameplayAttribute AttributeToModify(DataPair.Property);
                    AbilitySystemComponent->SetNumericAttributeBase(AttributeToModify, DataPair.Value);
                }
            }
        }
    }

    AbilitySystemComponent->ForceReplication();
}

void FAttributeSetInitterBasic::ApplyAttributeDefault(UAbilitySystemComponent* AbilitySystemComponent, FGameplayAttribute& InAttribute, FName GroupName, int32 Level) const
{
    SCOPE_CYCLE_COUNTER(STAT_InitAttributeSetDefaults);

    const FAttributeSetDefaultsCollection* Collection = Defaults.Find(GroupName);
    if (!Collection)
    {
        ABILITY_LOG(Warning, TEXT("Unable to find DefaultAttributeSet Group %s. Failing back to Defaults"), *GroupName.ToString());
        Collection = Defaults.Find(FName(TEXT("Default")));
        if (!Collection)
        {
            ABILITY_LOG(Error, TEXT("FAttributeSetInitterBasic::InitAttributeSetDefaults Default DefaultAttributeSet not found! Skipping Initialization"));
            return;
        }
    }

    if (!Collection->LevelData.IsValidIndex(Level - 1))
    {
        // We could eventually extrapolate values outside of the max defined levels
        ABILITY_LOG(Warning, TEXT("Attribute defaults for Level %d are not defined! Skipping"), Level);
        return;
    }

    const FAttributeSetDefaults& SetDefaults = Collection->LevelData[Level - 1];
    for (const UAttributeSet* Set : AbilitySystemComponent->SpawnedAttributes)
    {
        const FAttributeDefaultValueList* DefaultDataList = SetDefaults.DataMap.Find(Set->GetClass());
        if (DefaultDataList)
        {
            ABILITY_LOG(Log, TEXT("Initializing Set %s"), *Set->GetName());

            for (auto& DataPair : DefaultDataList->List)
            {
                check(DataPair.Property);

                if (DataPair.Property == InAttribute.GetUProperty())
                {
                    FGameplayAttribute AttributeToModify(DataPair.Property);
                    AbilitySystemComponent->SetNumericAttributeBase(AttributeToModify, DataPair.Value);
                }
            }
        }
    }

    AbilitySystemComponent->ForceReplication();
}

TArray<float> FAttributeSetInitterBasic::GetAttributeSetValues(UClass* AttributeSetClass, UProperty* AttributeProperty, FName GroupName) const
{
    TArray<float> AttributeSetValues;
    const FAttributeSetDefaultsCollection* Collection = Defaults.Find(GroupName);
    if (!Collection)
    {
        ABILITY_LOG(Error, TEXT("FAttributeSetInitterBasic::InitAttributeSetDefaults Default DefaultAttributeSet not found! Skipping Initialization"));
        return TArray<float>();
    }

    for (const FAttributeSetDefaults& SetDefaults : Collection->LevelData)
    {
        const FAttributeDefaultValueList* DefaultDataList = SetDefaults.DataMap.Find(AttributeSetClass);
        if (DefaultDataList)
        {
            for (auto& DataPair : DefaultDataList->List)
            {
                check(DataPair.Property);
                if (DataPair.Property == AttributeProperty)
                {
                    AttributeSetValues.Add(DataPair.Value);
                }
            }
        }
    }
    return AttributeSetValues;
}


bool FAttributeSetInitterBasic::IsSupportedProperty(UProperty* Property) const
{
    return (Property && (Cast<UNumericProperty>(Property) || FGameplayAttribute::IsGameplayAttributeDataProperty(Property)));
}

Hi, follow-up to my last post.

I figured out my prior problem by removing the “SCOPE_CYCLE_COUNTER” line from all functions in my subclass of the AbilitySystemGlobals.

Now there have been countless additional problems which I’ve had to work out one by one to try and isolate a specific problem:

My Curve Tables are not loading. (These are just discrete level curve tables, nothing exciting or strange about them). I have detected this by overriding my InitAttributeDefaults() function and filling it with logs at different parts, to see where the problem is.

My DefaultGame.ini has this section:



[/Script/GameplayAbilities.AbilitySystemGlobals]
GlobalAttributeSetDefaultsTableNames=/Maelstrom/Content/GameplayAbilities/AttributeTables/BAS_Curve.BAS_Curve
+GlobalAttributeSetDefaultsTableNames=/Maelstrom/Content/GameplayAbilities/AttributeTables/TestCurve.TestCurve
+AbilitySystemGlobalsClassName=/Script/Zangies.MLSTAbilitySystemGlobals

I also tried it before with “/Game/” instead of “/Maelstrom/” and there was no change.

My logs show that the two paths here are found, but that they fail to load anything meaningful from them. This is an example of one of my curve tables:
                         1
Default.Health           100
Default.HealthMax        100
Default.HealthRegen      0.01
Default.MoveSpeed        1
Default.StealthLevel     1
Default.Timescale        1


    // Handle array of global curve tables for attribute defaults
    for (const FSoftObjectPath& AttribDefaultTableName : GlobalAttributeSetDefaultsTableNames)
    {
        UE_LOG(LogTemp, Warning, TEXT("INIT: Starting loop for global curve tables.."))        // currently triggers, twice
        if (AttribDefaultTableName.IsValid())
        {
            UE_LOG(LogTemp, Warning, TEXT("INIT: Found valid table!"))        // currently triggers, twice
            if (AttribDefaultTableName.TryLoad())
            {
                UE_LOG(LogTemp, Warning, TEXT("INIT: Loaded something valid."))        // NOT triggering

            }
            UCurveTable* AttribTable = Cast<UCurveTable>(AttribDefaultTableName.TryLoad());
            if (AttribTable)
            {
                UE_LOG(LogTemp, Warning, TEXT("INIT: AttribTable found."))       // NOT triggering
                GlobalAttributeDefaultsTables.Add(AttribTable);
                bLoadedAnyDefaults = true;
            }
            else { UE_LOG(LogTemp, Warning, TEXT("INIT: Valid table failed to provide valid AttribTable...")) }    // currently triggers, twice
        }
    }

    if (bLoadedAnyDefaults)
    {
        UE_LOG(LogTemp, Warning, TEXT("INIT: Loaded any defaults!"))
        // Subscribe for reimports if in the editor
#if WITH_EDITOR
        if (GIsEditor && !RegisteredReimportCallback)
        {
//            GEditor->OnObjectReimported().AddUObject(this, &UAbilitySystemGlobals::OnTableReimported);
            RegisteredReimportCallback = true;
        }
#endif


        ReloadAttributeDefaults();
    }
    else { UE_LOG(LogTemp, Warning, TEXT("INIT: Failed to load any defaults...")) }            // currently triggers
}

With that code, when I turn on Play in Editor, my log has these messages:
LogTemp: Warning: INIT: Starting loop for global curve tables…
LogTemp: Warning: INIT: Found valid table!
LogTemp: Warning: INIT: Valid table failed to provide valid AttribTable…
LogTemp: Warning: INIT: Starting loop for global curve tables…
LogTemp: Warning: INIT: Found valid table!
LogTemp: Warning: INIT: Valid table failed to provide valid AttribTable…
LogTemp: Warning: INIT: Failed to load any defaults…

I am really stuck here. I’ve never messed around with loading assets from ini config files in UE4 before, and I’m really out of ideas on how to troubleshoot and fix this.

Any help would be immensely appreciated!!

One more update…

So, I solved that problem - the config path I needed was "/Game/GameplayAbilities/AttributeTables/BAS_Curve.BAS_Curve".
I also changed my table rows so that instead of Default.Health they are Default.Basic.Health (the "Basic" comes from the attribute set AttributeSet_Basic).
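For anyone following along, the matching config entry would look roughly like this - a sketch only, assuming the stock UAbilitySystemGlobals section name and the table path above; if you subclass AbilitySystemGlobals, the section name changes to your class:

```ini
[/Script/GameplayAbilities.AbilitySystemGlobals]
+GlobalAttributeSetDefaultsTableNames=/Game/GameplayAbilities/AttributeTables/BAS_Curve.BAS_Curve
```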

And I’ve messed around a thousand times more, and finally there is just one thing that is stopping me. It’s got the data on the table and everything, but for some reason I have a compile error with the AllocAttributeSetInitter() override.


void UMLSTAbilitySystemGlobals::AllocAttributeSetInitter()
{
    Super::AllocAttributeSetInitter();
}

If I do the Super, everything works, but there’s a problem because it’s using the AttributeSetInitterDiscreteLevels instead of my own initter (which is the whole point of the change!).

However, when I take the exact same code from AbilitySystemGlobals.cpp and put it into mine, as follows:


{
    GlobalAttributeSetInitter = TSharedPtr<FAttributeSetInitter>(new FAttributeSetInitterDiscreteLevels FAttributeSetInitterDiscreteLevels());
}

I get a compile error:
D:\Unreal Projects\Maelstrom\Source\Zangies\Abilities\MLSTAbilitySystemGlobals.cpp(342) : error C2146: syntax error: missing ')' before identifier 'FAttributeSetInitterDiscreteLevels'
D:\Unreal Projects\Maelstrom\Source\Zangies\Abilities\MLSTAbilitySystemGlobals.cpp(342) : error C2146: syntax error: missing ';' before identifier 'FAttributeSetInitterDiscreteLevels'
D:\Unreal Projects\Maelstrom\Source\Zangies\Abilities\MLSTAbilitySystemGlobals.cpp(342) : error C2059: syntax error: ')'

This happens whether I’m using the regular DiscreteLevels one or my own initter. I have added every possible #include I can think of, and I’ve tried forward declaring the struct (new struct FAttributeSet…), but it doesn’t change anything.

If I change it to this:


{
    GlobalAttributeSetInitter = TSharedPtr<FAttributeSetInitter>(new FAttributeSetInitterDiscreteLevels());
}

it compiles, but then the game crashes instantly on play.

Literally every inch of this process has not worked as it’s supposed to. >.<

Hey @SamPanda - I somehow managed to miss these posts. I know this is quite late, but if you haven’t solved your issue yet, let me know and we can take a look at it. In the meantime, I figured I’d share another useful function that we’ve been using internally on our project. The context is as follows:

  1. We have stat bars on our HUD that represent our player attributes
  2. We want these to interpolate smoothly
  3. This is difficult because there can be any number of gameplay effects modifying the attribute in question (I’ll use stamina as an example)
  4. If Stamina recovery ticks every 2s and gives the player 20 stamina, we need to interpolate at a speed of 10 stamina per second in order to stay in sync with the actual stamina value
  5. This is problematic, of course, because there can be a debuff that drains stamina, or really anything else that modifies the attribute

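The sync math in step 4 can be sketched as a tiny helper (a hedged sketch; ComputeInterpSpeed is an illustrative name, not an engine function):

```cpp
#include <cmath>

// Hedged sketch of the sync math from step 4: the displayed bar value must
// cover the remaining distance to the real attribute value in exactly the
// time left until the next update, so it lands in sync when the tick hits.
float ComputeInterpSpeed(float DisplayedValue, float RealValue, float TimeToNextChange)
{
    if (TimeToNextChange <= 0.f)
    {
        return 0.f; // no pending change known; hold (or snap) instead
    }
    return std::fabs(RealValue - DisplayedValue) / TimeToNextChange;
}
```

For the stamina example above: a +20 tick due in 2 s means the bar interpolates at 10 stamina per second.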
In order to solve this problem we’ve implemented a function to figure out when the next stat update is going to occur and make sure our stat bar value interpolates at a speed that will ensure it matches the real attribute value at the time when the next update hits. The function to get the next update time of an attribute looks like this:



float UMyAbilitySystemComponent::K2_GetNextAttributeChangeTime(const FGameplayAttribute Attribute) const
{
    return GetNextAttributeChangeTime(Attribute);
}

float UMyAbilitySystemComponent::GetNextAttributeChangeTime(const FGameplayAttribute& Attribute) const
{
    float NextPeriod, Duration;
    FGameplayEffectQuery Query;
    Query.ModifyingAttribute = Attribute;

    if (GetActiveEffectsNextPeriodAndDuration(Query, NextPeriod, Duration))
    {
        return NextPeriod;
    }

    return -1.f;
}

bool UMyAbilitySystemComponent::GetActiveEffectsNextPeriodAndDuration(const FGameplayEffectQuery& Query, 
    float& NextPeriod, float& Duration) const
{
    const TArray<FActiveGameplayEffectHandle> ActiveEffects = GetActiveEffects(Query);

    bool bFoundSomething = false;
    float MinPeriod = TNumericLimits<float>::Max();
    float MaxEndTime = -1.f;

    UWorld* World = GetWorld();
    if (World == nullptr)
    {
        return false;
    }

    FTimerManager& WTM = World->GetTimerManager();

    for (const FActiveGameplayEffectHandle& Handle : ActiveEffects)
    {
        const FActiveGameplayEffect& Effect = *ActiveGameplayEffects.GetActiveGameplayEffect(Handle);
        if (!Query.Matches(Effect))
        {
            continue;
        }

        float ThisEndTime = Effect.GetEndTime();

        // GetTimerRemaining returns -1.f for an invalid handle, and
        // UGameplayEffect::INFINITE_DURATION is -1.f, so this branch means the effect has no period timer
        float ThisPeriod = WTM.GetTimerRemaining(Effect.PeriodHandle);
        if (ThisPeriod <= UGameplayEffect::INFINITE_DURATION)
        {
            // This effect has no period, check how long it has remaining
            float ThisTimeRemaining = Effect.GetTimeRemaining(World->GetTimeSeconds());
            if (ThisTimeRemaining <= UGameplayEffect::INFINITE_DURATION)
            {
                //It's neither period nor has a duration, not interested.
                continue;
            }

            bFoundSomething = true;
            MinPeriod = FMath::Min(ThisTimeRemaining, MinPeriod);
        }
        else
        {
            bFoundSomething = true;
            MinPeriod = FMath::Min(ThisPeriod, MinPeriod);
        }

        if (ThisEndTime > MaxEndTime)
        {
            MaxEndTime = ThisEndTime;
            Duration = Effect.GetDuration();
        }

    }

    NextPeriod = MinPeriod;

    return bFoundSomething;
}



Hope this helps someone!

Thanks for this guide, very helpful. I recently inherited a code base using this GameplayAbilities system so your analysis has been critical to my understanding. I recently updated to 4.22 and have run into an issue because the CurveTable API was updated to include FRealCurve and FSimpleCurve in addition to FRichCurve.

The engine is failing this assertion:


const TMap<FName, FRichCurve*>& GetRichCurveRowMap() const { check(CurveTableMode != ECurveTableMode::SimpleCurves); return *reinterpret_cast<const TMap<FName, FRichCurve*>*>(&RowMap); }

It seems my CurveTable assets generated from .csv are of ECurveTableMode Simple Curves, so the assertion fails. The game crashes in PIE because GetRichCurveRowMap() is being called somewhere in the engine code for GameplayAbilities. Any ideas how I might fix this?

Hey @kigbariom - glad that this helped. Yes, with the 4.22 update they have changed the API for querying curves. I just went back to their FAttributeSetInitterDiscreteLevels and airlifted the code from there. The updated code is below. I am going on memory here so if I miss some line that changed just take a look at the default initter or ask here.


// Iterating over all the curves and getting the row names
for (const TPair<FName, FRealCurve*>& CurveRow : CurTable->GetRowMap())
{
    FString RowName = CurveRow.Key.ToString();
    ...

    // Getting the last level, i.e. the maximum X axis value of the curve
    int32 LastLevel = Curve->GetKeyTime(Curve->GetLastKeyHandle());

    // Iterating over the keys of the curve
    for (auto KeyIter = Curve->GetKeyHandleIterator(); KeyIter; ++KeyIter)
    {
        const FKeyHandle& KeyHandle = *KeyIter;
        TPair<float, float> LevelValuePair = Curve->GetKeyTimeValuePair(KeyHandle);
        int32 Level = LevelValuePair.Key;
        float Value = LevelValuePair.Value;
        ...
    }
}


I think those are all of the changes they made.

Hello @DamirH !
First of all, thanks a lot for your great posts, I will probably use GameplayAbilities soon in my project. I had a very important question before that though, and you seem to be the best person to answer it, judging by your replies in the original thread: is this system suitable for BTA games like Bayonetta/Devil May Cry and the like?

I planned to do a combo system which heavily relies on Animation Notifications, kinda like what Platinum Games did for Nier Automata, using some flags and such, and I was wondering if this approach was compatible with GameplayAbilities.

Well, considering we’re doing something not unlike that with our own game: yes, it can be adapted to this. The system is very generic and you will have to define A LOT of game-specific stuff, but it can work.

Thanks a lot ! Good luck for your project :slight_smile:

Our project requires each FGameplayAttributeData within an AttributeSet to update its current value based on a point within a CurveFloat asset. This is so we can have custom curves for every attribute on our characters; the InTime for the CurveFloat lookup will be the character’s “level”.

It seems we need to create our own version of FGameplayAttributeData that returns CurrentValue based on a lookup into a float curve defined on the character, likely in a data table, and possibly create our own FAttributeSetInitter as well for the default value lookup. Has anyone attempted something like this before, or have any advice? I’m a bit concerned about the performance of this, since the “level” will be constantly changing, and I noticed the CurveTable is preloaded in the default system.

You can write an initter any way you want, including reading an analogue value rather than a set of discrete values. Mind you, if you want to use something other than curve data tables, you will have to override UAbilitySystemGlobals and change what it loads into the initter. However, depending on how often you actually need to run your growth function, it will probably cause performance issues. You could set up a timer where, if your character has grown, you run the initter every couple of seconds or whatever value works for your growth speed.

Alternatively, the initter is only as complex as its InitAttributeSetDefaults function. If you make that optimized it should work just fine for you. Of course you don’t even need to use the pre-existing InitAttributeSetDefaults override and could create your own UpdateAttributeValuesBasedOnGrowth and then do the bare-minimum operations there to update your values (rather than the full group lookup etc.).

Whatever you do, don’t do it on tick - use a timer for it, even if that timer is 0.2 seconds.
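In UE4 you’d normally do this with FTimerManager::SetTimer; the equivalent throttling logic can be sketched standalone like this (a hedged sketch - the struct and its names are illustrative, not engine types):

```cpp
// Hedged sketch of throttling the growth update: instead of running it every
// tick, accumulate elapsed time and only fire once the interval has passed.
struct FGrowthUpdateThrottle
{
    float Interval = 0.2f;   // seconds between growth updates
    float Accumulated = 0.f; // time elapsed since the last update

    // Returns true when enough time has passed that the update should run.
    bool Tick(float DeltaSeconds)
    {
        Accumulated += DeltaSeconds;
        if (Accumulated >= Interval)
        {
            Accumulated = 0.f;
            return true;
        }
        return false;
    }
};
```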

This thread is indeed the most helpful one, providing details and thoughts behind the code.
Really hope it grows into a real “series”, including chained basic attacks, spells, gameplay tags, cues, etc…

Also, I think people using GAS are using online subsystems in most cases, so experiences with creating online battle games would be appreciated too.

Thanks for the compliment. I can definitely do a writeup on our process of how we actually translate input to attacks and how we chain attacks, but there definitely won’t be anything about online stuff because we strictly do single-player games.

I’ve just started looking at this thread and implemented my own FAttributeSetInitter, but I am also struggling to understand the issue with regard to the POD check.

My AttributeSet is using FGameplayAttributeData rather than a float. I essentially copied the FAttributeSetInitter code without that check, and it appears (on face value) to work OK?

I’m very much a novice so I’m not entirely sure what the issue is here… sorry if this should be obvious, but I’m just not getting it.

Can anyone elaborate as to why that check exists and what the ramifications are with removing it if your Attributes are FGameplayAttributeData rather than a float?