For our game, we’re using a custom virtual joystick on a touchscreen. What I want to do is expose the inputs for this joystick as action and axis bindings. This would allow our designers to grab them in Blueprint and use them to code the logic for our player avatar. It would also decouple the input source from the gameplay logic, which means we could trigger the input using other methods in the future.
We’re not using the default virtual joystick because we want to support features such as dodging on swipe and activating combos with multiple presses.
Unfortunately, looking through the documentation and the code, there doesn’t seem to be a way for me to implement this without heavily modifying the input systems.
It’s currently very easy to bind a digital or axis action to an input device source (e.g. a key press or a gamepad joystick axis), but impossible to trigger the same action from code.
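For reference, this is roughly what the standard binding setup looks like in SetupInputComponent (the action and axis names, and the controller class, are placeholders from my project):

```cpp
// Typical binding setup in a player controller. "Jump" and "MoveForward"
// are project-defined names from Project Settings > Input; they are
// placeholders here.
void AMyPlayerController::SetupInputComponent()
{
    Super::SetupInputComponent();

    InputComponent->BindAction("Jump", IE_Pressed, this, &AMyPlayerController::OnJumpPressed);
    InputComponent->BindAxis("MoveForward", this, &AMyPlayerController::OnMoveForward);
}
```

These delegates only fire when the engine maps a hardware key or axis event to the binding; as far as I can tell there is no public entry point to invoke them from game code.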
I have two potential solutions to my problem.
The first is that I add Blueprint events to my player controller and disregard the actual action binding, like so:
UFUNCTION(BlueprintImplementableEvent)
void ActionPlayerMovement(const FVector2D& Velocity);
Our designers would then have to reference the player controller in Blueprint and use these events for input processing. This is not ideal, because it sidesteps the action binding system, which would make it harder for us to trigger the same inputs through different methods later in the project.
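For completeness, our joystick widget would then forward its value through the owning player controller, roughly like this (UMyVirtualJoystick and AMyPlayerController are our own classes, not engine types):

```cpp
// Inside our custom joystick widget: push the stick vector to the
// Blueprint-implementable event on the owning player controller.
void UMyVirtualJoystick::OnStickMoved(const FVector2D& StickVector)
{
    if (AMyPlayerController* PC = Cast<AMyPlayerController>(GetOwningPlayer()))
    {
        PC->ActionPlayerMovement(StickVector);
    }
}
```

This works, but the coupling between the widget and a concrete controller class is exactly the kind of dependency I’m trying to avoid.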
The second option is to modify the UPlayerInput class to allow action bindings to be triggered from C++. In UPlayerInput::ProcessInputStack, the input component stack is traversed, action bindings are converted to key chords, and those chords are checked:
// Walk the stack, top to bottom
for ( ; StackIndex >= 0; --StackIndex)
{
    UInputComponent* const IC = InputComponentStack[StackIndex];
    if (IC)
    {
        check(!KeysToConsume.Num() && !FoundChords.Num() && !EventIndices.Num());

        for (int32 ActionIndex=0; ActionIndex<IC->GetNumActionBindings(); ++ActionIndex)
        {
            GetChordsForAction(IC->GetActionBinding(ActionIndex), bGamePaused, FoundChords, KeysToConsume);
        }

        for (int32 KeyIndex=0; KeyIndex<IC->KeyBindings.Num(); ++KeyIndex)
        {
            GetChordForKey(IC->KeyBindings[KeyIndex], bGamePaused, FoundChords, KeysToConsume);
        }

        FoundChords.Sort(FDelegateDispatchDetailsSorter());
        // snip
    }
}

// snip

// Dispatch the delegates in the order they occurred
NonAxisDelegates.Sort(FDelegateDispatchDetailsSorter());
for (const FDelegateDispatchDetails& Details : NonAxisDelegates)
{
    if (Details.ActionDelegate.IsBound())
    {
        Details.ActionDelegate.Execute(Details.Chord.Key);
    }
    else if (Details.TouchDelegate.IsBound())
    {
        Details.TouchDelegate.Execute(Details.FingerIndex, Details.TouchLocation);
    }
    else if (Details.GestureDelegate.IsBound())
    {
        Details.GestureDelegate.Execute(Details.GestureValue);
    }
}
Because the APlayerController is available at this point, I could create a new virtual method that receives the action or axis binding and outputs a value. This would allow me to implement a custom action handler in our game’s player controller.
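As a rough sketch, the hook I have in mind would look something like this (the method name and signature are mine; nothing like this exists in the stock engine):

```cpp
// Hypothetical virtual hook on APlayerController, called from
// UPlayerInput::ProcessInputStack for each action binding before the
// key-chord lookup. A game-specific controller could override it to
// supply values from a non-hardware source such as our virtual joystick.
virtual bool HandleCustomActionBinding(const FName ActionName, bool& bOutPressed)
{
    // Default: not handled; fall back to normal key-chord processing.
    return false;
}
```

The engine change itself would be small, but it means maintaining a fork of UPlayerInput across engine updates, which is why I’d prefer a supported mechanism if one exists.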
Am I missing something? Is there an easier way to get what I want?