
Input Action And Axis Mappings

In Unreal Engine 4 we wanted to make binding input events as easy as possible. To that end, we created Input Action and Axis Mappings. While it is certainly valid to bind keys directly to events, I hope I can convince you that using mappings will be the most flexible and convenient way to set up your input.

So what are Action and Axis Mappings?

Action and Axis Mappings provide a mechanism to conveniently map keys and axes to input behaviors by inserting a layer of indirection between the input behavior and the keys that invoke it. Action Mappings are for key presses and releases, while Axis Mappings allow for inputs that have a continuous range.

Why would I want to use a mapping instead of binding directly to the key?

Using input mappings gives you the ability to map multiple keys to the same behavior with a single binding. It also makes it easy to remap which keys invoke the behavior, both at a project level if you change your mind about default settings, and for a user in a key binding UI. Finally, using input mappings allows you to interpret input keys that aren’t an axis input as components of an axis (e.g. W/S for forward and back in typical FPS controls, mimicking a gamepad thumbstick axis, which has a range of [-1, 1]).

Alright, these sound great. How do I set them up?

In the Input section of Engine Project Settings you can see the list of existing mappings and create new ones.

https://docs.unrealengine.com/latest/images/Programming/Gameplay/Framework/Input/AxisMappings.jpg

Actions are pretty straightforward: give the action a name, add the keys you want mapped to the action, and specify which modifier keys need to be held when the key is pressed. Axis mappings are also reasonably straightforward. Instead of specifying any modifier keys, however, you specify a Scale. The Scale is a multiplier on the value of the key when summing up the Axis’ value. This is particularly useful for creating an axis out of keyboard keys (for example, W can represent pressing up on the gamepad stick while S represents pressing down).
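Under the hood these settings end up in your project’s Config/DefaultInput.ini. As a sketch of what the entries look like (the “Fire” and “MoveForward” names are just examples; your action and axis names will differ):

```ini
[/Script/Engine.InputSettings]
+ActionMappings=(ActionName="Fire",Key=LeftMouseButton,bShift=False,bCtrl=False,bAlt=False,bCmd=False)
+AxisMappings=(AxisName="MoveForward",Key=W,Scale=1.0)
+AxisMappings=(AxisName="MoveForward",Key=S,Scale=-1.0)
+AxisMappings=(AxisName="MoveForward",Key=Gamepad_LeftY,Scale=1.0)
```

Note how W and S both feed the MoveForward axis with opposite Scales, which is exactly the keyboard-as-thumbstick case described above.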

Now that I’ve defined some mappings, how do I use these things?

Mappings can be bound to behaviors from both Blueprints and C++.

In C++ you will most typically set up your bindings in your Pawn/Character’s SetupPlayerInputComponent function or your PlayerController’s SetupInputComponent function; however, anywhere you have an InputComponent is valid. The bindings are formed by calling BindAction/BindAxis on the InputComponent.

InputComponent->BindAxis("MoveForward", this, &ASampleCharacter::MoveForward);
InputComponent->BindAction("Fire", IE_Pressed, this, &ASampleCharacter::OnBeginFire);
InputComponent->BindAction("Fire", IE_Released, this, &ASampleCharacter::OnEndFire);
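The bound functions referenced above might look like this. This is only a sketch of engine code (it won’t compile standalone); ASampleCharacter comes from the snippet, and the bodies are assumptions:

```cpp
// Axis handler: receives the summed axis value for this frame.
void ASampleCharacter::MoveForward(float Value)
{
    if (Controller && Value != 0.0f)
    {
        // Feed the axis value into the character movement system.
        AddMovementInput(GetActorForwardVector(), Value);
    }
}

// Action handlers: no parameters; the event type was chosen in BindAction.
void ASampleCharacter::OnBeginFire() { /* start firing */ }
void ASampleCharacter::OnEndFire()   { /* stop firing */ }
```

Note that axis handlers take a float while action handlers take no parameters; the pressed/released distinction was already made by the IE_Pressed/IE_Released argument to BindAction.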

In Blueprints you can place an Axis or Action Event node from the Input section of the context menu or palette of any Actor blueprint.


In both C++ and Blueprints, Axis events will fire every frame, passing the current value of the Axis, while Action events will have their Pressed and Released outputs fire as the key(s) are pressed and released.

An Axis’ value is the sum of the values of each key’s state in that frame. So in the MoveForward case pictured above, if you have only W held down, the Axis’ value is 1, but if you have both W and S held down, the Axis’ value is 0. It should also be noted that if you have both W and Up pressed, then the value is 2, so you will likely want to clamp the value in the bound function.
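The summing-and-clamping behavior can be illustrated with a small standalone sketch (plain C++, not engine code; KeyContribution and the function names are invented for the illustration):

```cpp
#include <algorithm>
#include <vector>

// Each mapped key contributes its state multiplied by its Scale.
struct KeyContribution {
    float State; // 1.0f if a digital key is held, or the raw analog value
    float Scale; // the Scale set in the axis mapping (e.g. W = 1.0, S = -1.0)
};

// The axis value is simply the sum of all contributions this frame.
float SumAxis(const std::vector<KeyContribution>& Keys) {
    float Value = 0.0f;
    for (const KeyContribution& K : Keys)
        Value += K.State * K.Scale;
    return Value;
}

// Clamping in the bound function guards against e.g. W and Up both held.
float ClampedAxis(const std::vector<KeyContribution>& Keys) {
    return std::clamp(SumAxis(Keys), -1.0f, 1.0f);
}
```

With W held the sum is 1; with W and S it is 0; with W and Up it is 2, which the clamped version caps at 1.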

Actions that are bound only to a pressed or released event will fire every time any key that is mapped to it is pressed/released. However, in the case of Paired Actions (actions that have both a pressed and a released function bound to them) we consider the first key to be pressed to have captured the action. Once a key has captured the action the other bound keys’ press and release events will be ignored until the capturing key has been released.
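The capture rule for paired actions can be modeled with a toy standalone class (plain C++, not engine code; PairedAction and the key names are invented for the sketch):

```cpp
#include <string>

// Toy model of the paired-action rule described above: the first key pressed
// captures the action, and other keys' presses and releases are ignored
// until the capturing key is released.
class PairedAction {
public:
    // Returns true if the press event fires.
    bool Press(const std::string& Key) {
        if (!CapturingKey.empty())
            return false; // another key has already captured the action
        CapturingKey = Key;
        return true;
    }

    // Returns true if the release event fires.
    bool Release(const std::string& Key) {
        if (Key != CapturingKey)
            return false; // only the capturing key may fire the release
        CapturingKey.clear();
        return true;
    }

private:
    std::string CapturingKey; // empty when no key holds the capture
};
```

So if Fire is mapped to both the left mouse button and a gamepad trigger, holding the mouse button and then squeezing the trigger fires nothing extra; the trigger becomes usable again only after the mouse button is released.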

Is that all?

There are a lot of other important input concepts (some of which are covered in the input documentation) such as the input stack, which Actors have input enabled by default and how to enable input for other Actors, and how input consumption works, but we’ll leave diving in to those for another post.

I hope I’ve convinced you that using Action and Axis Mappings will be the best way to set up input in your project, but if not, that’s fine! You can always bind directly to Keys if that’s easier for you and convert to using Actions and Axes when they provide value for you.

If you have any questions feel free to ask below - we’re always happy to help!

Didn’t know that bit, getting back to take care of it right away. Thanks Marc.

Thanks for the tutorial, Marc!

Can you provide some more details on how to implement dynamic remapping? I.e., after the user, using a proper UI, has decided that (for example) the “Fire” action must use the “Ctrl” key instead of the default “Spacebar” (which I previously set in the Project Settings), how can I implement it? Looking at the documentation, it seems that I must “override” the default “PlayerInput Mapping” system component; is that right? How can this be done?

Thanks!

Thanks for the tutorial! Can you also provide details on how to map Vector axes (Tilt, RotationRate, etc.)? I have looked everywhere I could think of, but I didn’t find any info on using them.

Can we get the correct button mappings for Fire TV in a future release? I was getting overwhelmed with the source code trying to do it myself.

Heh, two days ago I spent the entire day wondering why my character input code did not work. Turned out I hadn’t set up the key bindings in the Project Settings. Wish I had seen this earlier!

There are basically two approaches (both only available through C++) to doing the remapping.

One is to modify the InputSettings directly. Something along the lines of GetMutableDefault&lt;UInputSettings&gt;() and then calling AddActionMapping/RemoveActionMapping (or the Axis equivalents) as appropriate, calling SaveKeyMappings, and then finally calling ForceRebuildingKeyMaps on the PlayerInput to apply the settings. This has the advantage of persisting fairly easily across sessions, but has the downside of modifying the Class Default Object directly (something some people find an iffy proposition), and also that it applies globally to all players, so if your game wanted to support multiple profiles or local multiplayer with different bindings, that wouldn’t be possible.
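A rough sketch of that first approach in C++ (engine code, so it can’t run standalone; the “Fire” action name and the wrapper function are invented for illustration):

```cpp
#include "GameFramework/InputSettings.h"
#include "GameFramework/PlayerInput.h"

// Hypothetical helper: swap the key bound to the "Fire" action and apply it.
void RebindFireKey(APlayerController* PC, FKey OldKey, FKey NewKey)
{
    // Mutable access to the InputSettings Class Default Object.
    UInputSettings* Settings = GetMutableDefault<UInputSettings>();

    Settings->RemoveActionMapping(FInputActionKeyMapping(TEXT("Fire"), OldKey));
    Settings->AddActionMapping(FInputActionKeyMapping(TEXT("Fire"), NewKey));
    Settings->SaveKeyMappings(); // persist to the user's Input.ini

    if (PC && PC->PlayerInput)
    {
        // Rebuild the PlayerInput's key maps from the updated defaults.
        PC->PlayerInput->ForceRebuildingKeyMaps(true);
    }
}
```

Because this edits the CDO, it carries the global-to-all-players caveat described above.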

The other approach is the one that Fortnite is using. In this case they save, in a custom profile file, the actions and axes that the player has changed. Then InitInputSystem is overridden in the FortPlayerController class; it goes to the PlayerInput object and, for each changed binding, removes the default and adds the user’s chosen key for the binding.

Incorporating a key binding example into one of the sample games (probably Shooter) is definitely on the list of things we’d like to do, but I can’t say when it might happen, and I’d definitely like to polish up this interface in the future. Hopefully that helps you build a solution in the meantime.

The Vector axes aren’t currently exposed as events, or to the get-value functions, the way the Float axes are, unfortunately. I have a task assigned to me to do so; I just need to get some time set aside for it. At the moment your only option is to use the Get Input Vector Key State function on Player Controller with the Tilt, Rotation Rate, etc. keys.
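For reference, a sketch of what that polling looks like in C++ (engine code only; PC is an assumed APlayerController pointer):

```cpp
// No Vector axis events exist yet, so poll the vector key state each frame.
const FVector TiltValue = PC->GetInputVectorKeyState(EKeys::Tilt);
const FVector RotationRateValue = PC->GetInputVectorKeyState(EKeys::RotationRate);
```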

I’m not sure what our current plans are towards explicit FireTV support.

Hi Marc! Those are the details that I was missing. Thank you!

I’m having an issue with input right now.

I’m using the Top Down Blueprint Template as a base.

Inputs are only being executed in the PlayerController blueprint that came with the template.

When I assign a MouseWheelUp event, for example, on a Character Blueprint that I have set up as the Default Pawn (and which is controllable via the PlayerController), it isn’t reading the input from the Character, only from the PlayerController.

Is there something I’m missing or not checking?

In the end I’m trying to get MouseWheel to adjust the arm attached to the Character, but I can’t seem to access that info from the PlayerController, so that’s why I’m trying to handle it by placing the MouseWheel events in the Character blueprint.

Pawn has priority over PlayerController, so even if you have the same key defined in both, the one in the Character will get called, as long as it can receive input, of course.

I can’t be more specific about your problem without seeing some screenshots and maybe part of your setup, but an alternative is to put the input event in your PlayerController and, from there, call a custom event in the Pawn whenever the input event fires.

As a content creator I’ll pick up the challenge of supplying some feedback.

First, I’m amazed at how easy it is to do one thing or the other, as in building an environment and then being able to interact within that environment. Where I feel things are a bit weak is in the space between the two, depending on the primary interest of the individuals involved in a given project.

A coder, for example, would take to Blueprints like a duck to water, yet someone like me, as a content creator, has some problems with how connectivity between the art assets they are working on can be achieved with little effort, even when trying to do something as simple as polishing the required animations.

Key bindings, for example, are one of those frustrating things, as they really don’t have any kind of connectivity between what you see, what you are working on, and what has to be done. That disconnects work in progress when migrating, let’s say as an example, a player model from one project to another with the expectation that it will continue to work.

At the moment a lot of work needs to be done on the assumption that the result will be fit-to-finish work, when there is a need to fill this gap for something as simple as a proof of concept of how interaction should take place between content and code, based on the vision of the animator.

Just a couple of suggestions as to what I would find useful (and it may already be available): the addition of a few direct tools to quickly make something work for the purpose of testing, so that a) you can prove that it does work, and b) you can show someone else that this is the way it should work.

What would be helpful as well is some kind of schematic view, as there is in most 3D applications, showing the relationship of a given asset within the scene or browser, and where that asset may be disconnected from practical use by design. For example, it took a while to realize that to spawn a player model you have to include it in the game type.

In this case what would be nice is some kind of visual component, a widget, that can be placed within the scene, which overrides all “game” settings and will spawn or introduce components like player models, just for the purpose of simple testing, without having to take on the full requirements of code or Blueprint construction.

The long way around, I guess, to just say that sometimes keeping things simple can be just as useful.

Please, the ability to easily let players remap input keys in-game should be built into the game engine/editor.
Please, Epic, address this. It’s a core feature; before reading this topic I was so sure UE4 had it. I can’t believe there is no default tool for this in the editor.

I used method 1, but why doesn’t it use my remapped key in the current game session? It only takes effect the next time I restart the game.

Here is my code:


// Use the mutable CDO accessor instead of casting away const.
UInputSettings* DefaultInputSettings = GetMutableDefault<UInputSettings>();

const FInputActionKeyMapping fInputActionKeyMapping(name, oldKey);
DefaultInputSettings->RemoveActionMapping(fInputActionKeyMapping);

DefaultInputSettings->SaveKeyMappings();

aShooterPlayerController->PlayerInput->ForceRebuildingKeyMaps();

I want to use action mappings, but I am having a problem: if I change the name of an action mapping, none of the nodes that have already been placed are updated, so I have to go in and find every instance and update it manually. Of course, this means there is a risk of missing an instance and breaking functionality, likely resulting in unnecessary debugging. It would be great if changes to Action Mapping names would trickle down. Is that something that could be implemented in a future update?

I was looking for two things when I found this thread…

  1. How to change the key mappings in-game? (Which, it seems, can’t be done from Blueprints, according to post #7.)
  2. How do we read what values are already assigned to a binding?

I am trying to make a configuration menu, and both of these things are required for the key binding part of the menu. Seems to me this should be pretty high on the Features-To-Add list, since pretty much all games have a configuration menu.

Are there plans to allow the bindings to be changed through Blueprints? And is my second question possible in Blueprints currently?

It’s a great concept that I’ve been using from the start. So handy to set up various controls like mouse/keyboard vs. gamepad.

Oh, thanks for mentioning that. Would have had a proper cheat on my hands. Is there a particular reason it works this way? I don’t think anyone would expect this when they see the range of -1.0 to 1.0 in the mappings and I can’t really think of a case where this would be useful.

Just to reply to this old thread: there doesn’t seem to be a card on Trello for being able to modify key bindings from Blueprint.
Hopefully this can get on Trello so people can vote for it, thanks.

Hi,

Kindly refer to my thread for dynamic remapping. Kindly let me know if you need further explanation and I will be happy to help.

Hello,

I found an issue with Axis Mappings; I guess this would be a good place to post?

When using an axis-mapped event, I realized it can only be placed in one part of a Blueprint; if two or more parts of a project share it, only one will ever fire.
This had me pulling my hair out for hours.

Now, I realize why this might be useful, but maybe there should be a way to determine which Blueprint gets priority?

Now the part that sucks is that the Get function for it also follows this logic, so if I have the event somewhere and I’m using Get [MapAxis] elsewhere, only one will work as well.

I hope my explanation wasn’t too bad, I can take some screens or video if need be.

Also, is there any way to get a global key-press and global controller-press mapping, please?

These would be so useful for PC games that allow both gamepads and keyboards; they would allow quick interface switching for hints and how-to-play screens.