Unreal Engine Diaries is an episodic series of articles that I started to note down the new concepts that I came across while working on Unreal Engine projects. They’re usually beginner to intermediate level topics and mainly cover the blueprint side of the engine.
Unreal Engine Diaries #1
Basic workflow for creating splines: Create a spline component in your blueprint >> Set its world points by passing in the vector data [Set Spline Points] >> Add a spline mesh component >> Set static mesh >> Set material for the mesh >> Get the location (and tangents, if necessary) at the stored spline points [Get Location and Tangent at Spline Point] and use this data to set the start/end points as well as tangent vectors for the spline mesh. [Do a for loop with the spline points data if you have more than two points]
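The "for loop" step above can be sketched in plain Python (the names here are illustrative, not actual engine API): each pair of consecutive spline points defines one spline mesh segment, so N stored points yield N-1 segments.

```python
def build_segments(points, tangents):
    """Pair consecutive spline points into (start, start_tangent, end, end_tangent) tuples."""
    segments = []
    for i in range(len(points) - 1):  # N points -> N-1 spline mesh segments
        segments.append((points[i], tangents[i], points[i + 1], tangents[i + 1]))
    return segments

# Three points produce two segments, matching the loop step above.
pts = [(0, 0, 0), (100, 0, 0), (100, 100, 0)]
tans = [(100, 0, 0), (100, 100, 0), (0, 100, 0)]
print(len(build_segments(pts, tans)))  # 2
```

In the blueprint, each tuple would feed one spline mesh component's ‘Set Start and End’ inputs.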
Select a group of blueprint nodes, right click and select ‘Collapse to function’ to have the editor automatically create a function encompassing those nodes.
It is possible to increase or decrease the rate at which an animation plays by adjusting the Rate Scale in the details panel of the anim sequence.
Blackboard values can be cleared from within blueprints using the ‘Clear Value’ function. This is especially useful when dealing with vector keys: setting a vector key to zero instead is not ideal, as a zero vector still represents an actual location in the game space.
When calling the function ‘Get Random Point in Navigable Radius’, make sure that you store the vector return value inside a variable if you intend to use it multiple times later down the line. Otherwise, each use creates a different randomized vector even though you're reading from the same return pin. It's kind of obvious, since those are separate function calls, but in the heat of the moment it's easy to overlook small things like these.
Unreal Engine Diaries #2
Useful Material Editor Hotkeys [Press these keys & click anywhere on the mat editor]: B = Bump Offset; E = Power; I = If condition; O = OneMinus; P = Panner; S = Scalar Parameter; U = Texture Coordinates; V = Vector Parameter
If you change any of the member names of a Struct in the Content Browser, the connections to those members in all blueprints will get cut. As a result, you'll have to go through each of these blueprints and manually reconnect them. So it helps to note down where you need to reconnect before making changes to a Struct.
Also note that in v4.8 of Unreal Engine, these renamed members will have their input pins invisible the next time you open those structs in your blueprints. In order to make them visible again, click on the struct node >> go to the details panel >> tick the checkbox next to the changed element to see it again. You'll however have to do this for every instance of the said struct.
Useful Material Nodes:
- The ‘Radial Gradient Exponential’ node can be used to create a circular gradient.
- The ‘Particle Color’ node provides data about the current color of a particle.
- The ‘Depth Fade’ node can be used in materials of particle emitters like fire to have them fade away smoothly when they come in contact with meshes. When connected to the opacity input, this node helps remove the hard edges that would otherwise be evident where a mesh obstructs the emitter.
An Unlit material basically means that we are only dealing with the Emissive and Opacity inputs.
Unreal Engine Diaries #3
The world rotation of actors in a level, when broken down into x, y, z components, lies within the (-180, 180] range rather than (0, 360). When doing calculations based on actor rotation, this wrap from positive to negative values has to be taken into account; treating it like a normal repeating cycle of (0, 360) can yield strange results in certain types of calculations.
When aborting a subtree in Behavior Trees, any task within that subtree that is actively running at that moment will not be aborted midway. It will run to completion, and if the task has delay conditions or code that changes important data, this could lead to unexpected results if not taken care of. However, it is possible to shut down these tasks at runtime through conditional checks that gauge the current state of the game based on some external variable or blackboard values. Once we have determined that the task is to be aborted, we just call the ‘Finish Execute’ node to stop and get out of the task.
If we're displaying the game over screen as a widget that's added to the viewport while the game is running, make sure that the game is paused using the ‘Set Game Paused’ command. Not doing this means that the actors in the level continue to update in the background. Sometimes it's fine to have the enemy actors move around in the background, but even in those scenarios, it's good practice to make sure that any constantly updating game element that's part of the player character/controller is turned off. An easy example would be an actor in the level that responds to the mouse cursor: it might keep moving around the screen even while we're trying to click the restart button.
When creating a widget from within a task in a Behavior Tree, it's a good idea to make sure that it's not being constructed continuously. It's generally better to create widgets outside the task flow, within normal blueprints, but certain situations might demand widget construction from within Behavior Trees in order to use some of their functionalities that are not natively available in blueprints. In such situations, there is a way to handle UI changes from Behavior Tree tasks: just add a ‘Do Once’ node inside the task before calling the widget construction logic. This makes sure that subsequent iterations of the task don't create the widget unless explicitly specified.
In my project, I've used this only in an end-game scenario, as one of the conditions for it was handled from within a Behavior Tree task. It has since been replaced with a game pause call, which makes sure that the Behavior Tree stops executing altogether, but the ‘Do Once’ node might be useful in other situations where you can't pause the game.
Unreal Engine Diaries #4
The ‘Particle Position’ node in the material editor can be used to get the location of a particle in space.
The ‘Bind Event to ReceiveMoveCompleted’ in the AI controller can be used to automatically receive the status of a bot once it has completed its movement order. It has different result states like success, aborted, blocked, invalid, etc., and these can be used to have the AI respond to different situations with different behaviors. But if we have multiple events waiting on it from different places, like say from different tasks in a Behavior Tree, all these bound events will start executing, and that might not be a desirable outcome. To work around such a scenario, it's a good idea to have each of these events check the current state of the bot and proceed based on that result. This helps ensure that even though multiple events may be fired, only the one that is suitable for the current state of the AI will run to completion.
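The guard pattern described above can be sketched in plain Python (the state names and handler names are hypothetical, not engine API): several bound handlers may fire on move-completed, but each one first checks the bot's current state and bails out if it doesn't match.

```python
from enum import Enum

class BotState(Enum):
    PATROLLING = 1
    CHASING = 2

def on_patrol_move_completed(bot):
    if bot["state"] != BotState.PATROLLING:
        return None  # fired from the wrong context; ignore
    return "pick_next_patrol_point"

def on_chase_move_completed(bot):
    if bot["state"] != BotState.CHASING:
        return None  # fired from the wrong context; ignore
    return "attack_target"

bot = {"state": BotState.CHASING}
# Both bound events fire, but only the chase handler acts.
results = [on_patrol_move_completed(bot), on_chase_move_completed(bot)]
print(results)  # [None, 'attack_target']
```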
When using the ‘OnPerceptionUpdated’ event to detect potential targets, you may have noticed that it does not give any details regarding the location of the source of the stimuli. But there is actually a method to retrieve this data. Just loop through the event's array of updated actors and, for each actor, get its perception data [‘Get Actors Perception’ node], then break down its output ‘Info’ struct and loop through the ‘Last Sensed Stimuli’ array to get all the necessary details like stimulus location, age, tag, etc.
In the material editor, we can add the ‘Noise’ node to display noisy patterns on the surface of a mesh. However it’s very expensive and hence best used for prototyping or in rare scenarios where using an actual texture isn’t justified.
Press ‘Alt+C’ in the map editor to see the collision data for all the meshes.
In Unreal Engine, most of the physics calculations are done by the CPU.
Unreal Engine Diaries #5
The ‘profilegpu’ console command can be used to profile the GPU for a single frame.
The ‘r.xxxxx ?’ command can be used to get the tooltip for the particular rendering command passed as the parameter.
Shaders can get really expensive when using World Position Offset and Tessellation. And speaking of World Position Offset, it can be used to manipulate the vertices of a mesh from it’s material.
On mobile platforms, it’s better to use translucent materials when compared to masked materials.
If there are lots of skeletal meshes in a game, the ‘SkinnedMeshComp Tick’ can get expensive. Reducing the number of bones in the skeletons or the complexity of the anim blueprints can help improve performance in these scenarios. Also, if you do not need the animation to update when the skeletal mesh isn't visible, consider setting the ‘Mesh Component Update Flag’ in the details panel of the skeletal mesh component to ‘Only Tick Pose when Rendered’.
The ‘Smoothed Frame Rate’ option in the Project Settings [under the General Settings category] caps the frame rate without using VSync. While doing GPU profiling, it's good practice to test with Smoothed Frame Rate disabled.
Unreal Engine Diaries #6
In the editor, you can select actors from the ‘World Outliner’ panel, right click and select ‘Move To’ >> ‘Create New Folder’ to group your actors into folders.
The ‘Project World to Screen’ function can be used to check if any point in world space lies within the screen space of a player's camera view. Just pass in the said world location and the player controller reference as input parameters, and you get the corresponding screen position data as the output. Break this struct into its x and y values, then use the ‘Get Viewport Size’ node to get the actual screen bounds and check if the aforementioned screen position values lie between 0 and the screen bounds values that we just received from the viewport size. If both x and y values lie within this range, the point is within the visible screen space; else it's outside the camera view.
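The bounds test above amounts to a simple range check, sketched here in plain Python: `screen_pos` stands in for what ‘Project World to Screen’ would return, and `viewport_size` for the output of ‘Get Viewport Size’.

```python
def is_on_screen(screen_pos, viewport_size):
    """Return True if a projected screen position lies within the viewport bounds."""
    x, y = screen_pos
    w, h = viewport_size
    # Both coordinates must lie between 0 and the viewport dimensions.
    return 0 <= x <= w and 0 <= y <= h

print(is_on_screen((640, 360), (1280, 720)))  # True
print(is_on_screen((-10, 360), (1280, 720)))  # False
```

This is the check behind the off-screen optimization mentioned in the comments on this episode: skip expensive per-frame work for any actor whose projected position fails this test.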
When adding a vector to an actor's world space location to get a location near the actor, do not simply add the values that you'd want to move along the x, y and z directions; that only works for relative location calculations. What you see as the forward direction in the actor blueprint viewport may not be the same as the forward direction in the world. Instead, get the actor's forward, right and up vectors, multiply each by the required distance along that direction, and add/subtract these vectors from the actor's world space location.
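A minimal sketch of that math in plain Python, assuming rotation only around the z axis (yaw, in degrees); the function name is illustrative, not engine API. The forward and right vectors are derived from the actor's world rotation, then scaled and added to its world location.

```python
import math

def offset_from_actor(location, yaw_deg, forward_dist, right_dist):
    """World-space point at the given forward/right distances from an actor."""
    yaw = math.radians(yaw_deg)
    forward = (math.cos(yaw), math.sin(yaw), 0.0)   # actor's forward vector
    right = (-math.sin(yaw), math.cos(yaw), 0.0)    # forward rotated 90 degrees
    return tuple(location[i] + forward[i] * forward_dist + right[i] * right_dist
                 for i in range(3))

# An actor at the origin facing world +Y (yaw 90): "100 units forward"
# is +Y in world space, not +X as it would appear in the blueprint viewport.
x, y, z = offset_from_actor((0, 0, 0), 90, 100, 0)
print(round(x), round(y))  # 0 100
```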
The console commands ‘stat startfile’ and ‘stat stopfile’ can be used to record the performance stats of all gameplay logic that happens between the commands. On completion, the data is saved to a new file on disk. In order to retrieve this data, go to the ‘Window’ tab in the editor >> Developer Tools >> Session Frontend >> Profiler tab and click the ‘Load’ button. It'll take you to the folder where the file was saved. Open the most recent file in the folder to see a visual representation of the CPU performance stats [Game & Rendering threads] as a graph in the editor. Select any part of the graph where it's spiking to see all the game logic and rendering processes called within that timeframe, to get an idea of what's causing the performance drops in your project.
Unreal Engine Diaries #7
While working on the Top Down Stealth Toolkit, I noticed that sometimes the character animations that worked in PIE mode did not work in Standalone mode. One of the solutions that worked for me was to connect the ‘Event Blueprint Update Animation’ in all the child anim BPs to their parent update animation events.
To find the angle between two rotator variables, it is better not to use normal subtraction, as this can give odd results in certain scenarios owing to the fact that rotator values for actor and world rotation follow the (-180, 180] range. For example, adding two positive values can produce a negative output and vice versa. To work around this, the ‘Delta (Rotator)’ node can be used to get the correctly wrapped angular difference between the two rotators.
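Per axis, the wrapping described above can be sketched in plain Python (this mirrors the idea behind the ‘Delta (Rotator)’ node, not its exact implementation): the raw difference is wrapped back into the (-180, 180] range, so that 170 and -170 come out 20 degrees apart rather than 340.

```python
def delta_angle(a_deg, b_deg):
    """Signed angular difference a - b, wrapped into (-180, 180]."""
    d = (a_deg - b_deg) % 360.0
    return d - 360.0 if d > 180.0 else d

# Naive subtraction would give 340 and -340 here; the wrapped
# difference is 20 degrees either way, with the sign giving direction.
print(delta_angle(170, -170))   # -20.0
print(delta_angle(-170, 170))   # 20.0
```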
When working on Top Down games, the ‘Orient rotation to movement’ parameter in the character movement component of the player character can be unticked to have it face the direction of the mouse cursor instead of the movement direction.
The following method can be used to get the dimensions of a landscape actor:
- First create a reference to the landscape actor either through the level blueprint or using the ‘Get All Actors of Class’ function.
- Get the landscape actor reference and then use ‘Get Actor Bounds’ function to get the box extent.
- Break the box extent vector into its float values representing the magnitude on each axis and then multiply each by 2 in order to get the length, width and height of the landscape actor.
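The arithmetic in the last step, as plain Python: ‘Get Actor Bounds’ returns the half-size of the bounding box, so each component is doubled. The example extent values are illustrative, not from any particular landscape.

```python
def dimensions_from_extent(box_extent):
    """Full dimensions of an actor from its bounding-box half-extent."""
    ex, ey, ez = box_extent
    return (ex * 2, ey * 2, ez * 2)  # length, width, height

print(dimensions_from_extent((2016.0, 2016.0, 128.0)))  # (4032.0, 4032.0, 256.0)
```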
In the default First Person template, if we do a line trace towards the world space equivalent of the center of the screen, it can be seen that the impact location of the trace and the crosshair location on the screen are offset by a certain amount. This is because the logic used to draw the crosshair on the screen from the HUD class does not take the size of the crosshair texture into account. To rectify this and display the crosshair at the true center, subtract half the corresponding texture dimension from both the x and y positions before plugging them into the draw texture function. In the default case, that means subtracting 8 units from each. Doing so should make the trace hit locations match the crosshair location. [ExtendedFPSTemplate_PreciseAim ProjectFiles Link: https://github.com/Stormrage256/ExtendedFirstPersonTemplate–PreciseAiming]
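The correction as plain Python: the draw position is the texture's top-left corner, so centering a texture on a screen point means subtracting half its size. The 16x16 size matches the default case above, where half the dimension is 8 units.

```python
def centered_draw_position(screen_center, texture_size):
    """Top-left draw position that centers a texture on a screen point."""
    cx, cy = screen_center
    tw, th = texture_size
    return (cx - tw / 2, cy - th / 2)

# Centering a 16x16 crosshair on a 1280x720 screen center:
print(centered_draw_position((640, 360), (16, 16)))  # (632.0, 352.0)
```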
Unreal Engine Diaries #8
When adding new input parameters to a function that’s already being called multiple times throughout the project, it’s always better to immediately check every instance of the function call to make sure that the new input parameter is connected as required.
Drag & drop a variable from the variables list onto a get/set node of another variable to automatically replace the second variable with the first.
When attaching moving physics actors to the player character without physics handles, disable their gravity & set the linear/angular velocities of all of their components to zero in order to have them simulate physics & collision while moving.
Under default conditions, when a character changes its direction of movement, it instantaneously turns to face the new direction. To change this behavior and enable smooth rotation on direction changes, first go to the AI character blueprint >> Character Movement Component >> enable “Orient Rotation to Movement” & set the “Yaw” of the “Rotation Rate” based on how smooth the turning movement should be. Then, under the blueprint's default attributes, disable “Use Controller Rotation Yaw” and it should now have smoother turning movements.
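A rough Python sketch of what a limited “Rotation Rate” does per tick (this mirrors the behavior, not the engine's actual implementation): the yaw moves toward the movement direction, with each step clamped to rate × delta-time, instead of snapping instantly.

```python
def turn_toward(current_yaw, target_yaw, rate_deg_per_sec, dt):
    """One tick of smooth turning: step toward target yaw, clamped by rate."""
    diff = (target_yaw - current_yaw) % 360.0
    if diff > 180.0:
        diff -= 360.0  # take the shorter way around
    step = rate_deg_per_sec * dt
    if abs(diff) <= step:
        return target_yaw
    return current_yaw + (step if diff > 0 else -step)

# With a yaw rate of 180 deg/s, a 90-degree turn takes half a second.
yaw = 0.0
for _ in range(30):  # 30 ticks at ~60 fps
    yaw = turn_toward(yaw, 90.0, 180.0, 1.0 / 60.0)
print(round(yaw))  # 90
```

A lower Rotation Rate yaw value gives a wider, slower turn; a very high value approaches the default instantaneous snap.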
If you’re experiencing automatic brightness changes in your game, you can disable this effect by going to your viewing camera component >> Post process settings >> Auto Exposure >> Set min and max brightness to the same value.
Unreal Engine Diaries #9
‘Shift + F1’ can be used to gain mouse control and jump between the different game instance windows during multiplayer testing in the editor.
While working on VR, try to match the size of in game objects to their real life counterparts, as not doing so could make them stand out and reduce the immersion.
In the Material Editor details panel, turn on ‘Fully Rough’ [prevents reflection rendering pipelines from executing] & turn off ‘Light Map Directionality’ [both under the ‘Mobile’ category] to make materials that are less expensive to render. This is a pretty good option when dealing with far away objects in the level that do not require a lot of detail. Setting the Shading Model to ‘Unlit’ can also increase performance in instances where the additional detail is not required.
In PIE mode, press ‘Ctrl + Shift + .’ to bring up the GPU Profile. It would be a good idea to start looking for elements that cost more than a millisecond.
‘Switch has Authority’ can be used to determine who is executing the script: the server or the client.
Unreal Engine Diaries #10
To display the AI Perception range in the editor, go to Editor Preferences >> General >> Gameplay Debugger & tick the ‘Perception’ parameter. Also in post processing, set AA to ‘FXAA’ or none to display the debug lines better.
In the widget blueprint, select multiple widgets from the Hierarchy panel & then Right click >> ‘Wrap with’ to wrap the selected widgets within another widget like Canvas Panel, Border, etc.
Add ‘Rotating Movement’ component to actors to have them rotate automatically. The rotation rate for each axis can be set from the component. This could be used to create interactive objects like weapon pick ups in a shooter or effects like rotating coins in a side scrolling game.
Wrapping widgets with ‘Invalidation Boxes’ can increase performance, as their contents are cached and repainted only when invalidated rather than every frame like other widgets. This can be especially useful when there are lots of static UI elements that do not get updated at runtime.
The ‘Random unit vector in cone’ node can be used to get random line trace target locations for creating shotgun spread patterns.
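A Python sketch of the idea behind ‘Random unit vector in cone’ (the sampling scheme here is one common approach, not necessarily the engine's): pick a direction within a given half-angle of the +X aim axis by sampling a uniform azimuth and a polar angle over the spherical cap, then use each resulting unit vector as a line-trace direction for one pellet.

```python
import math
import random

def random_vector_in_cone(half_angle_deg, rng=random):
    """Random unit vector within half_angle_deg of the +X axis."""
    half_angle = math.radians(half_angle_deg)
    # Uniform over the spherical cap around +X.
    cos_theta = rng.uniform(math.cos(half_angle), 1.0)
    sin_theta = math.sqrt(1.0 - cos_theta * cos_theta)
    phi = rng.uniform(0.0, 2.0 * math.pi)
    return (cos_theta, sin_theta * math.cos(phi), sin_theta * math.sin(phi))

# Eight pellets of a shotgun blast, each within a 10-degree cone.
pellets = [random_vector_in_cone(10.0) for _ in range(8)]
assert all(v[0] >= math.cos(math.radians(10.0)) - 1e-9 for v in pellets)
```

In the blueprint, the equivalent would be feeding the camera's forward vector and a cone half-angle into the node, then line tracing from the muzzle along each returned direction.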
Unreal Engine Diaries #11
[This episode of Unreal Engine Diaries focuses primarily on nav mesh configuration, nav areas & configuration of agent types.
Source: Unreal Engine 4 AI Support Twitch Broadcast with Mieszko: https://www.youtube.com/watch?v=7LaazCv4rB0]
- The Recast NavMesh has a couple of attributes named ‘CellSize’ & ‘CellHeight’ under the ‘Generation’ section in its details panel. Together they determine the resolution of the nav mesh, & lowering these values creates more precise nav meshes. This can be especially useful when there are a lot of holes in the nav mesh due to the surface properties of the terrain. However, lowering them also makes the nav mesh calculations more expensive.
- If runtime nav mesh generation is enabled, it would be best to set the ‘Tile Size’ attribute of the Recast NavMesh to the minimum viable amount.
- The ‘Min Region Area’ parameter, which can also be found under the ‘Generation’ section of the Recast NavMesh, can be increased to get rid of small nav mesh islands that are isolated from the rest of the nav mesh regions.
- When moving around objects that can influence the nav mesh with runtime nav mesh generation enabled, it is far more efficient to mark the actors as dynamic obstacles (with custom nav area settings) than to move them around in their default state. For static meshes, this can be done from the static mesh editor: just tick the ‘Is Dynamic Obstacle?’ checkbox & set the nav area type from the dropdown right above it. Alternatively, the same can be set up from the collision components in actor classes.
- If you have tall characters in your game that are having issues with the nav mesh pathing, go to ‘Navigation System’ category under Project Settings. Add a new ‘Supported Agents’ & then try increasing the z value of ‘Default Query Extent’ of the new agent.
On the other hand, in order to create separate nav mesh pathing for bigger AI characters, add a new element to the ‘Supported Agents’ array & specify a custom agent radius, query extent, agent height, etc. Build paths again to see a new nav mesh recast in the Scene Outliner tab. The ‘Enable Drawing’ parameter under the Recast NavMesh can be ticked on to visualize the new nav mesh. It seems that the navigation system will automatically determine which agent type is suitable for a bot based on its collision capsule size.
Unreal Engine Diaries #12: Basics of Particle Systems [Part I]
[Disclaimer: Most of these points are based on the information provided in the official Unreal Engine tutorials on Particle Systems. It's basically a consolidated collection of notes about concepts that I found new or interesting from the video series, which can be found here: https://www.youtube.com/playlist?list=PLZlv_N0_O1gYDLyB3LVfjYIcbBe8NqR8t]
- A particle can be considered as a point in space that is being manipulated by a set of mathematical calculations.
- A Particle System refers to the asset that lies in the Content Browser. The Emitter actor is an object that resides in the scene and holds a reference to a particle system.
- The ‘Template’ property of an Emitter, under the ‘Particles’ section, determines which particle system is being referenced by the emitter.
- The Particle System Component is a special type of object that can be used as a modular part within actors. When an actor is spawned in the level, the components attached to it exist in the level as well. Similar to the Emitter actor, the components can also be set to hold a reference to a particle system using their ‘Template’ property.
- Within the Cascade particle system editor, ‘Emitters’ can be treated as individual aspects of the system that create different types of effects to make a combined whole that's a sum of its parts. The ‘Modules’ are basically modular components of an emitter that define a singular aspect of its behavior. Certain modules like the ‘Required’ & ‘Spawn’ modules are available by default on emitters & define their core attributes, while the others can be swapped in or out based on the requirements. Modules can also be moved & copied between different emitters in a particle system.
- Within a particle system, the modules execute from top to bottom, while the emitters are calculated from left to right.
[To be continued in Part II]
