Unreal Engine 4.16 includes exciting new rendering and animation features, significant performance improvements for mobile and console platforms, and tons of quality of life enhancements that will make it even easier to make stunning environments and engaging experiences that run smoothly on a wider variety of platforms.
Enhance the mood of your environments using the amazing new Volumetric Fog feature, which can be enabled to automatically render realistic fog and smoke effects with consistent lighting anywhere in a scene, even at a large scale.
Breathe life into your characters using new dynamic lightweight rigid body and low level cloth simulation tools! Take greater control of the flow of movement using Animation Modifiers, spline IK solver, updated Pose Driver, and many other improvements to the Animation system.
Garbage Collection is now twice as fast! UI rendering performance and UMG widget creation speed are vastly improved to enable you to create even more compelling interfaces. Interfaces and workflows for VR Mode, Animation, Sequencer, and other tools have been updated to make your development process more streamlined than ever before.
Support for Nintendo Switch is fully featured and ready for production in 4.16! Epic Games has teamed up with Nintendo to release the full UE4 source code for Nintendo Switch to approved developers for free. To learn more about how to get started, see the announcement on the Unreal Engine blog.
DirectX 12 is now the default renderer for Xbox One, bringing both performance and feature enhancements to platform support in the engine. In addition, you can now develop HTML5 games using WebAssembly and WebGL 2, and this new path will continue to improve in UE4.
For mobile, the Android virtual keyboard is now supported, and runtime permissions have been exposed to both Blueprint and code. Plus, we have made even more strides to reduce executable sizes for mobile apps!
In addition to hundreds of updates shipping from Epic, this release includes 160 improvements submitted by the incredible community of Unreal Engine developers on GitHub! Thanks to each of these contributors to Unreal Engine 4.16:
0lento, Akihiro Kayama (kayama-shift), Alice Robinson (Valkrysa), Altrue, Andreas Rønning (Sunjammer), Andrew Gaubatz (e-agaubatz), Angus Jones (crumblycake), Artem V. Navrotskiy (bozaro), Black Phoenix (PhoenixBlack), Cedric Neukirchen (eXifreXi), Cengiz Terzibas (yaakuro), Chris Varnz (chrisvarns), Christopher P. Yarger (cpyarger), Damian Nowakowski (zompi2), DarkSlot, DeanoC, Derek van Vliet (derekvanvliet), devbm, dodgyville, drelidan7, Gabriel Lima (Gabriel-Lima-O), Gyeonghwan (conquests), Hao Wang (haowang1013), Ilya (ill), Jackblue (JohnsonJackblue), James Horsley (mmdanggg2), Jeff Rous (JeffRous), Jon Watte (jwatte), Jørgen P. Tjernø (jorgenpt), jostster, Kalle Hämäläinen (kallehamalainen), katze7514, Kevin Kuegler (FrostByteGER), KrisRedbeard, looterz, Manmohan Bishnoi (manmohanbishnoi), Marat Radchenko (slonopotamus), Markyroson, Martin Treacy-Schwartz (the1schwartz), Matt Edmonds (cleaver404), Matthew Casey (mdcasey), Matthias (haimat), Matthias Hölzl (hoelzl), Matthias Huerbe (MatzeOGH), Michael Schoell (MichaelSchoell), Michał Siejak (Nadrin), Milan Šťastný (aknarts), Moritz Wundke (moritz-wundke), Mustafa TOP (MSTF), Narendra Umate (ardneran), Nathan Stocks (CleanCut), NaturalMotionTechnology, Nick Verenik (nverenik), Paul Murray (awesomeness872), pfontain, Phil Christensen (Rastaban), PrimalJohnScott, projectgheist, Rafael Ortis (rafortis), Rajko Stojadinovic (rajkosto), Rama (EverNewJoy), rhughesgeomerics, Ricardo Rodrigues (RicardoEPRodrigues), Robert Hagglund (hagglund), Robert Segal (robertfsegal), Ryan Pavlik (rpav), sangpan, Sanjay Nambiar (sanjay-nambiar), Satheesh (ryanjon2040), Sean Campbell (scampVR), Sebastian Axinte (ENiGMA9), Sébastien Rombauts (SRombauts), SiebenCorgie, Stefan Zimecki (stefanzimecki), StefanoProsperi, Stephen Johnson (megasjay), TaeYoung Cho (valval88), Timothee Besset (TTimo), Timothy Hagberg (thagberg), Tom Kneiphof (tomix1024), Tom Ward (tomwardio), TRS-justing, unwitherer, Vladimir (VladimirPobedinskiy), Vladimir 
Alyamkin (ufna), wyhily2010, Yaroslav Shmelev (SoulSharer), yeonseok-yi
**Major Features**
New: Volumetric Fog
Create incredible ambience and mood in your environments using the new Volumetric Fog! Varying densities are supported so you can simulate clouds of dust or smoke flowing through light shafts, and any number of lights can affect the Volumetric Fog.
Volumetric Fog supports lighting from:
- A single Directional Light, with shadowing from Cascaded Shadow Maps or static shadowing, with a Light Function
- Any number of point and spot lights, with dynamic or static shadowing if ‘Cast Volumetric Shadow’ is enabled
- A single Skylight, with shadowing from Distance Field Ambient Occlusion if enabled
- Particle Lights, if ‘Volumetric Scattering Intensity’ is greater than 0
You can use Materials applied to Particle Systems to control Volumetric Fog with the new Volume Domain setting. A single particle with a Volume Material causes a sphere of density to be added to the Volumetric Fog. The effect is fully 3D with no billboards involved. Multiple spherical fog particles with noise from textures can be used to limit fog to a certain area.
For information on setting up Volumetric Fog, see the documentation.
**New: Image-Based (FFT) Convolution for Bloom**
Create physically-realistic bloom post-process effects using the new image-based (FFT) convolution feature! Unreal Engine 4.16 ships with an FFT Bloom that empowers artists to use custom bloom kernel shapes, with total control over the intensity, in order to match the results they imagine.
By using a mathematical convolution of the source image with a kernel image, this bloom technique can produce a continuum of responses ranging from star-like bursts to diffuse glowing regions. The additional realism generated by the image-based convolution is the result of its ability to use visually interesting, non-symmetric kernel images. It generally looks like a star-burst with radial streaks, but could include eyelash silhouettes, bokeh or other artifacts.
Note: Image-based convolution Bloom is designed for use in cinematics or on high-end hardware, while the pre-existing (standard) Bloom should be used for most game applications.
New: Distance Field Lighting Optimizations
Distance Field Ambient Occlusion and Ray Traced Distance Field Shadows are now 30-50% faster on current generation consoles and mid-spec PC! These features allow for more realistic ambient lighting and area shadows on dynamic meshes in your scene.
In addition, static mesh Distance Field Generation is 2.5 times faster, thanks to acceleration from Intel’s Embree ray tracing library. Memory usage is also significantly reduced when enabling the Eight Bit Mesh Distance Fields and Compress Mesh Distance Fields project settings.
New: Lightweight Rigid Body Simulation
Create hordes of physically-simulated characters with the new lightweight rigid body character simulation! You can now simulate a Physics Asset inside your Animation Blueprint using a new high-performance immediate mode PhysX API. Characters using this simulation can also generate collision with static geometry in the world.
New: Low-level Clothing Simulation
Gain more control over clothing simulations using the new low-level NVIDIA NvCloth clothing solver!
We have replaced the APEX clothing solver with a lower-level solution from NVIDIA called NvCloth. The new solver is similar to the core solver of the previous APEX solution, with a few slight behavior changes, and it provides better access to the simulation data and exposes extra parameters for inertia settings.
New: Release Games on Nintendo Switch!
Registered developers can now build and release games for the Nintendo Switch! Unreal Engine 4’s production-ready Nintendo Switch support is certification compliant, enables networked multiplayer, and provides access to multiple rendering pipelines - deferred, mobile forward, and clustered forward - to enable you to ship virtually any type of game for Nintendo Switch.
New: VR Mode UI and Interaction Updates
VR Mode in Unreal Editor has been overhauled to provide a more intuitive workflow and editing experience!
A new asymmetrical controller setup puts a new and improved Radial Menu on one hand and an interaction laser with improved precision on the other to make working with objects in your level quick and easy.
All VR Mode actions, including all major editor features and UI panels, are now accessed from the updated Radial Menu. Teleport has been updated so that you can instantly move to a location and resize to the default scale to see the player’s perspective as well. For more information, see https://docs.unrealengine.com/latest/INT/Engine/Editor/VR/GDC2017/
New: Edit Sequences in VR
The Sequencer cinematics editor is now available in VR! You can create a new sequence and move objects around your level - and in the process automatically create sequence keys for their transforms. By scrubbing through time and setting these keys, you can create cinematic sequences and play them back, all inside VR. You can also open existing Level Sequences and play them back, either from the Sequencer UI or from the Radial Menu.
- New! Adjustable keys allow you to physically adjust your trajectories in the world!
- The Scrub Time option in the Radial Menu takes thumbstick or touchpad input as the speed with which to play your Sequence backwards and forwards. Press the trigger again to exit Scrub Time mode.
New: Physics Simulation in VR Mode
You can now simulate physics Actors in VR Mode using the motion controllers to interact with objects! Place Actors set to simulate physics and let the physical simulation run to get a realistic scattering or to knock Actors around with the motion controllers.
New: Smart Snapping in VR Mode
Smart snapping uses the bounds of your object to align to other Actors in the scene, enabling you to exactly fit them together without needing to build modular assets with a grid in mind.
This feature is currently only available in VR Mode, but we’ll add support for desktop editing in a future release.
New: Xbox One Renders with DirectX 12
DirectX 12 is now the default renderer for Xbox One! We’ve made a number of stability and performance improvements to the DirectX 12 RHI, which allowed us to enable it as the default, bringing CPU and GPU performance improvements to titles developed for Xbox One.
Switching back to D3D11
Titles that need to switch back to D3D11 will need to do the following:
Set bBuildForD3D12 to false in your title’s DefaultEngine.ini
Rebuild your game for Xbox One
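As a minimal sketch, the change is a single flag in your title's DefaultEngine.ini (shown without a section header; put it in the section where bBuildForD3D12 already appears in your project):

```ini
; DefaultEngine.ini -- switch the Xbox One RHI back to D3D11
bBuildForD3D12=false
```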
Note: The D3D11 RHI will be deprecated in a future release.
New: HTML5 Support for WebAssembly and WebGL 2
Unreal Engine 4 now supports the new WebAssembly standard (also known as WASM) for HTML5, the fastest and most efficient way to compile and run C++ for the web! We are using Mozilla’s latest Emscripten toolchain (v1.37.9). This is a new technology and not supported on all browsers, so it is considered an Early Access feature, and requires GitHub access.
UE 4.16 also adds support for WebGL 2.0 which is based on OpenGL ES 3.0 and provides more optimal rendering performance, increased visual fidelity, and support for more rendering features, including:
- Most features of UE4’s high-end mobile feature level
- Instanced Geometry Drawing for particles and foliage
- Support for Multiple Render Targets (MRTs)
- Texture features such as 3D or volume textures, 2D array textures, and no more non-power-of-two texture restrictions
WASM and WebGL 2.0 are supported by Firefox 52 and Chrome 57 or later (64-bit recommended). Note there appears to be a bug in Chrome 58 on Windows that is causing out-of-memory errors in some cases. We are working with Google to get this issue resolved. Please see UE-44727 for the latest status on this issue.
You can enable WASM and WebGL 2.0 in the Emscripten section of the HTML5 Project Settings. If you require the broadest browser support possible, continue to use ASM.js and WebGL 1. Support for ASM.js and WebGL 1 will be deprecated in an upcoming engine release, and then removed afterwards (exact timing is dependent on additional browser support).
For a live demo, try out Zen Garden on HTML5 to see these benefits first-hand in your own Firefox or Chrome browser (supported versions listed above).
New: 2x Faster Garbage Collection
Garbage collection performance has been significantly improved and is now more than twice as fast! Specific improvements include:
- Reachability analysis multithreading has been redesigned to reduce the overhead of task management.
- Garbage Collection clustering now supports Blueprint-generated classes and selected Actor types.
- UObject unhashing code has been optimized to reduce the time spent destroying Actors.
New: Kinematic Bodies with Simulated Parents
We’ve added the ability to have kinematic physics bodies with simulated parents. You can now have child bones, such as a character’s hands, driven purely by animation data, while the parents of those bones are driven by physics simulation.
This enables cool effects, such as a character scaling a ledge while reacting to falling rocks colliding with its body!
New: Platform SDK Upgrades
In every release, we update the engine to support the latest SDK releases from platform partners.
- Visual Studio: Important: Visual Studio 2013 is no longer supported on Windows with this release. Please upgrade to either Visual Studio 2015 or Visual Studio 2017.
- Nintendo Switch: supports Nintendo SDK 1.3.1
- Xbox One: Built against the October 2016 QFE3 XDK
- PlayStation 4: Upgraded to PS4 SDK 4.508.001
- Android: Updated to CodeWorks for Android 1R6u1
- GoogleVR: Updated plugin to version 1.3
- GoogleVR: SDK updated to 1.40.0
- GoogleVR: Mode default changed to Daydream & Cardboard
- Vulkan: Updated SDK distributables and glslang
New: Sequencer Shot Track Enhancements
Shot Tracks in Sequencer gain several improvements for both cinematics and film creation!
- Hierarchical bias per shot: By default, tracks at a lower level in the level sequence hierarchy take precedence. This allows filmmakers to build a pipeline they’re accustomed to, where adjustments at the shot level override tracks in the sequence that contains them.
- Exposed “When Finished” property for all tracks: This gives you the ability to specify whether tracks should return values to their pre-animated state or keep them when the sequence finishes. In a film production environment, you would typically want animated values in a shot to return to their pre-animated state so that they don’t bleed into the next shot. In a cinematic, you might want the value to persist so that you can continue into the game from the Sequencer-animated state.
- Pre/post roll: Pre- and post-roll are now general concepts for all tracks. Some tracks have specific behaviors; for example, the Camera Cuts track notifies the streaming system of the upcoming camera cut position during the pre-roll evaluation period.
New: Animate Material Parameter Collections in Sequencer
You can now animate Material Parameter Collections in Sequencer, giving you total control over animating scalar and vector parameters, which can be referenced in any number of Materials. You no longer have to animate individual parameter values on each material instance in order to share animation.
New: Improved UI Rendering Performance
Games that use Invalidation Panels now have an option to cache only widget elements rather than render data, enabling them to benefit from much-improved texture batching and significantly reduced draw calls. The result is a big performance boost on mobile devices!
On the Battle Breakers hero selection UI (shown above), each hero’s logical elements are cached but can also be batched together. The console variable Slate.CacheRenderData=0 enables this mode, which is now the default on mobile devices.
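For reference, the console variable can also be set from a config file rather than the in-game console (a hedged sketch; the [SystemSettings] section is the standard place for console variables in DefaultEngine.ini):

```ini
; DefaultEngine.ini -- cache widget elements instead of render data
[SystemSettings]
Slate.CacheRenderData=0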
New: Improved Animation Pose Driver
We have made many improvements to the Pose Driver feature, which enables you to procedurally drive blend shapes or bones, by comparing the pose of a set of bones to a set of reference ‘targets’. This is particularly useful for areas like shoulders, where you may want to activate corrective morph targets depending on the pose of the upper arm and shoulder.
- You can now select multiple bones as ‘inputs’ to read a pose from
- You can now pick which bones should be modified by the node
- You can specify a ‘custom curve’ for how each target should be activated
- You can choose to drive curves (morphs, materials) directly, instead of needing a Pose Asset
- The UI is improved to allow creation and editing of target poses, bars showing target activation, and more
- Target locations in viewport can now be clicked to select them
New: Opacity and Opacity Mask for Material Flattening
We have added support for baking out opacity (mask) values when using the Actor Merge Tool or Hierarchical LOD system. The resulting (instanced) material uses your configured blend mode to ensure it follows the correct render path. Here’s an example of a baked out masked material:
New: Improved Mesh Paint Tool
The mesh painting system has been overhauled to improve usability and clarity, and to allow for reusing the functionality in other parts of the Editor.
Also, the painting tools can now be used on skeletal meshes! Note that painting is not per-instance (as with static meshes) but is applied directly to the skeletal mesh asset(s).
New: Spline IK Solver
A spline IK node that is useful for controlling character spines or bone chains has been added to Animation Blueprints!
New: Detect Material on Mesh Surfaces
We’ve added a new ‘Get Material From Face Index’ function for Components which enables you to retrieve the Material applied to a Component after performing a (complex) Line Trace. This is supported for Static Meshes, Procedural Mesh Components, and BSP.
New: ‘Look At’ Animation Node Improvements
The Look At Location property of a Look At node can now be used relative to a bone or socket. Previously this value was ignored when you specified a bone or socket.
The visualization for Look At controls is also improved. For example, you can see the clamp angles, target location, interpolation and so on.
New: Animation Export Improvements
We added support for creating and exporting animations that include additional animation data generated from a post-process graph assigned to the Skeletal Mesh, such as Anim Dynamics for physics simulation.
To include this additional data, choose Preview Mesh from the Create Animation or Export Animation menus.
New: Unreal Audio Engine (Early Access Preview)
The new Unreal Audio Engine announced at GDC is available in early access on PC, Mac, iOS, Android, and Switch. It includes a cross-platform audio mixer with full backwards-compatible support for the existing audio engine feature set, including a new multiplatform EQ and Reverb master effects. In addition, the new Unreal Audio Engine introduces new features such as a submix graph, submix effects, source effects, real-time synthesis, and better audio plugin support.
The new Unreal Audio Engine is not enabled by default in 4.16 as there is continued work on implementing backends for console platforms, Linux, and HTML5, as well as stability and performance improvements, especially on mobile platforms.
To enable the audio mixer, use the command line argument “-audiomixer”.
Note: Most new Unreal Audio Engine features are hidden if you launch the editor without the audio mixer enabled.
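For example, the editor can be launched with the audio mixer enabled like this (executable path and project name are illustrative):

```
UE4Editor.exe MyProject.uproject -audiomixer
```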
New: Synthesis Plugin (Early Access)
The new synthesis plugin contains two new real-time synthesizers written using the new Unreal Audio Engine’s “SynthComponent” class: a fully-featured subtractive synthesizer and a real-time granulator. These new synthesizers are not only useful tools for procedural music and sound design, but also serve as an example of how third-party plugin manufacturers and even sound designers might implement their own synthesizers.
The synthesis plugin also contains a host of new DSP source and submix effects for use with the new Unreal Audio Engine:
- Source Effects: Stereo Delay, Bit Crusher, Dynamics Processor, Envelope Follower, EQ Filter, Virtual Analog Filter (Ladder/State Variable), Wave Shaper, Chorus, Phaser
- Submix Effects: Reverb, EQ, Dynamics Processor
New: Steam Audio (Early Access)
Epic and Valve have teamed up to release the first fully-integrated implementation of the Steam Audio SDK, built on the capabilities of the new Unreal Audio Engine.
Steam Audio fundamentally integrates with the new Unreal Audio Engine’s spatialization, occlusion, and reverb systems to bring next-gen physics-based audio experiences to UE4 for VR. This is an early access version of Steam Audio with significant updates, more example projects, and workflow improvements planned for 4.17. Epic and Valve welcome any feedback, questions, or ideas for improvements.
See https://valvesoftware.github.io/steam-audio/ for more information, documentation, and support for Steam Audio.
New: Improved Color Grading Tool
The Color Grading user interface is improved to make it easier to use!
- A new HSV mode was added.
- You can now dynamically change the min/max value of the sliders depending on their type using Ctrl+Slider Drag.
- A new reset button was added to reset a whole color grading category (i.e., Global, Shadows, Midtones, Highlights).
New: Improved Animation Blend Space Editor
The Blend Space Editor now enables you to display Animation names for each sample using the Show Animation Names button inside of the grid. You can now also drag and drop animations on top of existing samples to replace them.
New: String Tables for Localization
UE4 now has support for localized String Tables!
String Tables provide a way to centralize your localized text into one (or several) known locations, and then reference the entries within a string table from other assets or code in a robust way that allows for easy re-use of localized text. String Tables can be defined in C++, loaded via CSV file, or created as an asset.
New: Animation Modifiers (Early Access Preview)
Animation Modifiers enable you to apply a sequence of actions to a given Animation Sequence or Skeleton, such as pin-pointing on which frame(s) the right foot is placed on the ground and adding Animation Sync Markers to the frames where the ball_r bone is at its lowest point (touching the floor).
A new set of functions to access specific animation data is available in the Animation Blueprint function library. Accessing and applying Animation Modifiers is done through a new tab found in the Skeleton Editor and Animation Editor. Animation Modifiers can be added to a Skeleton or an Animation Sequence. For Animation Sequences, the Animation Modifier is applied only to the sequence itself. When applied to a Skeleton, it is applied to all Animation Sequences based on that Skeleton.
New: Virtual Keyboards on Android (Early Access Preview)
Android now supports using the operating system’s virtual keyboard in place of the popup dialog input box!
Use of the virtual keyboard is enabled by checking the checkbox under Project Settings > Platforms > Android > APK Packaging. This option enables basic support for the virtual keyboard, but your application is responsible for ensuring input elements are visible and not obscured behind the virtual keyboard using the supplied OnVirtualKeyboardShown and OnVirtualKeyboardHidden event handlers.
Note: You may wish to disable the virtual keyboard with the Android.NewKeyboard console variable when the user is using a language requiring IME.
Support for Runtime Permissions on Android
Unreal Engine 4 now supports Runtime Permissions as required by Android 23 and later to access features requiring permissions categorized as dangerous by Google. These include access to contacts, photos, phone state, external storage, camera, and location services. See this webpage for details: https://developer.android.com/guide/topics/permissions/requesting.html.
If targeting Android 23, the Android Runtime Permissions plugin now provides the ability to check at runtime, in native code or using the Check Android Permission Blueprint node, whether a permission is already granted. If the permission has not yet been granted, the app may request it from the user using the Request Android Permissions Blueprint node and then get the result using an event bound to the On Permissions Granted Dynamic Delegate. This allows permissions to be granted just before the game requires the functionality needing them, improving the user experience. When targeting versions prior to Android 23, permissions are granted by specifying them in the Android Manifest as usual.
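Runtime requests do not replace the manifest declaration: a dangerous permission still has to be declared in the manifest (in UE4, typically via the Extra Permissions list in the Android project settings, which emits entries like the standard Android examples below):

```xml
<!-- AndroidManifest.xml: declare dangerous permissions; the app then
     requests them at runtime before using the related feature -->
<uses-permission android:name="android.permission.CAMERA"/>
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION"/>
```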
Note: 4.16 requires Android SDK 23 or higher to be installed. If you don’t have this SDK level installed, you can find the CodeWorksforAndroid-1R6u1 installer in your Engine/Extras/AndroidWorks directory. Also, under Project Settings > Android SDK, please change your Android SDK API Level from “matchndk” to “latest”. This will ensure UE4 uses the newest installed SDK found in your Android SDK platforms directory. There is no need to change the NDK API Level; “android-19” is correct and allows installing your APK on Android versions prior to Lollipop (Android 5.0); setting this higher will cause your app to require Android 5.0+.
Shader Code Library to Reduce Package Size
You can now enable a shared storage location for all shader code using the Share Material Shader Code project setting, resulting in a single copy being stored for Materials or Material Instances that generate the same shader code.
Some platforms such as Metal on iOS, TVOS and MacOS support a more efficient platform-specific shader library. Enabling the Shared Material Native Libraries project setting will further reduce the package size by utilizing this native library format.
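As a hedged config sketch, the two settings described above correspond to booleans in the project packaging settings (the section and setting names below are assumptions based on the settings labels; verify them in your own project's config):

```ini
; DefaultGame.ini -- share shader code across materials, and use the
; native shader library format on Metal platforms where supported
[/Script/UnrealEd.ProjectPackagingSettings]
bShareMaterialShaderCode=True
bSharedMaterialNativeLibraries=True
```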
New: Import Capsule Collision from FBX
You can now import capsule simple collision from an FBX file, in the same way you could already import box, sphere, and convex simple collision. Use the ‘UCP’ prefix on a capsule poly mesh; on import, it will be removed and replaced by a corresponding capsule collision shape.
New: Separate Options for Shared vs Local Asset Viewer Profiles
Unreal Editor now enables you to store asset viewer profiles at a shared or local level, making it possible for teams to have a shared set of profiles which can be used as a unified scene to assess art assets. Storing profiles at a local level ensures that users can still keep custom profiles they prefer to use locally but that are not required by the team. Shared profiles are stored in DefaultEditor.ini, which you will need to check out or make writable.
New: Improved Animation Preview Scenes
We made several improvements to preview scenes for the Animation Tools:
- Preview scene settings have been moved to the existing settings tab, rather than in a hidden menu in the viewport. This settings tab is now shown by default.
- Added a shortcut to quickly switch preview mesh to the main toolbar. This applies to all animation editors.
- When editing preview scenes, you no longer have to create a “preview scene collection” asset just to preview extra meshes. If you are happy with your mesh setup you can now optionally save it to an asset.
New: Add Default Camera options to Anim Viewer
You can now save a ‘Default Camera’ position for a Skeletal Mesh. This is used when opening the mesh, and can also be jumped to by pressing Shift+F.
New: Play Montage Blueprint Node
Play Montage is a new asynchronous node which can be used in any Blueprint logic to play Anim Montages. It provides easy access to some callback events, letting you trigger other nodes when a montage blends out, is interrupted, etc…
- OnCompleted is called when the Montage finishes playing and is fully blended out.
- OnBlendOut is called when the Montage starts to blend out, whether it is stopped automatically or manually.
- OnInterrupted is called when the Montage starts to blend out because it was interrupted by another Montage playing.
- OnNotifyBegin and OnNotifyEnd are callbacks when using either ‘Play Montage Notify’ or ‘Play Montage Notify Window’ Anim Notifies in the Montage asset. These AnimNotifies can forward an additional ‘Notify Name’ to differentiate between multiple callbacks from the same Montage.
New: Added Options For Retargeting Poses
You can now import a pose from a Pose Asset to use when setting the Retarget Base Pose. Previous options of modifying and saving the current pose (Use Current Pose) and of resetting bone transform to reference pose (Reset) are still available as well.
Note: You can create a Pose Asset in the animation editor and insert any pose into it with an assigned name.
New: Collision View Option In Static Mesh Editor
There are now separate options for viewing Simple and Complex Collision for a StaticMesh in the StaticMesh Editor tool.
New: Baked Poses in LODs
Unreal Engine 4 now supports baking a pose into a LOD level using a new reduction setting called Bake Pose. This can be set to a single frame anim sequence which will be applied to the resulting LOD mesh. This can prove useful when removing bones and still wanting to retain a pose.
Note: This feature requires Simplygon.
New: Sequencer User Interface Improvements
Thumbnails for Audio Tracks now render the peak samples with an inner (smoothed) RMS curve. Audio Tracks can also be resized vertically!
Other Sequencer UI Improvements:
- Sequencer-controlled Actors will now tick properly when streamed into a level.
- You can now specify additional event receivers (i.e., Actor Blueprints) for the Event Track.
- Bindings improvements: you can now drag, drop, and set bindings for the Level Sequence in Blueprints. For example, you can spawn an object in Blueprints and assign it to an existing track.
New: Mobile VR Rendering Improvements
Direct multiview is now supported for Samsung Gear VR, removing an extra render target and full screen copy when using mobile multiview which improves overall performance and reduces memory usage.
Monoscopic far field can now be used with multiview enabled on Gear VR ensuring the stereo rendering portion of your scene is rendered optimally.
Google Daydream supports standard mobile multiview with direct support coming in a future release.
New: Social Screens for PlayStation VR (Early Access Preview)
The PSVR Social Screen preview provides support for the Social Screen’s Separate mode, where the Monitor and HMD display different things.
This preview supports 30fps Social Screen output, and switching between several output modes. The following sample modes are implemented:
- SystemMirror (this is the default mirror mode that has always existed).
- SeparateTest (simply alternates between black and white on the social screen).
- SeparateTexture (displays a Blueprint-specified texture, for example a render target being written to by a Scene Capture Component).
- SeparateMirror (displays the full VR render buffer).
Future work will include optimization, a multi-platform interface to these features, and possibly support for a 60fps mode (which requires system dialogs to resolve conflicts with certain system features).
The new PSVR project setting bEnableSocialScreenSeparateMode must be set to true to use this feature. When it is true, additional screen buffers will be allocated for the social screen. Blueprint functions for controlling the feature can be found by searching for ‘SocialScreen’.
New: Android Executable Size Reduction
We have made a number of optimizations to the compiler and linker settings to reduce the size of the Android binary executable. Checking the Build with hidden symbol visibility option allows the linker to more aggressively remove unused code from Unreal Engine when generating the Android executable. This also strips the function symbols from the symbol table further reducing the executable size. We are seeing reductions of around 7MB from the final APK.
Note: This option removes symbols from the binary on the device, so a native crash call stack will appear in the logcat output without any symbols. To facilitate debugging, the build system will also copy an unstripped binary with debug symbols to the output directory, and also generate a batch file that adds symbols to a call stack.
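Besides the checkbox in the Android project settings, the option can be toggled in config. A minimal sketch of the DefaultEngine.ini entry, assuming the property is named bBuildWithHiddenSymbolVisibility (check this against your engine version):

```ini
; DefaultEngine.ini -- property name is an assumption; confirm in Project Settings > Android
[/Script/AndroidRuntimeSettings.AndroidRuntimeSettings]
bBuildWithHiddenSymbolVisibility=True
```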
New: Vertex Interpolator Material Expression Node
Vertex Interpolator nodes have been added to the Material graph, offering better control over value interpolation between vertex and pixel work. These are intended as a workflow improvement; there are no changes to interpolator limits, nor will generated shaders change.
The existing workflow for offloading work to the vertex shader is to use the Customized UV outputs. This can be a little cumbersome and involves manually packing your data. The example material below packs the pre-skinned mesh data, then unpacks it for use in an effect:
The new interpolator node handles packing automatically, allowing the graph to be simplified and in-lined:
Work that would previously be packed through Customized UVs is hooked up to the VS (vertex shader) pin and retrieved from the PS (pixel shader) pin.
The material stats output has been updated to show current interpolator usage, both the amount currently packed and the available maximum. Note how in the above examples the instruction counts and interpolator usage remain constant. The stats show 2 scalars reserved by the TexCoord node and the remaining 6 by our pre-skin data, giving a total of 8 scalars packed across 2 vectors.
The feature is compatible with Customized UVs and will pack results together.
New: Asset Management Framework (Early Access Preview)
The Asset Manager is a new global object that can be used to discover, load, and audit maps and game-specific asset types in the editor or at runtime. It provides a framework to make it easier to create things like quests, weapons, or heroes and load them on demand. It is still under active development, and these features will not be ready for use by Blueprint-only games or inexperienced developers until 4.17. The Asset Manager tab in Project Settings can be used to set up the rules for your game:
Primary Asset Types that are scanned by the Asset Manager can be queried at runtime before they are loaded, and can then be asynchronously loaded on demand. Also, the Asset Manager settings can be used to set up cook and chunk rules when packaging and releasing a game. In-progress documentation for this feature is available on AnswerHub: https://answers.unrealengine.com/questions/595580/what-is-the-asset-manager.html
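The same rules can also be expressed as config. A minimal sketch of a DefaultGame.ini entry, assuming a hypothetical game module MyGame with a WeaponData asset class (property names are best checked against the in-progress documentation):

```ini
; DefaultGame.ini -- MyGame and WeaponData are hypothetical; verify property names for your version
[/Script/Engine.AssetManagerSettings]
; Scan /Game/Weapons for primary assets of the example type "Weapon"
+PrimaryAssetTypesToScan=(PrimaryAssetType="Weapon",AssetBaseClass=/Script/MyGame.WeaponData,bHasBlueprintClasses=False,bIsEditorOnly=False,Directories=((Path="/Game/Weapons")))
```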
New: Asset Audit Window (Early Access Preview)
Built on top of the Asset Management Framework, the Asset Audit window can be used to audit disk size, memory usage, and general asset properties for many assets at once. It is a specialized version of the Content Browser, and can be accessed from the Window > Developer Tools menu, or from the right-click menu in the Content Browser or Reference Viewer. Once you have opened the window, assets can be added using the buttons, and platform data loaded out of cooked asset registries can be loaded using the platform drop down. Here’s an example of auditing textures from the Shooter Game sample on PS4:
VR: Unified Console Commands for VR
We’ve consolidated and unified console commands across VR platforms to create a shared layer that developers can work from rather than maintaining each platform independently.
This provides several benefits:
- Bootstrapping new platforms is easier.
- Argument meanings across HMDs are more consistent.
- Current HMD implementations contain less redundant code.
- All VR-related console commands share a common prefix, “vr.”. Vendor-specific extensions are clearly marked as such in the command name itself.
- The updated console commands support tab completion and inline usage help texts.
During a transition period, the old console commands will still be recognized, but will issue a deprecation warning when used.
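For illustration, commands under the shared prefix look like the following at the console (the specific command names here are examples from around this release and may differ; tab completion in your build will show the current set):

```
vr.PixelDensity 1.0        scale the render target relative to the HMD's ideal resolution
vr.WorldToMetersScale 100  set how many Unreal units correspond to one meter
vr.HiddenAreaMask 1        enable the hidden area mask optimization
```

Vendor-specific extensions keep the same prefix and add the vendor name to the command itself, so they are immediately recognizable as non-portable.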
New: Custom Hardware Cursors
Platform-native custom hardware cursors are now supported on Windows, Mac, and Linux! You can set up the hardware cursors to use in the User Interface settings for the project.
The system allows you to provide multiple formats for each cursor. For instance, in the settings you could specify that “Slate/FancyPointer” should be your default pointer. In the Slate directory of your game’s Content folder, you could then have FancyPointer.png, FancyPointer.cur, and FancyPointer.tiff files to cover the multi-resolution capabilities of certain platforms. The .tiff would be loaded on Mac, the .cur/.ani file would be used on Windows, and if no supported platform-specific format is found, we fall back to the .png.
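Following that example, the content layout would look like this (FancyPointer is the hypothetical cursor name from above):

```
Content/
  Slate/
    FancyPointer.cur    used on Windows (.ani also supported)
    FancyPointer.tiff   used on Mac (multi-resolution)
    FancyPointer.png    fallback when no supported platform-specific format is found
```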
UMG widget creation is now up to 3 times faster! When cooking, the widget compiler now generates a widget template/archetype that is stored in the same package as the generated Blueprint Class and used during widget creation.
During compiling, we generate a nearly fully initialized widget tree, including all sub-UserWidgets and their trees, hook up all member variables, initialize named slots, set up any animations, and so on. This nearly fully constructed widget can then be instanced using it as an archetype, avoiding the comparatively slow StaticDuplicateObject path.
There are restrictions on this method: part of the widget compile step now checks whether instancing would succeed, or whether there would be GLEO references after instancing because a user forgot to set Instanced on a subobject property. Luckily, such cases should be few and far between, since all UVisuals (Widgets and Slots) are now DefaultToInstanced, which covers the overwhelming majority of cases that demand the instanced flag, especially given that the bulk of cases use BindWidget in native code.
READ THE FULL RELEASE NOTES HERE