Who We Are
Alpha Team is an internal Epic group operating as first customer for FNE deliverables. We are using the tools to build prototypes and small game experiences, exploring new features in imaginative and creative ways to surface and resolve any breakages, friction points and user experience issues before the Creator community encounters them.
We will be posting development walkthroughs of some of the projects we have worked on outlining the features we explored, how we developed the project, and what we discovered throughout development.
Let us know your thoughts and please post your questions in the thread.
Project Summary
What: An exploration into camera transitions, controls and the newly expanded camera toolkit
Theme: Escape the wizard's tower by searching the room to learn the recipe for a potion to unlock the door. Gather the ingredients you need and place them in the cauldron to brew it!
Initial goals for exploration
- Test the workflow around camera transitions for newly released cameras (fixed, perspective)
- Write our own camera control system using Verse to see how well we were able to work with the API
- Experiment with how this camera setup would pair with a hidden object/search mechanic for an escape room game
- Understand the workflow behind creating a distinct visual style for the environment compared to core Fortnite, and learn where any current pain points are in pushing those boundaries
- Test out the new mini-map functionality
- Create a lite game loop
- Explore HUD/UI work
Dev Walkthrough
Camera Transition Volumes
Mutator zones and Volumes are a simple, effective way to handle camera transitions, and they formed a solid camera system which worked well for this experience.
At the core of this workflow is knowing which camera (or cameras) is "added" to the player, and which one has priority. In this experience we added the main zoomed-out camera to the player on start only (the "Add to players on start" checkbox). Every other camera was added to the player on entering a volume and removed on exiting it (mutator zone or Volume device). When two cameras have the same priority, the one most recently added takes precedence, so we did not need to configure priority for this game (it could be left at 0 for all cameras). In general the volume-based camera setup is simple and robust - you can nest any number of volumes within volumes and reliably transition to and from a camera.
There are instances where the prebuilt volume shapes don't offer the shape control required. In those cases you can use a gate-like system where a camera is added in one mutator zone and removed by another. Another option is to use the Volume device with a custom mesh. We tested both approaches on the stairs and both worked well for our needs.
One thing worth noting is that when using a Volume device with a custom mesh, it's the simple collider that defines the volume - not the mesh shape. Complex collision will only respect the surfaces of the mesh, and will not be treated as a volume. We have fed back to the team that the messaging here needs to be improved; ideally the viewport preview should show the collider rather than the mesh.
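The enter/exit pattern above can also be driven from Verse instead of pure device configuration. A minimal sketch, assuming the standard UEFN device API (check the exact device and event names against your project's Verse digest):

```verse
using { /Fortnite.com/Devices }
using { /Verse.org/Simulation }

# Sketch: add a camera to a player's camera stack when they enter a zone,
# remove it when they leave. At equal priority, the most recently added
# camera takes precedence, as described above.
camera_volume_example := class(creative_device):
    @editable Zone : mutator_zone_device = mutator_zone_device{}
    @editable Camera : gameplay_camera_fixed_angle_device = gameplay_camera_fixed_angle_device{}

    OnBegin<override>()<suspends>:void =
        Zone.AgentEntersEvent.Subscribe(OnEnter)
        Zone.AgentExitsEvent.Subscribe(OnExit)

    OnEnter(Agent:agent):void =
        Camera.AddTo(Agent)

    OnExit(Agent:agent):void =
        Camera.RemoveFrom(Agent)
```

In the editor, the same behaviour comes for free from the mutator zone's direct event binding; the Verse version is only worthwhile when you want to layer extra conditions onto the transition.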
Verse Camera Proximity Solution
We placed a number of "camera objects" in the scene and wrote a script to evaluate which of these the player was in closest proximity to.
Then, within the editor we were able to configure which camera to add or remove from the camera stack depending on which object(s) the player was closest to, allowing the different camera devices to handle the switching transitions themselves.
To ensure that the game wouldn't begin transitioning to another camera too soon, we "gated off" areas with multiple duplicates of camera objects. This kept the player on a particular camera while they remained inside the imaginary boundary formed by that group of camera objects.
Sometimes, the easiest method of ensuring players were within a boundary was to "mix and match" with the mutator zone approach above, and allow this to override the proximity approach. Thus, you can see some mutators in the setup above, where this made more sense than a more dynamic approach.
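A rough shape of that proximity evaluation in Verse might look like the following. This is a single-player sketch under assumed names (`CameraObjects`, `Cameras`, the 0.25s poll rate); the real project's script and its editor pairing of objects to cameras may differ:

```verse
using { /Fortnite.com/Devices }
using { /Fortnite.com/Characters }
using { /Verse.org/Simulation }
using { /UnrealEngine.com/Temporary/SpatialMath }

# Sketch: poll the player's position and keep the camera paired with the
# nearest "camera object" on the stack. CameraObjects[i] pairs with
# Cameras[i]; a multiplayer version would track an index per player.
proximity_camera_example := class(creative_device):
    @editable CameraObjects : []creative_prop = array{}
    @editable Cameras : []gameplay_camera_fixed_point_device = array{}
    var CurrentIndex : int = -1

    OnBegin<override>()<suspends>:void =
        loop:
            Sleep(0.25) # assumed poll rate
            for (Player : GetPlayspace().GetPlayers(), Fort := Player.GetFortCharacter[]):
                Evaluate(Player, Fort.GetTransform().Translation)

    Evaluate(Player:player, PlayerPos:vector3):void =
        var BestIndex : int = -1
        var BestDist : float = 1000000.0
        for (Index -> Prop : CameraObjects):
            D := Distance(Prop.GetTransform().Translation, PlayerPos)
            if (D < BestDist):
                set BestDist = D
                set BestIndex = Index
        if (BestIndex <> CurrentIndex):
            if (OldCam := Cameras[CurrentIndex]):
                OldCam.RemoveFrom(Player) # let the devices handle the blend
            if (NewCam := Cameras[BestIndex]):
                NewCam.AddTo(Player)
            set CurrentIndex = BestIndex
```

Because the camera devices themselves handle the transition blend, this script only decides *which* camera is on the stack, not how the switch looks.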
Thoughts on this approach:
If you wanted to control the adding/removal of cameras from the stack according to your own gameplay conditions such as navigation in an adventure game, an approach similar to this based upon proximity to an object could be a flexible option for you.
The most challenging part of this approach was visualising the "points" at which the camera would switch, because the editor offers no exact visualisation of where this occurs. This meant a fair amount of trial and error to get the exact results we wanted.
The inability to control or override camera transitions from Verse was an additional limiting factor, and produced inconsistent results.
Game Design and Interactivity
Goal: to create a simple key and lock system to support a search/escape the room style game, supported by a basic inventory HUD and a custom UI widget.
Key and Lock System
We created a Verse script that defines a pickup object class, used to look up which type of pickup the player has found and to share common functionality across all pickup objects in the scene.
We then established some rules for what happens when a player picks up an object: in this case, the physical object disappears, its interactive button is disabled, and the "key" is added to the player's inventory, with text showing them that they now own this object.
Additionally, we established some rules for what happens when a player uses a particular "key" on the correct lock (assigned in the editor itself as shown in this screenshot): in this case, the item is removed from the inventory and another action is triggered, such as playing a particular cinematic.
It's worth noting that although we've called this a "key and lock" system, the objects in question don't always have to be keys and locks, although there is an example of this set up in our scene. We use the same system to deliver the potion brewing section of the game, with multiple "keys" ultimately "opening" the conditional lock of the final potion being brewed, allowing the player to escape!
It's quite a simple but effective system that underpins the entire functionality of this narrative experience.
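As a hedged illustration of this pattern (the class shape, field names, and win condition here are our assumptions, not the project's actual script), a pickup type plus a device that treats each pickup as a "key" might look like:

```verse
using { /Fortnite.com/Devices }
using { /Verse.org/Simulation }

# A pickup pairs a prop in the world with the button used to collect it.
pickup_object := class<concrete>():
    @editable InteractButton : button_device = button_device{}
    @editable Prop : creative_prop = creative_prop{}
    @editable KeyName : string = "key"

key_lock_example := class(creative_device):
    @editable Pickups : []pickup_object = array{}
    @editable UnlockCinematic : cinematic_sequence_device = cinematic_sequence_device{}
    var Inventory : []string = array{}

    OnBegin<override>()<suspends>:void =
        for (Pickup : Pickups):
            spawn{ WatchPickup(Pickup) }

    WatchPickup(Pickup:pickup_object)<suspends>:void =
        Pickup.InteractButton.InteractedWithEvent.Await()
        Pickup.Prop.Hide()                  # physical object disappears
        Pickup.InteractButton.Disable()     # interaction is removed
        set Inventory += array{Pickup.KeyName}
        if (Inventory.Length = Pickups.Length):
            UnlockCinematic.Play()          # all "keys" found: open the "lock"
```

The cauldron version of the "lock" is the same idea with a different win condition: the cinematic only plays once every required ingredient name is present in the inventory.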
UI and HUD work
The other side of the interactivity is to guide the player through the experience with some UI and UX feedback. In this experience, this is largely delivered through:
- Button Device (prompts the player to interact with something in the environment)
- HUDMessage Device (triggered on button press, either displaying text or a bespoke UI Widget)
- Sequencer Device (triggers some animation within the scene and potentially a camera change)
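One Action → Response hookup from the list above, sketched in Verse (the same wiring can also be done entirely through direct event binding in the editor; the device names follow the UEFN API, the rest is illustrative):

```verse
using { /Fortnite.com/Devices }
using { /Verse.org/Simulation }

# Sketch: interacting with a button in the environment triggers both a
# HUD message and a sequence, giving the player a clear consequence.
interaction_feedback_example := class(creative_device):
    @editable InspectButton : button_device = button_device{}
    @editable ResponseMessage : hud_message_device = hud_message_device{}
    @editable ResponseSequence : cinematic_sequence_device = cinematic_sequence_device{}

    OnBegin<override>()<suspends>:void =
        InspectButton.InteractedWithEvent.Subscribe(OnInteract)

    OnInteract(Agent:agent):void =
        ResponseMessage.Show(Agent)   # text or bespoke UI widget response
        ResponseSequence.Play()       # scene animation / camera change
```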
As is common across many narrative games, when the player interacts with the environment they should receive a satisfying consequence which clearly communicates the result of their action and/or gives them a new direction or prompt to follow.
As such, we follow this Action → Response pattern throughout the experience, even if the consequence differs slightly across different situations:
Example: reading the spellbook shows the spellbook large on screen; a reward for finding the book.
Example: unlocking the door uses a key and triggers a notification to let the player know what they've achieved.
Example: picking up the toenails yields an amusing response in feedback.
Lastly, the player's inventory is displayed on the side of the screen at all times as a clear reference of what they currently have in their possession, and a marker for what else they must find. This is updated via the key and lock system above.
This is a relatively basic presentation delivered via text alone, but does the job of communicating to the player as a first pass. You could build upon this solution further by displaying your own UI on screen with a UI Widget, and layering on inventory images/thumbnails of each of the pickup items on top.
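A minimal sketch of that suggested upgrade using the UEFN Verse UI classes: one text block per inventory item, stacked vertically and added to the player's HUD. The function names and layout are assumptions; thumbnails would use `texture_block` widgets in place of text:

```verse
using { /Fortnite.com/UI }
using { /UnrealEngine.com/Temporary/UI }
using { /Verse.org/Simulation }

ItemText<localizes>(Item:string):message = "You have: {Item}"

# Build a vertical list of text widgets from the current inventory.
MakeInventoryWidget(Items:[]string):stack_box =
    stack_box:
        Orientation := orientation.Vertical
        Slots := for (Item : Items):
            stack_box_slot{Widget := text_block{DefaultText := ItemText(Item)}}

# Attach the widget to a player's HUD; call again with the updated
# inventory after each pickup (removing the old widget first).
ShowInventory(Player:player, Items:[]string):void =
    if (PlayerUI := GetPlayerUI[Player]):
        PlayerUI.AddWidget(MakeInventoryWidget(Items))
```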
Level Art Overview
The process of creating the artwork for Wizards was organic, with ideas and visual development happening alongside the 3D creation stage. This is atypical in larger game development, but is fine for smaller, fun projects. That being said, we started with some initial anchor points to begin the process:
- Environment to be a smallish diorama consisting of a few different areas. Ideally non-square rooms, to help pull the aesthetic away from library modular kits and make the room shapes more interesting
- Ceiling and nearside walls cutaway to allow us to peer into the scene and support new camera options
- Art style to support resource and time frame (one 3d artist, one TA/VFX artist for around 2 weeks)
- Art style to be flexible enough that certain library props could be used to speed up creation
The general flow of the art development went roughly as follows:
- Simple block out from design made using UEFN library kits
- Art tests, room cutaway and style tests - zoning in on an illustrative style
- Art rebuild, evolution of the layout and align with design on space
- Quick paintover to give rough impression of art direction
- Model and texture elements in Blender and Substance Designer
- Scene assembly in UEFN - placing meshes, cameras and collision. Art passes as functional.
- Shaders and material creation
- Lighting and post-processing
The following sections give a high level overview of certain parts of the creation process. Please feel free to ask questions and we can go into more detail about any of these aspects.
Non-Modular Approach
We built the environment as a few large sections, rather than following a modular approach, which is more common in game development. This has pros and cons, but the major pro and driving force behind the decision is that it's quicker to get started. Modular tilesets become more efficient on larger projects, where the initial time investment is regained by the ease of building out new spaces or making level edits without needing to modify existing geometry. Modular tiles are also more memory efficient and work better with different culling methods. However, in this case the extents of the space were decided at the blockout stage, the level would need to be rendered in its entirety from the first camera, and from a speed-of-creation point of view it's quicker to remodel the rooms over the blockout than to get bogged down in the intricacies of building a flexible tileset.
One of the cons of this method relates to collision. When you build out a level in this way, it's unlikely that you'll get decent collision from complex meshes like these. This means a collision pass is needed: either setting up simple colliders across the level, or building a low-poly proxy mesh and using complex collision. We opted for the latter for quick results.
A byproduct of this collision method affects Boom Collision, the fixed angle camera option used to cut away geometry that obscures the player; it relies on collision, swapping materials on any mesh the player goes behind. In the section where the player goes underneath the walkway to the upper floor, we wanted to test Boom Collision but quickly realised that we'd need a subsection with separate collision. We therefore separated the walkway into its own mesh and collider, so this section would work properly with Boom Collision.
Mesh style
Another thing of note is that we leaned on Nanite to use mid-poly objects directly in the scene, rather than going through the more time-consuming high poly > low poly workflow we would have used traditionally. This was done in part for speed and ease of creation, but it also helped push the art style, giving the surfaces a chunky, bold look, while working really well with the outline post-processing to help drive the illustrative quality.
Process overview:
- Substance Designer to make the main tiling textures, specifically the height/displacement maps
- Displaced high res planes in Blender using the height maps from Designer
- Decimated the geo in Blender, and set appropriate smoothing angle to give a nice blend of facets vs smooth areas
- Cleanup, UVs, array and weld, Boolean out cutaway tiles on nearside
- Trim pieces were quickly sculpted in Blender, then decimated
- Drapes were done using cloth sim in Blender
Lighting
Lighting started with disabling the Time of Day Manager in World Settings. Starting from full darkness, we added a Skylight to fill the scene with a low level of ambient light. Shadow-casting meshes were added at the ceiling to stop too much light coming in from above, which otherwise made the scene feel out in the open rather than like peering into an interior.
Point lights were mainly used as fill lights throughout the scene, to illuminate different parts of the environment with colour and atmosphere.
Spotlights were used to shine light in through the windows and create interesting light shapes on the floor. Lightbeam meshes were made and positioned to fake a sense of air particulate; this looked more stylised and was easier to localise than using global fog.
Post processing was used for the outline and hatching effect.
The character was masked out of the hatching effect by setting all static meshes to render in the Custom Depth pass with a Stencil value of 1. The post-process material then uses this stencil to mask which items receive hatching. To make this easier, we used the static mesh filter in the Outliner to select all static meshes, then right-clicked and opened Edit Selection in Property Matrix to set the values on every mesh in one go.
A scalability manager was used to tune the lighting settings across different scalability presets.
The end result is a visually pleasing space with a wizardy vibe, and it supports the camera style within the context of a puzzle/adventure game localised to a relatively small area. The 3D environment fits the needs of the project and was created with a low time investment, showing that UEFN has the tools required to make custom art work well in low-resource environments.