Tutorial: MetaHumans for Dynamic Gameplay and Real-Time Cinematics in UEFN

Hey Fortnite learners! This hands-on session covered the full pipeline — from scanning your face to getting a playable MetaHuman NPC with motion-captured animations running in UEFN, plus an intro to Verse coding. Everything is 100% beginner-friendly!

:video_camera: Course Modules

Module 1 — Quick Start: Setting Up Your First MetaHuman Project in UEFN
Working with Verse, scanning tips (stabilization, spiral vs. orbit camera patterns), importing your scan into MetaHuman Creator, and getting your first MetaHuman into a UEFN project. See the Verse sketch just below for a taste.
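
To show what the Verse side of Module 1 looks like, here's a minimal sketch of a custom Verse device, the usual starting point in UEFN. The device name `hello_metahuman_device` and the message are just examples for this post.

```
using { /Fortnite.com/Devices }
using { /Verse.org/Simulation }
using { /UnrealEngine.com/Temporary/Diagnostics }

# Minimal custom device: drag it into the level and it prints when the game starts.
hello_metahuman_device := class(creative_device):

    # OnBegin runs once when the game session begins.
    OnBegin<override>()<suspends>:void =
        Print("Hello from your first MetaHuman project!")
```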

Module 2 — Using MetaHumans in UEFN: Import, Retarget & Gameplay
A step-by-step guide to importing MetaHumans, working with different skeletons, IK retargeting between Mocopi/Fortnite/MetaHuman rigs, and setting up the Guard Spawner device for instant NPC gameplay (sketch below).
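
For a taste of the NPC gameplay setup, here's a hedged Verse sketch that reacts to guards spawning from a Guard Spawner device. The `guard_spawner_device` class and its `SpawnedEvent` come from the standard UEFN Verse API; the device name `npc_greeter` and the print message are illustrative only.

```
using { /Fortnite.com/Devices }
using { /Verse.org/Simulation }
using { /UnrealEngine.com/Temporary/Diagnostics }

# Sketch: wiring a Guard Spawner device to Verse. Assign the reference
# in the Details panel after dragging this device into your level.
npc_greeter := class(creative_device):

    @editable
    GuardSpawner : guard_spawner_device = guard_spawner_device{}

    OnBegin<override>()<suspends>:void =
        # SpawnedEvent fires with the newly spawned guard as an agent.
        GuardSpawner.SpawnedEvent.Subscribe(OnGuardSpawned)

    OnGuardSpawned(Guard : agent):void =
        Print("A guard NPC just spawned, ready for gameplay!")
```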

Module 3 — Motion Capture for Cinematic Animation with Mocopi
A live demo of capturing motion data with Sony Mocopi sensors, retargeting the animations onto MetaHumans, and building cinematic sequences in UEFN with the Level Sequencer (sketch below).
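
If you want to trigger your mocap cinematic from code instead of a trigger device, here's a small sketch using a Cinematic Sequence device. `Play()` is part of the standard `cinematic_sequence_device` API; the device name `cinematic_starter` is made up for this example, and the Level Sequence asset itself is assigned on the device in the editor.

```
using { /Fortnite.com/Devices }
using { /Verse.org/Simulation }

# Sketch: kick off a Level Sequence (e.g. your mocap cinematic) from Verse.
cinematic_starter := class(creative_device):

    @editable
    MocapSequence : cinematic_sequence_device = cinematic_sequence_device{}

    OnBegin<override>()<suspends>:void =
        # Starts playback of the sequence assigned on the device.
        MocapSequence.Play()
```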

Module 4 — Sequencer, Constraints & Audio
Working with audio in UEFN and MetaHumans (sketch below).
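
As a preview of the audio material, here's a hedged sketch that starts a dialogue track together with a cutscene. Both `audio_player_device` and `cinematic_sequence_device` are standard UEFN devices; the names `Dialogue`, `Cutscene`, and `audio_sync_device` are illustrative only, and both device references are assigned in the editor.

```
using { /Fortnite.com/Devices }
using { /Verse.org/Simulation }

# Sketch: start dialogue audio alongside a cinematic sequence.
audio_sync_device := class(creative_device):

    @editable
    Dialogue : audio_player_device = audio_player_device{}

    @editable
    Cutscene : cinematic_sequence_device = cinematic_sequence_device{}

    OnBegin<override>()<suspends>:void =
        Cutscene.Play()
        Dialogue.Play()
```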


:link: Resources

All lab materials, project files, and additional references are available on the class Trello board: :backhand_index_pointing_right: UEFN for MetaHumans — Class Resources

UEFN Templates used: Animation, Talisman: MetaHuman, Talisman: Environment, Stand Up