Fortnite has set a strong standard for visual accessibility, especially for players who are deaf or hard of hearing, by translating important audio cues into clear visuals.
A possible next step would be to expose key HUD and gameplay signals as accessible events that other output channels, such as spatial haptics, can consume. This could reduce visual overload and help players who benefit from shifting some information away from constant on-screen scanning. Example signals include directional damage, footstep direction and intensity, ammo and reload thresholds, shield and health thresholds, storm timing cues, and interaction prompts.
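To make the idea concrete, here is a minimal sketch of what one such standardized event payload could look like. Everything here is hypothetical: the `AccessibilityEvent` name, the fields, and the JSON serialization are illustration only, not an existing UEFN or Fortnite API.

```python
from dataclasses import dataclass, asdict
from typing import Optional
import json
import time

@dataclass
class AccessibilityEvent:
    """One standardized gameplay signal, decoupled from how it is rendered."""
    kind: str                       # e.g. "damage", "footsteps", "storm_warning"
    direction_deg: Optional[float]  # bearing relative to the camera, if directional
    intensity: float                # normalized 0.0-1.0 so haptics or audio can scale it
    timestamp: float                # seconds, for ordering and rate limiting

# A damage hit from behind-left at moderate strength:
event = AccessibilityEvent(kind="damage", direction_deg=225.0,
                           intensity=0.6, timestamp=time.time())
payload = json.dumps(asdict(event))  # what an external tool or device would receive
```

The point of normalizing intensity and keeping direction optional is that a haptic vest, a screen reader, and a sound substitute could all consume the same event without game-specific knowledge.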
I would love feedback from the community on what a practical top-down approach could look like in Unreal Editor for Fortnite (UEFN). Specifically:
- Which events would be most valuable to expose first for accessibility and clarity?
- Which event hooks are already available in UEFN, and what is missing for this kind of use case?
- What is the recommended way to emit standardized, rate-limited events that external tools or hardware could subscribe to?
- Are there best practices for keeping this safe and fair, for example avoiding competitive advantages while still enabling accessibility-focused outputs?
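By "rate limited" I mean something like the following sketch: a per-event-kind throttle in front of the subscriber list, so a burst of footstep hits does not flood a haptic device. The `RateLimitedEmitter` class and its interface are my own illustration, not an existing API.

```python
import time
from collections import defaultdict
from typing import Callable, Optional

class RateLimitedEmitter:
    """Forwards events to subscribers, dropping repeats of the same kind
    that arrive less than min_interval seconds after the last one sent."""

    def __init__(self, min_interval: float = 0.2):
        self.min_interval = min_interval
        self._last_sent: dict = defaultdict(lambda: float("-inf"))
        self._subscribers: list = []

    def subscribe(self, callback: Callable[[str, dict], None]) -> None:
        self._subscribers.append(callback)

    def emit(self, kind: str, payload: dict,
             now: Optional[float] = None) -> bool:
        # 'now' is injectable for testing; real callers use the monotonic clock.
        now = time.monotonic() if now is None else now
        if now - self._last_sent[kind] < self.min_interval:
            return False  # throttled: too soon after the last event of this kind
        self._last_sent[kind] = now
        for cb in self._subscribers:
            cb(kind, payload)
        return True
```

Throttling per kind rather than globally keeps a noisy signal (footsteps) from starving a rare, high-priority one (storm warning), which seems like the fair default for accessibility outputs.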
Any pointers to relevant documentation, existing patterns, or Epic guidance would be appreciated.