Hi,
We’ve been looking at the UE5 Screen Reader plug-in to assess whether it would be appropriate for our project (using UMG and still in a fairly early stage of development).
The relevant documentation we’ve found is here:
https://dev.epicgames.com/documentation/en-us/unreal-engine/supporting-screen-readers-in-unreal-engine
https://dev.epicgames.com/documentation/en-us/unreal-engine/blind-accessibility-features-overview-in-unreal-engine
However, what I'm finding confusing is that the documentation talks about using NVDA or other external screen reader software, but doesn't give details on how to set this up.
The SlateScreenReader only triggers announcements when the focused widget changes, and for most platforms it appears to be set up to do that using the Flite implementation of an FTextToSpeechBase, which synthesizes audio with Flite and sends it to the UE audio system. In other words, all the logic from UI trigger through to audio playback lives within Unreal Engine.
Is there another mechanism or type of screen reader logic that can talk to an external screen reader? If so, can you point me to the relevant code?
The ScreenReader and SlateScreenReader plugins are experimental.
Is there any roadmap for the development of these plugins, i.e. what features we might expect in the future, and when?
Many thanks,
Dave
Hi,
It's best to think of screen reader support as involving three separate parts: an accessible application (with an API for exposing info), a screen reader (which parses that info based on user interactions), and a TTS engine (which reads the result out loud). All of these are supported within Unreal Engine to varying degrees:
- Unreal applications can be made accessible via the Accessibility.Enable console variable. All of that logic is native to the engine and doesn't require any specific plugins. This is enough for a third-party application (such as NVDA) to extract information from an Unreal Engine application and read it aloud.
- The ScreenReader and SlateScreenReader plugins are screen reader applications built directly into the engine that aim to provide similar functionality to third-party screen readers. The benefit here is that they're platform-agnostic and more deeply integrated with the engine, but the downside is that they aren't as fully featured as third-party solutions. Additionally, people with disabilities may prefer a particular third-party solution that meets their specific needs.
- The TTS plugin is a fairly barebones implementation of Flite, and gives our screen reader plugin its TTS capabilities. You could also enable this plugin independently and use it however you like, such as manually feeding it strings when you want to vocalize specific things.
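As a rough sketch of that last point, here is what driving the TTS plugin directly might look like from C++. The subsystem and function names below (UTextToSpeechEngineSubsystem, AddDefaultChannel, ActivateChannel, SpeakOnChannel) are assumptions based on the experimental TextToSpeech plugin's Blueprint-exposed API, so verify them against your engine version before relying on this:

```cpp
// Hedged sketch: vocalize an arbitrary string via the experimental
// TextToSpeech plugin's engine subsystem. All names here are assumptions;
// check the plugin source in your engine version. Requires the
// TextToSpeech plugin to be enabled in your .uproject.
#include "Engine/Engine.h"
#include "TextToSpeechEngineSubsystem.h"

void SpeakLine(const FString& Line)
{
    if (UTextToSpeechEngineSubsystem* TTS =
            GEngine->GetEngineSubsystem<UTextToSpeechEngineSubsystem>())
    {
        // Hypothetical channel id for illustration.
        static const FName Channel(TEXT("MyTTSChannel"));
        TTS->AddDefaultChannel(Channel);  // create a channel to speak on
        TTS->ActivateChannel(Channel);    // channels must be active to speak
        TTS->SpeakOnChannel(Channel, Line);
    }
}
```

The channel-based design (if it matches your engine version) lets you keep, say, UI narration and subtitle vocalization on separate channels with independent rate/volume settings.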
In terms of roadmap, none of these features are currently in active development, so depending on your needs it may be more of a starting point than a full out-of-the-box solution. If you encounter any issues with the accessible API native to the engine then we'll definitely want to take a look (since that's required for third-party screen readers to function properly), but our built-in screen reader plugin is currently offered as-is. If you want to experiment with a third-party screen reader such as NVDA, it should be as simple as downloading/launching it and setting Accessibility.Enable to 1. You may need to launch in standalone mode to get it working (the functionality within the editor is pretty limited), but your screen reader should be able to pick up strings from your Slate widgets and read them aloud.
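For convenience, the console variable can also be set at launch rather than typed in each run. A small fragment, assuming the standard -ExecCmds launch argument (the executable name below is just a placeholder):

```
# In the in-game or editor console:
Accessibility.Enable 1

# Or pre-enable it when launching a standalone/cooked build
# (replace MyGame.exe with your own executable):
MyGame.exe -ExecCmds="Accessibility.Enable 1"
```

Setting it at launch is handy when testing cooked builds, where you may not have easy access to the console.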
Best,
Cody
Great - that helps clarify things a lot, and confirms my suspicions on the implementation (might be worth updating the documentation on those areas I linked to).
I haven't been able to get NVDA working yet, but that could be because I wasn't using a Standalone build - will try again.
Additional question - is there external screen reader functionality on consoles (notably PS5 and Xbox) that can interface with the native accessible API?
Or would screen reader functionality on console only be possible using a reader that is built into the engine?
Many thanks,
Dave
Hi,
Here’s the full list of steps I took to get NVDA working with Lyra:
- Launch NVDA
- Launch the Lyra project
- Start PIE in standalone mode
- Run “Accessibility.Enable 1” in the console
- Pause the game and mouse over buttons, NVDA reads out the button text
It also seems to do decently well within the editor itself once accessibility is enabled, though in both cases there's a bit of a delay. There are some tips on [Content removed] that might help with the latency, if you run into that problem.
For native console screen reader support, I’m not sure off the top of my head but we’d probably need to follow up on that within the platform-specific forums. I’d recommend opening a new ticket tagged with the specific platform so we can pull in the right people.
Best,
Cody
Thanks, Cody - yeah, I was already able to get NVDA working with Lyra, which is great!
I've had mixed results with our own project, though, trying various build types: PIE, Standalone PIE, and a standalone cooked game build (cooked both on the fly and by the book). And I've found inconsistent results across runs even with the same build type.
In the cases where I'm not hearing any audio, we appear to be queuing the accessible tasks etc., but I haven't figured out where it's failing yet. I've also tried playing with that ACCESSIBILITY_DEBUG_RESPONSIVENESS define, but I haven't had narration working with it enabled yet. That said, I can't say it flat-out doesn't work, given the inconsistency I've already observed!
When I have time, I'll keep digging and try to find a way to get it working consistently - I don't think I need anything further from you for now, as Lyra is clearly working consistently, even in the editor.
If I narrow my issues down to an engine problem, I’ll let you know!
Many thanks,
Dave