Connecting to a natural language understanding engine for leukemia information assistant

Hi all. I'm completely new to MetaHuman and very impressed. I have a project in mind that will include a natural language understanding (NLU) engine trained by medical professionals to provide leukemia information and support.

I am wondering if it is possible yet to have the MetaHuman output information from the NLU. Speech recognition will capture the user's input and send the intent to the NLU via an HTTP request, and the response will be returned in JSON format. From there, I would like to know if it is possible for the MetaHuman to speak the response. I will be using Unity and/or Unreal Engine.

Thanks in advance.

Many cloud platforms support speech recognition. IBM provided a demo Unity project I used a few years ago that made it easy, and it's possible there is something similar for Unreal; a search shows a number of options.

Once your system handles that and provides back a text or JSON response, there are likewise a number of plugins, free or on the Marketplace, for text to speech, for example:
GitHub - yar3333/text-to-speech-ue4: Text-to-speech UE4 plugin for Windows (SAPI). Tested on UE 4.21
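To make the middle step concrete, here is a minimal sketch of the round trip between the app and the NLU service: POST the recognised utterance, parse the JSON reply, and hand the resulting string to whatever text-to-speech plugin you pick. The endpoint URL and the `"text"`/`"reply"` field names are assumptions; your NLU's actual schema will differ.

```python
import json
import urllib.request

# Hypothetical NLU endpoint -- replace with your service's real URL.
NLU_URL = "https://example.com/nlu/query"

def extract_reply(body: str) -> str:
    """Pull the answer text out of the NLU's JSON body.
    Assumes the service returns a top-level "reply" field."""
    return json.loads(body)["reply"]

def ask_nlu(utterance: str) -> str:
    """POST the recognised utterance to the NLU and return the reply text.
    This string is what you would feed into the text-to-speech plugin."""
    payload = json.dumps({"text": utterance}).encode("utf-8")
    req = urllib.request.Request(
        NLU_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return extract_reply(resp.read().decode("utf-8"))

# Example of parsing a response the NLU might send back:
sample = '{"intent": "side_effects", "reply": "Common side effects include fatigue."}'
print(extract_reply(sample))
```

In Unreal itself the equivalent would be the built-in HTTP module plus the JSON utilities, but the shape of the exchange is the same: one request out, one JSON body back, one string passed on to TTS.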

Here are a few YouTube videos covering parts of this with MetaHumans.

Note that due to licensing restrictions you need to use Unreal Engine (not Unity) for rendering MetaHumans.