We are currently using Live Link with a source type of MetaHuman (Audio) to feed audio to our MetaHuman. The problem we're facing is that the Live Link source only works on whichever box you package the project on. There doesn't seem to be a way for the Live Link source to dynamically create a source with the default audio device when running on a different box.
I understand that there has been a private email thread with the biz dev or partnerships team which may have already answered the questions on this thread.
The full release of Unreal Engine 5.7 will include some Blueprint nodes that let you configure the MetaHuman Live Link sources and subjects. We believe these will begin to unlock the capabilities you require, particularly if you are using a standard input device (such as a microphone or a WAV file), although some integration work may still be needed on your part, as in-game use is not yet directly supported.
Unfortunately, these Blueprint nodes were not part of the 5.7 Preview release in September.
However, if the input is more complicated, such as audio streamed over the network or generated by AI, then the best solution will be to use the MetaHuman Animator realtime C++ API to create your own custom Live Link Source, which can then be created and controlled in game from a Blueprint.
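For reference, any custom Live Link Source in C++ builds on the engine's generic `ILiveLinkSource` interface. The sketch below shows only that generic shell, assuming the engine's `LiveLinkInterface` module is available; the class name, the status strings, and the audio-handling comments are placeholders, and the MetaHuman Animator realtime API (which would do the actual audio-to-animation work) is deliberately not shown, as it is not documented in this thread.

```cpp
// Minimal shell of a custom Live Link source (hypothetical class name).
// Assumes the LiveLinkInterface module; does not compile outside Unreal Engine.
#include "CoreMinimal.h"
#include "ILiveLinkSource.h"
#include "ILiveLinkClient.h"

class FNetworkAudioLiveLinkSource : public ILiveLinkSource
{
public:
	virtual void ReceiveClient(ILiveLinkClient* InClient, FGuid InSourceGuid) override
	{
		Client = InClient;
		SourceGuid = InSourceGuid;
		// Open the audio input here (network stream, TTS callback, etc.)
		// and start pushing frames/subjects to Client as data arrives.
	}

	virtual bool IsSourceStillValid() const override
	{
		return Client != nullptr;
	}

	virtual bool RequestSourceShutdown() override
	{
		// Close the audio input and release any resources.
		Client = nullptr;
		return true;
	}

	virtual FText GetSourceType() const override
	{
		return NSLOCTEXT("NetworkAudio", "SourceType", "Network Audio");
	}

	virtual FText GetSourceMachineName() const override
	{
		return NSLOCTEXT("NetworkAudio", "MachineName", "localhost");
	}

	virtual FText GetSourceStatus() const override
	{
		return NSLOCTEXT("NetworkAudio", "Status", "Active");
	}

private:
	ILiveLinkClient* Client = nullptr;
	FGuid SourceGuid;
};
```

A source like this can then be registered at runtime via `ILiveLinkClient::AddSource`, which is the kind of call a Blueprint-callable wrapper would make so the source can be created and controlled in game rather than baked into one machine's configuration.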
From the email thread, I understand that you are looking for MetaHuman Animator support on Linux as part of this solution. Currently, these capabilities are only available on Windows. We are actively looking at how to bring them to Linux and macOS, but that work will not be part of the 5.7 release cycle.
Do you plan to integrate audio streams generated by AI services, such as Text-to-Speech, in the near future? Also, is there any chance that MetaHuman Animator will be supported on Linux in version 5.8?