I am trying to use Audio2Face and Pixel Streaming in my project, but I have run into the following problem: when viewed in a browser, the audio stream from Audio2Face does not play through Pixel Streaming, although the sound does play in the editor. The A2F documentation says that audio from LiveLink is played via a SubmixListener. How can I intercept it for Pixel Streaming?
Any advice? Does anybody use it with Pixel Streaming?
Any luck with this? I'm stuck on the same problem.
I think the answer is in your thread on the A2F forum about playing the same audio directly with some delay. I'm afraid this is the only solution so far, but I hope it's temporary.
Did you make a build with MetaHuman? Does it work properly? My compiled build crashes or exits without any exit messages in the log…
Yes, it's built and working right now, but Android can't type (iPhone and browser can), and I can only get the signalling server's HTML page to play the stream, not the new TypeScript frontend.
There are a bunch of other issues related to Pixel Streaming, but that's OK; this is just a test project for fun.
I don't like how every user shares the same input. I know there are some things you can do about it, but it's not worth it. I also hear you can only run three instances, so running multiples is an issue with Pixel Streaming.
At this point it's obviously best to find a different method for what I want, but it's about the principle now, so I want to at least get it to a point I'm happy with.
A virtual MetaHuman connected to ChatGPT, lip-synced with Audio2Face, on the web.
Hi @flashASA
It sounds like a tricky situation, but here are a couple of suggestions that might help you resolve this and understand the problem:
Check Audio Output Routing: Ensure that A2F’s audio, captured through SubmixListener, is correctly routed to your game’s main audio output.
- Create an Audio Submix: In Unreal Engine, create an audio submix dedicated to handling audio from A2F. This submix acts as a channel to direct the audio to the game’s main audio output.
- Configure SubmixListener: Adjust the settings in A2F’s SubmixListener to output to this new submix. This ensures the A2F audio is directed correctly.
- Verify Routing: Check the audio settings in Unreal Engine to ensure that your new submix is properly routed to the main audio mix (a rough code sketch of this routing follows this list).
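To make the routing step concrete, here is a minimal C++ sketch. The function name and arguments are placeholders of mine, not part of the A2F plugin; it assumes you have a USoundSubmix asset whose parent chain ends at the engine's master submix, and a UAudioComponent that plays the A2F audio.

#include "Components/AudioComponent.h"
#include "Sound/SoundSubmix.h"

// Sends the A2F audio component's output into the dedicated submix.
// Because the submix's parent chain ends at the master submix, the audio
// also reaches the engine's main output.
void RouteA2FAudio(UAudioComponent* A2FAudioComponent, USoundSubmix* A2FSubmix)
{
    if (A2FAudioComponent && A2FSubmix)
    {
        // Full-level send into the dedicated A2F submix.
        A2FAudioComponent->SetSubmixSend(A2FSubmix, 1.0f);
    }
}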
Debug Audio Path (a small C++ helper for running these commands from code follows this list):
- Enable Audio Debugging: Use console commands like au.Debug.Sounds 1 or au.Debug.SoundMixes 1 to activate audio debugging.
- Visualize Audio Path: Use commands like au.3dVisualize.ActiveSounds 1 to enable a visual mode for active sounds and understand the flow of audio.
- Check Sound Mixes and Modulations: Inspect active sound class mixes for any issues.
- Examine Active Sounds: Use au.DumpActiveSounds to list all currently playing sounds.
- Analyze Memory Usage: Use AudioMemReport for a detailed report on the memory usage of active sound objects.
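If it's easier to toggle these while testing, the same commands can be run from C++. A minimal sketch, assuming you call it from something with a valid UWorld (e.g. an actor's BeginPlay); the function name is a placeholder of mine:

#include "Engine/Engine.h"
#include "Engine/World.h"

// Turns on a couple of the audio debug cvars from code instead of the console.
void EnableAudioDebugging(UWorld* World)
{
    if (GEngine && World)
    {
        GEngine->Exec(World, TEXT("au.Debug.Sounds 1"));
        GEngine->Exec(World, TEXT("au.3dVisualize.ActiveSounds 1"));
    }
}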
These steps might help you pinpoint and resolve the audio streaming issue.
Additionally, for your next creative project, let me introduce Vagon Streams, an innovative streaming technology that could streamline your game development and pixel streaming process.
At Vagon Streams, we offer effortless streaming for applications on any device, with no code configuration. This means you can make any application globally accessible without complex configurations. Just activate the plugin, and you’re ready to stream!
You don't just pixel stream on Vagon; you get a powerful cloud-based solution that adapts to your users' network conditions, ensuring a low-latency experience and instant launch for up to 10,000 concurrent users. Plus, you can stream your application in 4K & 60 FPS from more than 25 regions globally with just one click.
If you would like to do a test drive, please let me know. Or, you can create your account and easily start your first Stream anytime you want.
This temporary workaround was found on the NVIDIA Audio2Face forum:
Open Plugins/ACEUnrealPlugin-5.3/ACE/Source/OmniverseLiveLink/Private/OmniverseSubmixListener.cpp
in the plugins folder of your UE app.
Find the code:
FAudioDeviceParams MainDeviceParams;
MainDeviceParams.Scope = EAudioDeviceScope::Shared;
MainDeviceParams.bIsNonRealtime = false;
//MainDeviceParams.AssociatedWorld = GWorld;
//MainDeviceParams.AudioModule = AudioModule;
AudioDeviceHandle = AudioDeviceManager->RequestAudioDevice(MainDeviceParams);
You need to comment out those two lines as shown and recompile the plugin. With AssociatedWorld and AudioModule left unset, the listener requests the engine's shared main audio device, which appears to be why the audio then reaches the output that Pixel Streaming captures (see the note about the default submix below).
This works for an audio stream with a 48 kHz sample rate.
Hi there!
I know this is coming late to the discussion, but an important thing to note about Pixel Streaming audio is that all audio for PS is captured from the engine's default audio submix.
Thus, if you’re attempting to play audio through any other mechanism that does not go through the default submix, it will not be streamed.
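To make that concrete, one way to guarantee that externally received audio (such as the A2F LiveLink stream) reaches the default submix is to re-play it through a normal engine sound. This is only a rough sketch under my own assumptions; hand-feeding raw 16-bit PCM this way and the function name are mine, not part of the A2F plugin or the Pixel Streaming API.

#include "Components/AudioComponent.h"
#include "Kismet/GameplayStatics.h"
#include "Sound/SoundWaveProcedural.h"

// Plays raw 16-bit PCM as a regular 2D sound so it routes through the
// engine's default submix, which is what Pixel Streaming captures.
UAudioComponent* PlayPcmThroughDefaultSubmix(UObject* WorldContext,
    const uint8* PcmData, int32 NumBytes, int32 SampleRate, int32 NumChannels)
{
    USoundWaveProcedural* Wave = NewObject<USoundWaveProcedural>();
    Wave->SetSampleRate(SampleRate);
    Wave->NumChannels = NumChannels;
    Wave->bLooping = false;

    // Queue the PCM; the audio render thread consumes it as the sound plays.
    Wave->QueueAudio(PcmData, NumBytes);

    return UGameplayStatics::SpawnSound2D(WorldContext, Wave);
}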