Windows 10
Unreal Engine 5.1 from the Epic Games Launcher
Creating a new Game → First Person template project.
Adding a simple blueprint with Audio Capture → Start (on BeginPlay) to the level.
When I then start the game locally, I can hear myself: the microphone recording is reproduced as sound in the game.
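For reference, the Blueprint above is roughly equivalent to the C++ sketch below (actor and variable names are placeholders; it assumes the AudioCapture module is listed in the project's Build.cs):

```cpp
// MicTestActor.h -- rough C++ equivalent of the "Audio Capture -> Start (on BeginPlay)" Blueprint.
// Assumes "AudioCapture" is added to PublicDependencyModuleNames in the project's .Build.cs.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "AudioCaptureComponent.h"
#include "MicTestActor.generated.h"

UCLASS()
class AMicTestActor : public AActor
{
	GENERATED_BODY()

public:
	AMicTestActor()
	{
		// The capture component records the local microphone and plays it back in-game.
		AudioCapture = CreateDefaultSubobject<UAudioCaptureComponent>(TEXT("AudioCapture"));
		RootComponent = AudioCapture;
	}

protected:
	virtual void BeginPlay() override
	{
		Super::BeginPlay();
		if (AudioCapture)
		{
			AudioCapture->Start(); // same as the Blueprint "Start" node on BeginPlay
		}
	}

	UPROPERTY(VisibleAnywhere, Category = "Audio")
	TObjectPtr<UAudioCaptureComponent> AudioCapture;
};
```

Dropping an instance of this actor into the level does the same thing as the Blueprint described above.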
Engine → Input → Always Show Touch Interface = checked
Engine → Input → Default Cursor Class Name = None
Engine → Input → Capture Mouse on Launch = unchecked
Packaging for Windows in Development configuration
In the packaged folder, launching \Windows\PS_Mic_Test_5_1\Samples\PixelStreaming\WebServers\get_ps_servers.bat
In the config \Windows\PS_Mic_Test_5_1\Samples\PixelStreaming\WebServers\SignallingWebServer\config.json, changing "UseHTTPS": false to "UseHTTPS": true
Adding self-signed (openssl) SSL certs into \Windows\PS_Mic_Test_5_1\Samples\PixelStreaming\WebServers\SignallingWebServer\certificates\
Signalling server runs correctly on 443 with https enabled.
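For reference, after that change the relevant part of SignallingWebServer\config.json looks roughly like the sketch below. Only UseHTTPS actually needs editing; the other key names are from memory of the 5.1 defaults (matching the port numbers used elsewhere in this thread), so check them against your generated file:

```json
{
  "UseHTTPS": true,
  "HttpsPort": 443,
  "StreamerPort": 8888
}
```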
Running the game:
\Windows\PS_Mic_Test_5_1.exe -PixelStreamingPort=8888 -PixelStreamingIP=127.0.0.1 -renderOffscreen
After connecting from the web and setting the Microphone parameter to "On" (?useMic=true), the web page asks for access to the microphone. But when I play via the web browser, I cannot hear myself talking.
I can hear shots in the game, but the microphone audio does not seem to be transferred from the web browser into the game.
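For completeness, the browser side of the test is just the signalling page over HTTPS with the mic parameter appended (the address assumes the same local setup as above):

```
https://127.0.0.1/?useMic=true
```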
Thank you for your inquiry; hopefully my answers can clarify a few things.
The Audio Capture component you are using in the blueprint is not related to the microphone side of Pixel Streaming. To be able to pass the microphone data from the stream back to UE, you need to add the Pixel Streaming Audio component to your scene. You can essentially attach it to any asset or actor in the scene.
Once you've done that, you can adjust its settings, say, to pick a particular player to listen to, but the default configuration will work out of the box: it will simply default to the first peer it can hear.
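For anyone wiring this up in C++ rather than in the editor, a minimal sketch is below. It assumes the Pixel Streaming plugin is enabled and the PixelStreaming module is listed in your Build.cs; the property names PlayerToHear and bAutoFindPeer come from the plugin's audio component header and may differ slightly between engine versions.

```cpp
// MicListenerActor.h -- plays back microphone audio received from a Pixel Streaming peer.
// Assumes the Pixel Streaming plugin is enabled and "PixelStreaming" is a dependency in .Build.cs.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "PixelStreamingAudioComponent.h"
#include "MicListenerActor.generated.h"

UCLASS()
class AMicListenerActor : public AActor
{
	GENERATED_BODY()

public:
	AMicListenerActor()
	{
		// The component receives the browser's microphone stream and plays it through the engine's audio mixer.
		StreamMicAudio = CreateDefaultSubobject<UPixelStreamingAudioComponent>(TEXT("StreamMicAudio"));
		RootComponent = StreamMicAudio;

		// Default behaviour: latch onto the first peer whose audio arrives.
		StreamMicAudio->bAutoFindPeer = true;

		// To listen to one specific player instead, disable auto-find and set its id:
		// StreamMicAudio->bAutoFindPeer = false;
		// StreamMicAudio->PlayerToHear = TEXT("SomePlayerId");
	}

protected:
	UPROPERTY(VisibleAnywhere, Category = "Pixel Streaming")
	TObjectPtr<UPixelStreamingAudioComponent> StreamMicAudio;
};
```

Dropping this actor into the level (or adding the component to an existing actor in the editor, as described above) is all that is needed for the default case.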
You are ready to roll. Start streaming your application, enable the microphone in the settings panel by toggling the Use microphone setting, and then restart the stream; you should now be able to hear your own playback.
Note that all this component does is pass the audio data from Pixel Streaming back to UE (which is why you can hear yourself). If you need to pass this audio stream on to another plugin or use it for any other purpose, you will need to customize the application.
So is it enough to use AudioCapture together with the PixelStreamingAudioComponent, or do I need to implement my own version of PixelStreamingAudioComponent to record an audio file using the Pixel Streaming audio source?
I am also having a difficult time understanding how to capture microphone input from a Pixel Streaming client. I tried to make a simple UE 5.2.1 project for testing audio capture based on this YouTube video, and tried to add the PixelStreamingAudio component in the same way he added the AudioCapture component. Locally, with my test project, the volume of the microphone input scales the size of the sphere, but when using the Pixel Streaming client the microphone input does not affect the scale of the corresponding sphere. Clearly I am missing something; I'm just not sure what. I am including the link to an archive of the project. Any help is greatly appreciated. Thanks!
Hi!
I don't know if you guys still have this issue, but if you are putting your application on a cloud server, you need an HTTPS connection so the browser can use your microphone (besides using the Pixel Streaming Audio component mentioned above). The video below shows exactly how to create the SSL certificate for HTTPS.
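In case the video link goes dead: a self-signed pair can be generated with a standard openssl one-liner and dropped into the SignallingWebServer\certificates\ folder mentioned earlier. The output file names client-key.pem and client-cert.pem are what the stock signalling server looks for as far as I recall, so double-check them against your server's config and scripts:

```
openssl req -x509 -newkey rsa:4096 -nodes -days 365 -keyout client-key.pem -out client-cert.pem -subj "/CN=localhost"
```

On Windows this can be run from Git Bash or any environment that has openssl on the PATH.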