How can I output a few SceneCaptures (or any cameras) as individual network video streams?

Hi everyone, I’ve got a simulation project at work I’m unsure how to tackle.

It’s an Unreal Engine scene in which a few cameras capture what they see.
There is one hard requirement that I don’t know how to meet: These cameras need to individually output their video signal on the local IP network. (In essence, UE should simulate a bunch of IP cameras)

Here’s a few remarks:

  • The video stream protocol is flexible, as long as it’s IP-based: RTMP, RTP, RTSP, WebSocket, or even just raw UDP is fine.
  • System performance is not a concern here. For instance, we could imagine five cameras capturing a very cheap scene in 360p at 10 FPS.
  • Each camera may have a different resolution and framerate. For example, a first one at 320×240 / 15 FPS and a second one at 480p / 10 FPS.
  • Outputting all cameras in a split-screen fashion in one video stream is a big no. I need each camera to have a dedicated video stream at a matching framerate. (A less-than-ideal fallback would be to incur the cost of a post-process that decodes the split-screen stream, splits it up, and re-encodes each camera’s stream.)
  • I can only spare one computer for this. (I can’t afford to set up one computer/UE instance per camera, and then sync all UE instances.)

I’ve looked into the Media Framework, but it doesn’t seem to support outputting render targets as network streams.

I’ve also looked into the Pixel Streaming plugin, but I see no option to configure multiple streams from different cameras/capture components.

Has anyone got any guidance or clues for how to set this up?

The solution is GStreamer, and someone has made an Unreal Engine integration of it:

Using this, I’ve managed to stream a few render textures over the network, each as its own stream.
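For anyone trying the same route, here is a minimal sketch of the GStreamer side, assuming an RTP/H.264 stream on UDP port 5000 (port, codec, and host are my choices for illustration, not anything mandated by the integration). The first pipeline receives and displays one camera’s stream; the second is a standalone loopback sender using GStreamer’s built-in test source, handy for validating the receiver before involving UE at all:

```shell
# Receiver (any machine on the LAN): depayload, decode, and display one stream.
# The caps on udpsrc tell GStreamer what the raw RTP packets contain.
gst-launch-1.0 udpsrc port=5000 \
    caps="application/x-rtp, media=video, encoding-name=H264, payload=96" \
  ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink

# Loopback test sender (no UE needed): a synthetic 320x240 @ 15 FPS source,
# H.264-encoded with low-latency settings, payloaded as RTP, sent over UDP.
gst-launch-1.0 videotestsrc \
  ! video/x-raw,width=320,height=240,framerate=15/1 \
  ! x264enc tune=zerolatency ! rtph264pay pt=96 \
  ! udpsink host=127.0.0.1 port=5000
```

One receiver pipeline per camera (each on its own port) matches the “one dedicated stream per camera” requirement, and each pipeline can carry its own resolution and framerate caps.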