Best approach to receive a frame as a raw buffer from a plugin (video) and display it on a UMG Image widget

I have a plugin integrated in my UE4 project that wraps a DLL receiving video frames from another piece of software (WebRTC is behind this). Now I want to display the received frames on a UMG Image widget.

The received frames are already decoded. What I receive is a `uint8*` buffer plus the width and height.

I'd like to know what your approach would be, and which classes you would use to make this work in real time.

Thanks in advance

You can create a Render Target (`UTextureRenderTarget2D`), populate it from your buffer on the render thread, and reference it from a UMG Image widget, for example.
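The setup side could look roughly like this, a sketch assuming a widget with an Image named `VideoImage` (that name, and creating the target in the widget class, are assumptions — adapt to wherever your plugin lives):

```cpp
// Create the render target once, sized to the incoming video.
// PF_B8G8R8A8 matches the BGRA8 buffer used in the update code below.
UTextureRenderTarget2D* VideoTarget = NewObject<UTextureRenderTarget2D>(this);
VideoTarget->ClearColor = FLinearColor::Black;
VideoTarget->InitCustomFormat(Width, Height, PF_B8G8R8A8, /*bForceLinearGamma=*/false);
VideoTarget->UpdateResourceImmediate(true);

// Point the UMG Image's brush at the render target. UTextureRenderTarget2D is
// not a UTexture2D, so use SetBrushResourceObject rather than SetBrushFromTexture.
VideoImage->SetBrushResourceObject(VideoTarget);
```

You could equally reference the render target from a material and call `SetBrushFromMaterial`; the direct brush resource route just has fewer moving parts.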

For reference:

```cpp
// Runs on the render thread. RTResource is the target's FTextureRenderTargetResource,
// BGRA8 is a TArray<uint8> holding one frame in B8G8R8A8 layout (Width * Height * 4 bytes).
ENQUEUE_RENDER_COMMAND(WebRTCPopulateRTT)(
    [RTResource, Buffer = MoveTemp(BGRA8), Width, Height](FRHICommandListImmediate& RHICmdList)
    {
        const FTextureRHIRef& TextureRHI = RTResource->GetRenderTargetTexture();

        // Make the texture writable as a copy destination.
        RHICmdList.Transition(
            FRHITransitionInfo(TextureRHI, ERHIAccess::Unknown, ERHIAccess::CopyDest));

        // Update the whole texture; the source pitch is Width * 4 bytes for BGRA8.
        const FUpdateTextureRegion2D Region(0, 0, 0, 0, Width, Height);
        RHIUpdateTexture2D(TextureRHI, 0, Region, Width * 4, Buffer.GetData());

        // Transition back so the texture can be sampled as a shader resource.
        RHICmdList.Transition(
            FRHITransitionInfo(TextureRHI, ERHIAccess::CopyDest, ERHIAccess::SRVMask));
    });
```