Hello everyone,
I am looking for some pointers to better find my way around the engine source.
I am trying to implement an API provided by a different application that allows me to transfer a rendered frame (or any texture, really) so it can be displayed together with some other 3D content.
I have already had some success by modifying OpenGLDrv, hooking into RHIBeginDrawingViewport and RHIEndDrawingViewport.
But I would like to make this less hacky, and in the editor I would like to transfer only the scene viewport instead of the entire window.
The required steps look something like this (sorry that I cannot be more specific, but the API is not officially released):
// All calls assume an active OpenGL context in the OpenGL case.
// Initialize with the native device/context.
Init(dxDevice / glContext);
while (1)
{
    // LockFrame() also returns a delta time, which I might want to set as
    // the global delta time (UEngine::UpdateTimeAndHandleMaxTickRate?).
    frame = LockFrame();

    // ... do rendering ...

    // Copy the rendered contents into the provided textures.
    Blit(renderbuffer, frame.colorTexture);
    Blit(depthRenderbuffer, frame.depthTexture);
    UnlockFrame();
}
// Clean up resources.
Exit();
I am aware of FRHICustomPresent, which might be appropriate for this, but I am not sure where I am supposed to assign it to the renderer.
Another idea was to implement this as an HMD plugin with stereo rendering disabled, since the use case is very similar.
If someone with more knowledge of the engine source could share some opinions, that would be great.