OK, this is a terrible idea, but I want to try it as a proof of concept, as it will really help us in the long run.
We are currently uplifting 5 years of development of our game from our in-house engine (Kee Engine) into UE4, but it is a long process. Our game is very modular: a Design/Sandbox suite, and a Tycoon game component that uses the designs/data from the design suites. The design suites are in our in-house engine. Five years ago was a different world, with no UE4, and Unity only just taking its first steps towards supporting development on Windows. We want our game to have long legs, hence the port to UE4.
Anyway, the in-house engine is DX9 based. From Vista onwards, the DirectX APIs give you the ability to create shared resources between DirectX devices (even between DX9 and DX11). See this page for details: ID3D11Device::OpenSharedResource (d3d11.h) - Win32 apps | Microsoft Docs
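To spell out the handshake I have in mind (a rough sketch only; the DX9 side has to be a D3D9Ex device for the shared handle to work, and the variable names are just placeholders):

// Kee Engine (D3D9Ex) side: create the render target texture as shared and keep the handle.
HANDLE SharedHandle = nullptr;
IDirect3DTexture9* D3D9Texture = nullptr;
HRESULT hr = D3D9ExDevice->CreateTexture(
    Width, Height, 1,
    D3DUSAGE_RENDERTARGET, D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT,
    &D3D9Texture, &SharedHandle);

// UE4 (D3D11) side: open the same resource through the shared handle.
// The DX9 A8R8G8B8 surface shows up as DXGI_FORMAT_B8G8R8A8_UNORM on the DX11 side.
ID3D11Texture2D* D3D11Texture = nullptr;
hr = D3D11Device->OpenSharedResource(
    SharedHandle, __uuidof(ID3D11Texture2D),
    reinterpret_cast<void**>(&D3D11Texture));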
All game logic is in Lua; it is just data models of real-world items. It is the only data that has to pass between the in-house engine world and UE.
So, if you have put 2 and 2 together by now: I want to effectively build our in-house game engine as a plugin inside UE4, rendering into a RenderTarget that I can present to the screen (preferably via a Widget), for the designer aspects only. We can then continue to develop the new tycoon aspects while also uplifting the design suites, until we can finally drop the in-house engine.
Now, thanks to the various plugins (such as Oculus), I have a good example of how to get at the D3D11Device, create a texture on the device, etc. What I am at a bit of a loss about is how to get this into a UTexture2D or similar, to use on an Image Widget.
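My guess for the widget end (and it is only a guess; SharedTexture and MyImageWidget are placeholder names) is that once there is a UTexture-derived object wrapping the shared surface, it can be handed to a UImage through a Slate brush, roughly:

// SharedTexture: whatever UTexture-derived object ends up wrapping the shared surface.
// MyImageWidget: a UImage* in the designer UI.
FSlateBrush Brush;
Brush.SetResourceObject(SharedTexture);
Brush.ImageSize = FVector2D(1280.0f, 720.0f); // size of the shared render target
MyImageWidget->SetBrush(Brush);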
Does anyone know where to look, class- and function-wise? What would the call/function flow look like? Could an Epic staffer give me a run-down?
From what I can tell, a UTexture is a 'front end' asset-tracking object for the game/editor. It is partnered with an FTextureResource, which is what references the texture on the renderer (RHI). Am I on the right track? Tracing calls and such, this starts to go down the rabbit hole.
My steps should be something like:
Derive my own type from UTextureRenderTarget2D, which implements its own CreateResource.
Derive my own type from FTextureRenderTarget2DResource, which implements a new InitDynamicRHI.
This is the scary part: the function RHICreateTexture2D wraps executing queued commands on the RHI, and there is no wrapped command for something as specific as D3DDevice->OpenSharedResource. Since I only want to create one (or maybe two, in a double buffer/swap chain setup) shared RenderTarget, is there a way to bypass this command queue and do something a bit like this:
// Presumably this would need to run on the rendering thread, with the D3D11 RHI active,
// and needs the D3D11RHI module's private headers for FD3D11DynamicRHI.
FD3D11DynamicRHI* D3D11RHIPtr = static_cast<FD3D11DynamicRHI*>(GDynamicRHI);
ID3D11Device* DevicePtr = D3D11RHIPtr->GetDevice();
// Lock device
// Create shared resources (OpenSharedResource on the handle exported by the DX9 side)
// Unlock device
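To make those steps concrete, the shape I am picturing is roughly the following (very much a sketch: the class names are mine, FTextureRenderTarget2DResource's constructor may not even be reachable from outside the engine module, and the "missing piece" comment marks exactly the part I don't know how to do):

// Sketch only -- UCLASS()/GENERATED_BODY() boilerplate omitted.
class FKeeSharedRenderTargetResource : public FTextureRenderTarget2DResource
{
public:
    FKeeSharedRenderTargetResource(const UTextureRenderTarget2D* InOwner, HANDLE InSharedHandle)
        : FTextureRenderTarget2DResource(InOwner)
        , SharedHandle(InSharedHandle)
    {}

    virtual void InitDynamicRHI() override
    {
        // Reach under the RHI abstraction -- the 'bypass' I am asking about.
        FD3D11DynamicRHI* D3D11RHI = static_cast<FD3D11DynamicRHI*>(GDynamicRHI);
        ID3D11Device* Device = D3D11RHI->GetDevice();

        ID3D11Texture2D* SharedD3DTexture = nullptr;
        Device->OpenSharedResource(SharedHandle, __uuidof(ID3D11Texture2D),
            reinterpret_cast<void**>(&SharedD3DTexture));

        // Missing piece: wrap SharedD3DTexture into TextureRHI / RenderTargetTextureRHI
        // so the rest of UE4 treats it like any other render target. This is what I
        // can't find a sanctioned path for.
    }

private:
    HANDLE SharedHandle;
};

class UKeeSharedRenderTarget : public UTextureRenderTarget2D
{
public:
    HANDLE KeeSharedHandle; // shared handle exported by the DX9 side

    virtual FTextureResource* CreateResource() override
    {
        return new FKeeSharedRenderTargetResource(this, KeeSharedHandle);
    }
};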
The other problem I need to contend with is where/how to put in a hook to control when to let the Kee Engine do its rendering, before UE4.
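My current thinking for that hook (again a sketch; FKeeEngineBridge and RenderDesignSuiteFrame are made-up names) is to enqueue a render command from the game thread each tick, so the Kee Engine draws into the shared surface on the rendering thread before UE4 samples it; some explicit GPU sync (flush/queries on the DX9 side) will presumably still be needed, since shared surfaces don't synchronise themselves:

// Called from the game thread (e.g. the plugin's per-frame Tick). KeeBridge is a
// hypothetical object that owns the DX9 device and the shared surface.
ENQUEUE_UNIQUE_RENDER_COMMAND_ONEPARAMETER(
    KeeEngineRenderCommand,
    FKeeEngineBridge*, Bridge, KeeBridge,
    {
        // Have the DX9 side render the design suite into the shared surface,
        // then flush so the DX11 side sees a completed frame.
        Bridge->RenderDesignSuiteFrame();
    });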
It would be awesome to have some sort of overall systems diagram of UE4: what threads exist, what data/classes live where, and how access/communication is marshalled between them.
Any help would be greatly appreciated.
Regards, Caswal