Using an external application's frame buffer as a texture

I am investigating the possibility of using the frame buffer of a second DirectX process as a texture within Unreal.
The research I have done so far suggests it’s possible to hook that application’s DirectX API calls and come away with a copy of its frame buffer as a D3D surface.
However, once I have this, I am not sure how to turn it into a texture usable within Unreal.
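For reference, the usual D3D9 capture approach is to hook IDirect3DDevice9::Present (e.g. via a vtable patch or a detouring library) and grab the back buffer there. Below is a minimal sketch, assuming the hook is already installed; `oRealPresent` and `g_capturedFrame` are names I’ve made up for this example, not from any library:

```cpp
// Hypothetical sketch: a hooked Present() in a D3D9 target application
// that copies the back buffer each frame. Installing the hook itself
// (vtable patch, Detours, etc.) is omitted; oRealPresent stands in for
// the original function pointer saved by the hook installer.
#include <d3d9.h>

typedef HRESULT (WINAPI *PresentFn)(IDirect3DDevice9*, const RECT*,
                                    const RECT*, HWND, const RGNDATA*);

PresentFn oRealPresent = nullptr;             // original Present, saved when hooking
IDirect3DSurface9* g_capturedFrame = nullptr; // system-memory copy of the latest frame

HRESULT WINAPI HookedPresent(IDirect3DDevice9* device, const RECT* src,
                             const RECT* dst, HWND wnd, const RGNDATA* dirty)
{
    IDirect3DSurface9* backBuffer = nullptr;
    if (SUCCEEDED(device->GetBackBuffer(0, 0, D3DBACKBUFFER_TYPE_MONO, &backBuffer)))
    {
        D3DSURFACE_DESC desc;
        backBuffer->GetDesc(&desc);

        // Create a lockable system-memory surface matching the back buffer (once).
        if (!g_capturedFrame)
            device->CreateOffscreenPlainSurface(desc.Width, desc.Height, desc.Format,
                                                D3DPOOL_SYSTEMMEM, &g_capturedFrame,
                                                nullptr);

        // GetRenderTargetData copies GPU -> system memory; this is the
        // expensive readback you would want to avoid if possible.
        if (g_capturedFrame)
            device->GetRenderTargetData(backBuffer, g_capturedFrame);

        backBuffer->Release();
    }
    return oRealPresent(device, src, dst, wnd, dirty);
}
```

Note that GetRenderTargetData stalls the pipeline while it pulls the frame into system memory, which is exactly the copy cost worth designing around.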

My thought was to extend the built-in texture class to handle this, but since Unreal’s texture interface is API-agnostic, with no direct reference to DirectX,
it seems I would be converting the surface to a generic format only for the engine to convert it back again - all of which costs valuable ticks.

I realise this is quite an ambitious thing to try to do, especially as I am new to Unreal (I have been looking at Unity too, but its sealed classes stop me from doing something similar).
I doubt anyone has done this exact thing, but if anyone with any remotely relevant experience would be willing to give their advice, I could really use it.

I haven’t started implementing anything yet, as I’m not sure it’s even possible; I’m just hoping to gather enough information to give me reasonable confidence that this can work, and to formulate a workable plan.

Thanks

This kind of thing is done in the Coherent and Radiant UI toolkits, as that’s how they get an HTML-based UI to render. You may want to take a look at the code in Radiant (as it’s open source) and see what they do.

The performance is decent, but I’m not sure what you’re trying to accomplish.

Thanks for your reply, but both of those toolkits seem to be aimed at web technologies, and as I said, I intend to hook into games or 3D applications that use DirectX.
Radiant essentially renders a web page to a texture, so it has nothing to do with hooking into DirectX.
It does look like a useful tool for building UIs in HTML, but I don’t think it helps with my problem.

To clarify: I want a DirectX-based application running in parallel with UDK, with some custom code hooking into that second application, obtaining a copy of its frame buffer, and letting me use it as a texture within UDK, preferably with low latency (so ideally I don’t want to be copying it over and over, particularly between system and GPU memory). Eventually, I’d like that external application to produce a stereoscopic image that I can split in half, using one half for each eye in VR; but that’s a secondary goal.
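One way to avoid the system-memory round trip, assuming both processes run on the same adapter and can use D3D9Ex (or D3D10/11 via DXGI), is shared surfaces: a texture created with a shared HANDLE in one process can be opened by another, so the frame never leaves GPU memory. A rough sketch follows; the function names are my own, and actually passing the HANDLE value between the two processes (e.g. over a named pipe) is omitted:

```cpp
// Hypothetical sketch of a cross-process shared texture with D3D9Ex.
// Producer side: create a shareable render-target texture once, then
// StretchRect the back buffer into it every frame, entirely on the GPU.
#include <d3d9.h>

IDirect3DTexture9* CreateSharedTarget(IDirect3DDevice9Ex* device,
                                      UINT width, UINT height,
                                      HANDLE* outShared)
{
    IDirect3DTexture9* tex = nullptr;
    *outShared = nullptr;
    // Passing a HANDLE* whose value is NULL asks D3D9Ex to create a
    // new shareable resource and return its handle in *outShared.
    device->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
                          D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &tex, outShared);
    return tex;
}

// Consumer side (the UDK-side process): passing the same HANDLE value
// back into CreateTexture opens the existing GPU resource rather than
// allocating a new one.
IDirect3DTexture9* OpenSharedTarget(IDirect3DDevice9Ex* device,
                                    UINT width, UINT height, HANDLE shared)
{
    IDirect3DTexture9* tex = nullptr;
    device->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
                          D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &tex, &shared);
    return tex;
}
```

Whether you can get UDK to wrap that opened texture without an extra conversion is the open question, but at least the inter-process hop itself need not touch system memory.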