Utilizing Unreal Alongside Another External Application

So, I have a fairly unique system requirement that I am unsure how to handle.

Right now, I have my Unreal application (which uses nDisplay, but I don’t think that matters for this interaction) running across multiple monitors. I also have a touch screen device plugged in, and a different application runs on that touch screen. Both the Unreal application and the touch screen application connect to an external (non-Unreal) application that acts as a data server.

The problem I am running into is that setting the input mode to GameOnly keeps mouse input locked to the Unreal application, which I need because there are user interactions inside the Unreal application itself.
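For reference, this is roughly how the input mode is set today (a minimal sketch; how you get hold of the player controller depends on your setup):

```cpp
// Minimal sketch: keep the mouse captured by the game viewport (my current setup).
#include "GameFramework/PlayerController.h"

void ApplyGameOnlyInput(APlayerController* PlayerController)
{
	FInputModeGameOnly InputMode;
	InputMode.SetConsumeCaptureMouseDown(true); // the viewport captures the mouse on click
	PlayerController->SetInputMode(InputMode);
	PlayerController->bShowMouseCursor = true;  // cursor stays visible for the in-app interactions
}
```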

But when I touch the touch screen, the Unreal application doesn’t relinquish focus: the mouse stays stuck inside the Unreal application, and the mouse down/up events are delivered to Unreal instead.

So, I attempted to resolve this by subscribing via FWindowsApplication::AddMessageHandler, registering a custom IWindowsMessageHandler, and then calling the Windows API RegisterRawInputDevices to receive raw input, which I believe arrives before Unreal has processed any input.
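Here is a minimal sketch of the hook-up. FTouchScreenMessageHandler and RegisterTouchScreenHandler are my own names (not engine types), and exactly where you call the registration from will vary:

```cpp
#include "Framework/Application/SlateApplication.h"
#include "Windows/WindowsApplication.h"
#include "Engine/Engine.h"
#include "Engine/GameViewportClient.h"
#include "Widgets/SWindow.h"

#include "Windows/AllowWindowsPlatformTypes.h"
#include <windows.h>
#include "Windows/HideWindowsPlatformTypes.h"

// Custom handler that FWindowsApplication calls for every Windows message
// before Slate does its own processing.
class FTouchScreenMessageHandler : public IWindowsMessageHandler
{
public:
	virtual bool ProcessMessage(HWND Hwnd, uint32 Msg, WPARAM WParam, LPARAM LParam, int32& OutResult) override;
};

static FTouchScreenMessageHandler GTouchScreenHandler;

void RegisterTouchScreenHandler()
{
	// The platform application behind Slate is an FWindowsApplication on Windows.
	FWindowsApplication* WindowsApp =
		static_cast<FWindowsApplication*>(FSlateApplication::Get().GetPlatformApplication().Get());
	WindowsApp->AddMessageHandler(GTouchScreenHandler);

	// Find the native HWND of the game window so raw input is delivered to it.
	HWND Hwnd = nullptr;
	if (GEngine && GEngine->GameViewport && GEngine->GameViewport->GetWindow().IsValid()
		&& GEngine->GameViewport->GetWindow()->GetNativeWindow().IsValid())
	{
		Hwnd = (HWND)GEngine->GameViewport->GetWindow()->GetNativeWindow()->GetOSWindowHandle();
	}

	// Ask Windows for WM_INPUT mouse messages so we see input before Slate turns it into events.
	RAWINPUTDEVICE Rid = {};
	Rid.usUsagePage = 0x01;        // HID_USAGE_PAGE_GENERIC
	Rid.usUsage = 0x02;            // HID_USAGE_GENERIC_MOUSE
	Rid.dwFlags = RIDEV_INPUTSINK; // deliver even when the window is not in the foreground
	Rid.hwndTarget = Hwnd;
	RegisterRawInputDevices(&Rid, 1, sizeof(Rid));
}
```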

From what I can read, returning true from this callback should mean Unreal does not process the message any further. So, when I see that the raw input is coming from the touch screen (identified by its vendor/product IDs), I switch the input mode to UIOnly so Unreal no longer locks the mouse to the application.
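Roughly, the handler body looks like this (continuing in the same file as the sketch above; the VID/PID string and the way the player controller is fetched are placeholders):

```cpp
#include "GameFramework/PlayerController.h"
#include "Engine/World.h"

bool FTouchScreenMessageHandler::ProcessMessage(HWND Hwnd, uint32 Msg, WPARAM WParam, LPARAM LParam, int32& OutResult)
{
	if (Msg != WM_INPUT)
	{
		return false; // not raw input: let Unreal process it normally
	}

	// Read the raw input packet referenced by this WM_INPUT message.
	UINT Size = 0;
	GetRawInputData((HRAWINPUT)LParam, RID_INPUT, nullptr, &Size, sizeof(RAWINPUTHEADER));
	TArray<uint8> Buffer;
	Buffer.SetNumUninitialized(Size);
	if (GetRawInputData((HRAWINPUT)LParam, RID_INPUT, Buffer.GetData(), &Size, sizeof(RAWINPUTHEADER)) != Size)
	{
		return false;
	}
	const RAWINPUT* Raw = reinterpret_cast<const RAWINPUT*>(Buffer.GetData());

	// The device interface name embeds the vendor/product IDs, e.g. "...VID_1234&PID_5678...".
	WCHAR DeviceName[256] = {};
	UINT NameLen = UE_ARRAY_COUNT(DeviceName);
	GetRawInputDeviceInfo(Raw->header.hDevice, RIDI_DEVICENAME, DeviceName, &NameLen);

	if (FString(DeviceName).Contains(TEXT("VID_1234&PID_5678"))) // placeholder IDs for the touch screen
	{
		// Input came from the touch screen: stop locking the mouse to the Unreal viewport.
		UWorld* World = (GEngine && GEngine->GameViewport) ? GEngine->GameViewport->GetWorld() : nullptr;
		if (APlayerController* PC = World ? World->GetFirstPlayerController() : nullptr)
		{
			FInputModeUIOnly UIMode;
			UIMode.SetLockMouseToViewportBehavior(EMouseLockMode::DoNotLock);
			PC->SetInputMode(UIMode);
		}
		OutResult = 0;
		return true; // consume it so Unreal doesn't synthesize a mouse event from it
	}

	return false;
}
```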

This generally seems to work: when I press the touch screen, I can see the mouse move over to the touch screen application. The problem is that, checking with Spy++, the external application never receives the initial mouse down that the first touch should produce, so the Unreal application never loses focus and the touch screen application never gains it.

From what I can tell, the touch screen’s vendor drivers make all of its input look like mouse input. If I uninstall those drivers and use the Windows digitizer drivers instead, the interaction behaves differently.

With the Windows drivers, Unreal immediately loses focus as soon as I touch anywhere on the touch screen, and the other program gains focus. However, using the Windows drivers creates issues for that external application, since it was built with the vendor drivers in mind.

So, my question: does anyone have experience handling input from an external device that the Unreal application is not running on, so that the input flows to whatever program is running on that device? Unreal almost seems built to never allow this kind of interaction, since GameOnly mode aggressively keeps the mouse within its viewport no matter what I do. And switching the input mode from the raw input handler seems to cause some Windows message to be consumed and never passed on to the external application, because Unreal never loses focus.

Using the Windows drivers seems to get around this, because the Unreal application is forced to lose focus before any other input is processed, but that in turn causes issues in the other application. We are currently attacking this from both ends and estimating how hard it would be to rewrite the touch screen application to work with the Windows drivers, but I wanted to ask whether anyone has experience getting a touch screen outside the Unreal application’s space to play nicely.

Any help would be greatly appreciated!