Touch input while keyboard is open

Hi there,

I’ve implemented a simple keyboard interface for iOS. It has a UIViewController with a UITextField, and callbacks into my UI code populate the text as you type. I’m trying to move toward drawing the text in the engine only, without the UITextField.

The text entry view only covers part of the screen, and I’d like to be able to interact with other elements in the scene (3D models, custom UI elements, etc.) while the keyboard is up. The problem is that touch events are not making it through to my Unreal PlayerController while the keyboard and my custom UIViewController are visible.

The game update and draw functions are being called as normal, just no touch events.

Anyone dealt with this before?

I’ve tried a few approaches, the most recent being a custom UIView for the UIViewController that overrides pointInside to pass touches through — and as far as I can tell it filters correctly.
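For reference, this is roughly the shape of that passthrough filter. The decision itself can live in a pure function (names like `shouldCaptureTouch` and `interactiveRects` are mine, not UIKit or engine API), with the `point(inside:with:)` override just delegating to it:

```swift
import Foundation  // CGPoint / CGRect

// Pure passthrough decision: capture the touch only when it lands inside
// one of the view's interactive sub-rects (e.g. the text entry strip);
// anything else falls through to the window behind, and so to the engine.
func shouldCaptureTouch(at point: CGPoint, interactiveRects: [CGRect]) -> Bool {
    return interactiveRects.contains { $0.contains(point) }
}

// In the UIView subclass it is wired up like this (UIKit, shown as a sketch):
//
//   override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
//       return shouldCaptureTouch(at: point, interactiveRects: interactiveRects)
//   }
//
// Returning false makes hit-testing skip this view entirely, so the touch
// is delivered to whatever view sits underneath.
```

Note that returning `false` from `point(inside:with:)` only controls UIKit hit-testing; it doesn’t by itself forward anything to Unreal’s input path.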

But still no touch events get to my PlayerController.

I’ve got my missing touch inputs being fed into FIOSInputInterface (the same way SlateOpenGLESView::HandleSlateAppTouches does it), but they’re still not coming through to the PlayerController.

Given that I process all my 2D UI events directly, I could probably handle this myself, but it seems better to have the system work as designed.

I’m handling my touch events directly (i.e. catching the iOS UIView events and passing them to my own input handling code).

For now I don’t need to click on actors, so it’ll work for a while.

I would still love to solve the original issue.

Separately, I’d like to put the app to sleep on a semaphore at times, and wake it when the user touches the screen — staying responsive while saving battery in states that don’t need updates.
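The sleep/wake idea could look something like this, as a minimal sketch using DispatchSemaphore. The names (`frameGate`, `wakeForEvent`, `runIdleLoop`) are illustrative, not engine API, and a real integration would have to hook the engine’s own frame loop rather than a plain `for` loop:

```swift
import Foundation

// Park the update loop on a semaphore while idle; signal it from the touch
// handler (or any other event source) to resume.
let frameGate = DispatchSemaphore(value: 0)

// Called from e.g. touchesBegan, a timer, or a push notification.
func wakeForEvent() {
    frameGate.signal()
}

// Runs up to `maxFrames` iterations; each iteration blocks until an event
// arrives, with a timeout so low-rate housekeeping can still tick while
// asleep. Returns how many frames were actually driven by events.
func runIdleLoop(maxFrames: Int) -> Int {
    var framesDrawn = 0
    for _ in 0..<maxFrames {
        let result = frameGate.wait(timeout: .now() + .milliseconds(50))
        if result == .success {
            framesDrawn += 1  // an event woke us: update and draw one frame
        }
    }
    return framesDrawn
}
```

The timeout matters: a pure blocking wait would also stall anything else the thread is responsible for, which is presumably why this needs engine cooperation (and possibly a source build) rather than sitting purely in app code.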

Will I need to build the engine from source for iOS to do this?