Controls in HTML5 for Mobile and Desktop simultaneously

Hey there,

I am fairly new to Unreal and starting to develop a small Gallery Exhibition of Photos in 3D.
So far I could easily build the 3D assets, place them etc., but I cannot wrap my head around how to make the project controllable from Desktop AND on mobile devices (Android & iOS).

Moving through the space should be possible with Virtual Joysticks (and WASD Keys for Desktop)

My problem is that in:

→ Project Settings → Engine → Input

I have set up the Virtual Joysticks to be Always Visible and have created some mappings for moving forward, backward, up, down, etc.

This made it possible to navigate the map in the desktop browser by clicking and dragging the Virtual Joysticks with the mouse, but in my smartphone browser I cannot click and drag them.

I have tried assigning “Touch 1” to the Main Input Keys of the DefaultVirtualJoysticks template, but unfortunately that did not work either.

Did I miss something here? Is it actually possible to use the Virtual Joysticks in an HTML5-packaged project in a smartphone browser?


There is nothing in the HTML5 template that hooks the webpage's touch events up to your project's wasm. As far as I know, it won't work without some hacks on the HTML5 page (like converting touches into fake keyboard inputs) or a lot of custom changes to your engine source code.
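To illustrate the "convert touch to fake keyboard inputs" idea, here is a rough sketch (my own, not from the template). It assumes the game canvas has the id 'canvas' and that the project binds WASD for movement; the helper function and thresholds are made up for illustration:

```javascript
// Pure helper: map a drag direction to one of the WASD keys.
// (Hypothetical mapping; adjust to match your project's input bindings.)
function keyForSwipe(dx, dy) {
  if (Math.abs(dx) > Math.abs(dy)) {
    return dx > 0 ? 'd' : 'a'; // horizontal drag -> strafe right/left
  }
  return dy > 0 ? 's' : 'w';   // vertical drag -> back/forward
}

// DOM wiring, guarded so the helper above also runs outside a browser.
if (typeof document !== 'undefined') {
  const canvas = document.getElementById('canvas');
  let start = null;

  canvas.addEventListener('touchstart', (e) => {
    start = { x: e.touches[0].clientX, y: e.touches[0].clientY };
  });

  canvas.addEventListener('touchmove', (e) => {
    e.preventDefault(); // stop the page from scrolling/zooming
    if (!start) return;
    const key = keyForSwipe(e.touches[0].clientX - start.x,
                            e.touches[0].clientY - start.y);
    canvas.dispatchEvent(new KeyboardEvent('keydown', { key }));
  });

  canvas.addEventListener('touchend', () => {
    start = null;
    // Release all movement keys so the pawn stops.
    for (const key of ['w', 'a', 's', 'd']) {
      canvas.dispatchEvent(new KeyboardEvent('keyup', { key }));
    }
  });
}
```

Whether the UE4 HTML5 input layer actually picks up synthetic KeyboardEvents like this would need testing; it is meant as a starting point, not a verified fix.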


I figured out that “clicking” with a finger works fine out of the box, e.g. for UMG buttons, but touch movement does not work by default…

Here is a hacky solution that gets Touch-Start, -Move, and -End events working in Widget Blueprints on mobile (touch) and desktop (mouse) HTML5 with UE4.23 (without engine-source modification; sorry, no multi-touch):

In UE_4.23\Engine\Build\HTML5\project_template.js, add this, e.g. at line 103:

const touchableElement = document.getElementById('canvas');
touchableElement.addEventListener('touchstart', registerTouch);

// First touch only: dispatch a mousedown and switch the canvas into touch mode.
function registerTouch(event)
{
	const mouseDownEvent = new MouseEvent('mousedown', {});
	event.target.dispatchEvent(mouseDownEvent);

	// Run once; from here on, the move/end handlers below take over.
	touchableElement.removeEventListener('touchstart', registerTouch);
	touchableElement.addEventListener('touchmove', handleTouchMove);
}

function handleTouchMove(event)
{
	// Suppress the browser's default scrolling/zooming while dragging.
	event.preventDefault();

	// Create and dispatch a corresponding mousemove event at the touch location.
	const mouseMoveEvent = new MouseEvent('mousemove', {
		clientX: event.touches[0].clientX,
		clientY: event.touches[0].clientY,
	});
	event.target.dispatchEvent(mouseMoveEvent);

	// Re-adding the same listener is a no-op, so this is safe to repeat.
	touchableElement.addEventListener('touchend', handleTouchEnd);
}

function handleTouchEnd(event)
{
	event.preventDefault();

	// Create and dispatch a corresponding mouseup event.
	const mouseUpEvent = new MouseEvent('mouseup', {});
	event.target.dispatchEvent(mouseUpEvent);

	touchableElement.removeEventListener('touchend', handleTouchEnd);
}

In the Widget Blueprint, override the OnMouseButtonDown, OnMouseMove, and OnMouseButtonUp functions, with their code like this:



The handleTouchMove and handleTouchEnd handlers override the default click behaviour and reset it when the finger is released. In the Blueprint we still need to skip the first move event (its move value is wrong), but we use it to manually process the MouseDown (since the default click behaviour is overridden).

The method above does not work on desktop (mouse), so we use the registerTouch handler to enable the touch behaviour only when a TouchStart has been detected in the JavaScript. For some reason, the first TouchStart event always reports the top-left corner of the game canvas for the MouseDown event (instead of the real touch location…). Since this always seems to be x = 7.163 (on Android mobile portrait and landscape, and on Windows Edge), we can exploit it to enable touch mode in the Blueprint only after a TouchStart has been triggered once (otherwise the default mouse behaviour stays as is).
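The marker check described above can be sketched like this (a hypothetical helper mirroring the Blueprint logic; the 7.163 value is what I observed, not an engine constant, so a small tolerance is safer than an exact comparison):

```javascript
// Observed x-coordinate of the bogus first MouseDown dispatched from a touch.
// This is an empirical value from testing, NOT a documented engine constant.
const TOUCH_MARKER_X = 7.163;

// Treat a mousedown near that x as the "enable touch mode" signal.
function isTouchModeMarker(clientX, tolerance = 0.01) {
  return Math.abs(clientX - TOUCH_MARKER_X) <= tolerance;
}
```

In the Blueprint the equivalent check would compare the MouseDown location against this marker and, on a match, flip a boolean that routes all further input through the touch path.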

Very weird, but so far it gets the job done…
