I’m quite new to UE and my question might be stupid, but I haven’t figured out a way to play an AR game without deploying it to a physical device (which takes a lot of time on Windows and a huge amount of time on macOS). Mobile Preview gives a reasonable error (AR session is missing/not configured); I am just wondering if there is a way to simulate an AR session within the editor.
Yes, you can. Not stupid at all; it saves a lot of time.
Install Unreal Remote 2 from the App Store on your mobile device. (Make sure it is version 2; version 1 is a separate app.)
From its instructions:
This streams the editor viewport to the device and sends device inputs to the editor.
The video streaming to the device is choppy even on my 6-core i7 over 5 GHz Wi-Fi, but it is still very useful. Inputs from the device to the editor, on the other hand, are very responsive.
For AR testing there is one more thing to set up so that the editor receives motion tracking input.
In the Config folder of your project, edit DefaultEngine.ini and add these lines:
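If memory serves, these are the RemoteSession channel settings also used by the Virtual Camera setup; treat this as an assumption and verify against the app description. The XRTracking channel is what forwards the device’s motion tracking to the editor:

```ini
; Assumed RemoteSession channel config (same as the Virtual Camera setup);
; verify against the Unreal Remote 2 app description.
[RemoteSession]
+Channels=(Name=FRemoteSessionFrameBufferChannel,Mode=Write)
+Channels=(Name=FRemoteSessionInputChannel,Mode=Read)
+Channels=(Name=FRemoteSessionXRTrackingChannel,Mode=Read)
```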
I don’t remember how I found this out; probably by poking around in the Virtual Camera example project in the Learn tab.
It worked for me with iOS devices connected to the UE4 editor on both Mac and PC.
How to deal with AR-related errors in the Editor:
This makes sense: there is no AR session running on the PC/Mac, so nodes like these will always fail. It is therefore useful to guard these functions during development by figuring out at runtime whether the game is running on a mobile device or on desktop, and setting a bool accordingly (IsOnDesktop or something like that). I used the Blueprint node Get Platform Name.
Hope this helps; we AR devs sure need any support we can get.
Read the app description for Unreal Remote 2 in the app store. I also pasted these above.
It says to choose “Play in New Viewport” in the Unreal Editor (even though it is actually called “New Editor Window”; the author seems not too familiar with the engine), not Simulate. Simulate is something different.
I have been taking the same remote testing approach (the Unreal Remote 2 app, the .ini edits, and the AR session check) with iOS app testing to save time deploying between a PC/Mac and an iPad. Many thanks for documenting the steps as a resource for all. I use this to test the UI elements in the app I am developing, but not the AR portion, since the AR session is not running. As a result, while the Remote 2 approach helps with some aspects of the iOS development process, I still need to deploy to the iPad to test the app’s AR functionality.
But if I have misunderstood something in the documentation, or if you happen to know a way to remotely test AR functionality during iOS development, I would be most grateful. I have been hoping for some progress toward a UE4-to-iOS remote session specifically for AR, much like HoloLens streaming via the Holographic Remoting Player, but I have not encountered it yet.
Again, thanks for outlining the documentation; it is very much appreciated.
With the remote app, iPad/iPhone movement drives the player camera in the UE4 level editor when playing in editor (PIE).
It is a way to move the player camera with physical motion, and all touch interaction works in the editor.
For example, a touch input with a hit test will work. Movement and interaction (collision with the player, if set up) are there, beyond just UI.
(Fast Wi-Fi and a fast CPU on the Mac/PC are highly recommended.)
AR camera footage from the device is of course missing. Rotation and panning are measured and sent to the editor using the gyro and accelerometer in the iOS device. It works surprisingly well, even though it is not an actual AR session.
No actual tracking data is sent to the editor, but it instantly gives you the feel and functionality to test many things. It is not perfect, but it goes quite far when debugging things that do not rely on the device’s AR camera footage.
Of course, all AR-related functions will return errors, so I guard these with a simple bool; see image:
It would be great if the Remote app actually started an AR session and sent session data and camera footage to the editor. That would be a great feature request!
But given how half-hearted and sluggish support has become since Epic’s enthusiastic promises for mobile AR early last year, I find this extremely unlikely.
So it’s great if we share tips for workflows.