Unreal Engine Livestream - Getting Started with Handheld AR - Nov 9 - Live from Epic HQ

WHAT
If you’ve been wanting to develop an AR project, but weren’t sure where to start, then this is the stream for you! Chance will give a brief overview of AR development and then he’ll dive into the ARSample project, expanding on some of the capabilities of the engine.

The ARSample project is available for download here.

WHEN
Thursday, November 9th @ 2:00PM ET

WHERE
Twitch
Youtube
Facebook

WHO
Chance Ivey - Partnership Manager, AR and VR - [@anonymous_user_c5ad40d01](http://twitter.com/iveytron)
Amanda Bott - Community Manager - [@amandambott](http://twitter.com/amandambott)

Pop any questions you have for the stream in the comments below!

ARCHIVE
Getting Started with Handheld AR | Live Training | Unreal Engine Livestream - YouTube

Hey, nice to see Chance in his new role!

Handheld AR means more focus on mobile development. This is good news for many people, including those developing for GearVR/Daydream!

QUESTION: Is there anything noteworthy (that you could tell us) on your roadmap? Because at present it is quite empty on these matters… Or is the product considered completely finished/polished? No offence/controversy intended; I am genuinely interested in the future of mobile dev with UE4!

Cheers!

Unfortunately, I will have to miss this presentation; will there be a link available soon after so I can watch it offline?

Hi,
I will be here.
I am searching for Handheld AR on Google but I can't find it.
Can we have the URL?

Or, if I understand correctly, the photo was taken with ARCore, which is only usable on the Samsung S8 and Pixel?
On the livestream, will you talk about ARCore or another plugin?

Cool !
Waited for something like this =)

They look like the objects you could use in Spore.

Cool - will this be on a Mac or PC?

Yep - Twitch will immediately let you watch the video online and I’ll attach the archived YouTube video to this post once that’s ready to go, usually within a day or two after it airs.

Unreal Engine supports ARCore and ARKit, for Android and Apple devices, respectively, and Chance will be discussing both of those.

Unfortunately, I packaged ARSample with ARCore into an Android APK, then installed it on my Galaxy S8+. The app runs but shows no camera image!! Does anyone know about this? Please tell me something.

Looking forward to the stream tomorrow. I do have a few questions that I would like to have addressed:

1.
Can you elaborate on why the Blueprint break method for the “ARKit Frame Structure” is not present in Unreal Engine 4.18? How are we supposed to get light estimation data? See this URL for a more detailed explanation:
https://answers.unrealengine.com/questions/714939/getcurrentframe-no-properties-output-on-arkit-in-4.html

2.
How can we start/stop the AR subsystem at will? I can only seem to launch the project with or without AR, no toggle option. Are there any commands we can use to accomplish this?

3.
On iOS, how can I detect whether the user has granted access to the camera?

4.
If the changes to accomplish the above things require me to compile from source instead of Blueprints only, would I need to compile the project on a remote Mac every time I need to make a build, or only when source code changes happen? (I really do not want to use a Mac every time I need to test a minor change.)
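Regarding question 2 above (toggling AR at runtime): in engine versions after 4.18, the unified AR framework exposes session start/stop to both C++ and Blueprints through `UARBlueprintLibrary`. A minimal sketch, assuming UE 4.19+ (the AugmentedReality module; these functions are not in 4.18) and a `UARSessionConfig` data asset created in the editor:

```cpp
// Sketch (UE 4.19+, AugmentedReality module): toggle the AR session at runtime.
#include "ARBlueprintLibrary.h"
#include "ARSessionConfig.h"

void ToggleARSession(UARSessionConfig* SessionConfig)
{
    // GetARSessionStatus reports whether a session is currently running.
    if (UARBlueprintLibrary::GetARSessionStatus().Status == EARSessionStatus::Running)
    {
        UARBlueprintLibrary::StopARSession();
    }
    else
    {
        // StartARSession takes the config asset (plane detection, light estimation, etc.).
        UARBlueprintLibrary::StartARSession(SessionConfig);
    }
}
```

The equivalent `Start AR Session` / `Stop AR Session` nodes are exposed to Blueprints, so a Blueprint-only project can toggle AR without touching C++.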

ARCore is not supported on the Galaxy S8+.

Supported Devices:

  • Google Pixel, Pixel XL, Pixel 2, Pixel 2 XL
  • Samsung Galaxy S8 (SM-G950U, SM-G950N, SM-G950F, SM-G950FD, SM-G950W, SM-G950U1)

Question(s):

  1. How much is left in terms of ARKit integration in the engine? 4.18 introduced the final Apple SDK, but compared to the beta in 4.17 it’s missing some vital functions exposed in Blueprints.
  2. Is there documentation on the way?
  3. Right now, developing for AR requires setting up levels etc. at a tiny scale (0.015 in my case); will there be any scaling modifier introduced in the future? As a side note, doing everything at a small scale makes things like navigation, collisions, and traces work poorly or not at all.

Just one question:

  1. Have you actually managed to successfully build an ARKit app and package it for distribution so it can be submitted to iTunes Connect and published in the App Store?

There are a number of issues preventing us from doing so with iOS 11, Xcode 9 and UE 4.18:

  • Missing icons in the asset catalog
  • Distribution certificate selection

It would be great if you could demonstrate the entire process (build a sample ARKit app, package it for distribution and submit it to Apple) in the livestream.

Any project sample for us to start or to mess with during the stream?

I too am very interested in the answer to this question. My studio is using 4.18 to create an ARKit app for distribution and it would be great to know that when the time comes, we can successfully submit to TestFlight/the App Store.

Awesome … looking forward to this one

Is it possible to interact with already placed items? For example to move them around, scale them or to start an animation?

Hi,
I don’t know if I missed the stream or if they are late?

Questions:

1 - Some platforms like Vuforia are integrating ARKit and ARCore plane detection with image targets and 3D object detection. Will Unreal’s native AR capabilities support image-based detection, or is there a third-party plugin that will provide this feature?

2 - What is involved in making an AR application cross-platform? Are the AR classes generic, utilizing either ARKit or ARCore depending on the device?

Thanks!
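For context on the cross-platform question above: in engine versions after this stream (4.19+), the AR classes are indeed generic; `UARBlueprintLibrary` and `UARSessionConfig` sit above whichever platform plugin (ARKit or ARCore) is active on the device, so the same code path runs on both. A hedged sketch, assuming UE 4.19+:

```cpp
// Sketch (UE 4.19+): the same call path runs on ARKit and ARCore devices;
// the active platform plugin services the generic AR interface.
#include "ARBlueprintLibrary.h"
#include "ARSessionConfig.h"

void StartWorldTrackingIfSupported(UARSessionConfig* SessionConfig)
{
    // EARSessionType::World is plane-tracking AR on both ARKit and ARCore.
    if (UARBlueprintLibrary::IsSessionTypeSupported(EARSessionType::World))
    {
        UARBlueprintLibrary::StartARSession(SessionConfig);
    }
}
```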

Yep! It’s this project here.