Hey Dear Unreal Team,
When you tell us that Unreal is made for AR, then this should not be an empty promise. I know you are busy with Fortnite, but when you say "the most powerful creation engine", it has to be able to handle AR as well. Not only HoloLens, but ARKit too. So how can we do object tracking in ARKit? It is the year 2020 now. Was this ever really implemented, or was it just marketing fluff? Because my company promised the client that AR works in Unreal, but it doesn't!
Same thoughts. Very relatable.
Learning Unreal Engine and integrating it into a workflow is not something done in a weekend.
It is a big commitment, based on trust that Epic will keep the full-bodied claims and promises it makes at GDC, WWDC and in its marketing.
There are people trying to build a career and/or customer base with Unreal Engine technology.
So naturally it creates frustration to be let down after investing so much time in Unreal Engine. And time is money. Start-ups and freelancers don't have Fortnite money as a backup.
Hopefully this explains the frustration, and no fanboys are triggered when I write:
After two years of frustration, the final verdict is: UE4 was betting on the wrong horse for Augmented Reality.
It hurts me to write that. It is otherwise a great engine that I loved working with.
Sad thing is, no one at Epic will read this. The only chance to get Epic to even talk to us is a $1,500 investment for access to the Unreal Developer Network.
But why would someone consider this if the AR plugins are outdated and incomplete, and so far every release of ARKit and ARCore support has been quite buggy?
For augmented reality developers there is no reasonable basis for acquiring a Custom License.
If you're not interested in the kit but would like help with object detection, here are the steps I used (ARKit only):
1 - Set your session type to "object scanning"
2 - Visualise the AR point cloud with debug points so you can see what ARKit is scanning
3 - Use the "Get AR candidate object" node to capture the object when scanning it with your device
4 - Save the captured object data to a slot on your device
5 - Use iTunes to access the save file on your device
6 - Load the save game from within the Blueprint construction script to set **public variables** in a Blueprint
7 - Drop the Blueprint into the world; the object data you scanned should now be visible in the Details panel
8 - Copy the data from the public variables
9 - Create an "ARCandidateObject" by right-clicking in the content browser, then /misc/data asset
10 - Paste all the data in here
11 - Add the ARCandidateObject to the AR session config
12 - Change the session type to world
13 - In your AR pawn Blueprint add these nodes: Event Tick → Get All AR Geometries → For Each Loop → Cast to AR Tracked Object → Get Detected Object → Get Friendly Name → Equals (the friendly name of your object) → Branch → (if true… whatever you want to happen when the object is detected)