What I'm after is a tutorial or explanation of how the following effect can be achieved:
I want a real-world image to trigger a Blueprint of a basic plane that shows up over that image.
Meaning, once I understand how to track a basic plane (which I assume involves somehow getting the world transform of the image), I could then use any image to spawn any sort of Blueprint, simply by attaching it to the transform 'anchor' of the image.
I feel as if this AugmentedImages template with the frames already has what I’m looking for within it - I just don’t know where I should be looking!
Appreciate it!
EDIT: I found this in the google document: “In your project, you can use the Blueprints or C++ function in Unreal’s Augmented Reality module to get all tracked geometry and try cast to GoogleARCoreAugmentedImage type. When an image is detected by ARCore, you will be able to get an instance for this GoogleARCoreAugmentedImage type. Then you can query the name, index, transform, and extent of it using the Blueprints or C++ function.”
If someone has done the above in Blueprint before, perhaps a screenshot to show how it's done? Thanks in advance.
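Until someone posts a Blueprint screenshot, here is a rough C++ sketch of what that passage seems to describe: get all tracked geometry, cast to the augmented-image type, and read back the name, index, and transform. This is untested; the helper function is hypothetical, and the class and function names (UARBlueprintLibrary::GetAllGeometries, UGoogleARCoreAugmentedImage) are as I understand them from the GoogleARCore plugin headers.

```cpp
#include "ARBlueprintLibrary.h"
#include "GoogleARCoreAugmentedImage.h"

// Hypothetical helper: returns true and fills OutTransform if an image with
// the given name is currently being tracked by ARCore.
bool GetTrackedImageTransform(const FString& ImageName, FTransform& OutTransform)
{
    // Query everything ARCore is tracking and look for augmented images.
    for (UARTrackedGeometry* Geometry : UARBlueprintLibrary::GetAllGeometries())
    {
        UGoogleARCoreAugmentedImage* Image = Cast<UGoogleARCoreAugmentedImage>(Geometry);
        if (Image != nullptr &&
            Image->GetImageName() == ImageName &&
            Image->GetTrackingState() == EARTrackingState::Tracking)
        {
            // World transform of the detected image.
            OutTransform = Image->GetLocalToWorldTransform();
            return true;
        }
    }
    return false;
}
```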
Can anyone help me with this? I have the same issue and have been scouring the net for any sort of tutorial or walkthrough, and after a week, still nothing. I've gotten as far as loading the images into the data asset as described in the tutorials linked above, but from there the tutorial ends abruptly. There is not enough explanation of where to import the 3D model/animation, how to link it to the images we imported, and so on. It's really frustrating: getting the required information has so far been a hunt across the internet, and I'm probably not even halfway there yet.
I haven't tried that, sorry. You could try swapping out the ARCandidateImage Data Asset for the GoogleARCoreAugmentedImageDatabase and see if you can get it to work that way using the same method.
Hello,
I have already tried the ARKit example with ARCore, and it works. The difference is that instead of using the Unreal Engine AR config data asset, it only works with the Google one. If I use the Google AR database tools I am able to detect the images; at least, that is what the show-planes debug mode shows. I am stuck at this step: I am not able to spawn anything. I can see that the app recognizes the image, but I don't know how to spawn the BP_Placeable. I was wondering if someone is willing to explain or show how to configure the placeable geometry properly.
I'm also looking for a tutorial. I've tried so many different methods, but I can't track any image.
Could anyone explain a method for image tracking with ARCore?
Hi guys
I need to create a simple augmented reality application.
I was able to make the phone camera recognize an image and display my 3D model.
But it appears at the 'World Origin' axes, which is where I started the session.
Does anyone know how I can make the 3D model appear where the image is?
I have a few questions. How can I set the actor's rotation to match the image?
I would like the actor to rotate if the image is rotated!
And another question, if it's possible: how can I add more than one actor, each spawning on a different image (previously inserted in the database)?
I'm using this template and I was able to spawn an animated model by importing the FBX and placing the mesh in an actor. Now I would like to spawn different models on different images, and have those models rotate if the physically placed image is rotated. Sorry for my English, but I cannot explain it better than this. Thank you in advance.
From the ARTrackedImage you should be able to query that tracked image's transform and apply it to the actor you're spawning. In the tracking logic you can update the spawned actor's transform each frame, for instance, to keep it aligned with the image (with some logic to check whether that image is still being actively tracked).
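A minimal C++ sketch of that approach, assuming the GoogleARCore plugin's UGoogleARCoreAugmentedImage type: SpawnedActors and ActorClassForImage are hypothetical TMap members you would add to your own pawn or actor, and the whole thing is untested, so treat it as a starting point rather than a working implementation.

```cpp
#include "ARBlueprintLibrary.h"
#include "GoogleARCoreAugmentedImage.h"

// Hypothetical members declared on the pawn/actor that owns this Tick:
//   UPROPERTY(EditAnywhere) TMap<FString, TSubclassOf<AActor>> ActorClassForImage;
//   UPROPERTY()             TMap<FString, AActor*>             SpawnedActors;

void AMyARPawn::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    for (UARTrackedGeometry* Geometry : UARBlueprintLibrary::GetAllGeometries())
    {
        UGoogleARCoreAugmentedImage* Image = Cast<UGoogleARCoreAugmentedImage>(Geometry);
        if (Image == nullptr)
        {
            continue;
        }

        const FString ImageName = Image->GetImageName();
        const FTransform ImageTransform = Image->GetLocalToWorldTransform();

        AActor* Spawned = SpawnedActors.FindRef(ImageName);
        if (Spawned == nullptr)
        {
            // First time we see this image: spawn the actor class mapped to
            // its name (e.g. "Frame01" -> BP_Placeable).
            if (TSubclassOf<AActor>* ClassToSpawn = ActorClassForImage.Find(ImageName))
            {
                Spawned = GetWorld()->SpawnActor<AActor>(*ClassToSpawn, ImageTransform);
                SpawnedActors.Add(ImageName, Spawned);
            }
        }
        else if (Image->GetTrackingState() == EARTrackingState::Tracking)
        {
            // Follow the image while it is actively tracked, so rotating the
            // printed image rotates the spawned model as well.
            Spawned->SetActorTransform(ImageTransform);
        }
    }
}
```

The ActorClassForImage map would be filled in per project, pairing each candidate image name from the database with the Blueprint you want to spawn for it. As far as I can tell, the same idea works in Blueprint with the Get All Geometries node, a cast to GoogleARCoreAugmentedImage, and SetActorTransform.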