Dealing with Scale in Augmented Reality

I’ve been experimenting with ARCore in Unreal, and it’s really cool. Since we only have horizontal surface detection at the moment, I prefer the idea of making a game with small characters that people can look down at while they play. I tried just scaling down my assets, but that creates all sorts of issues. I could create all my assets with this smaller scale in mind, but that’s not really feasible for me right now. I saw this Unity article about scaling the camera and the planes up so that the game’s world looks smaller because it is farther away.

Source: https://blogs.unity3d.com/2017/11/16…h-scale-in-ar/

I’ve been playing around with implementing this, and scaling up the camera does make the characters look smaller, which is great. What I’m struggling with now is scaling the detected planes and the Google point cloud component. No matter what scale I set on it (or on the root of the actor containing it), it seems to stay the same. Has anyone else tried anything like this yet?

So it turns out that the solution ended up being to change the World to Meters setting. I had read on a couple of threads on these forums that this wasn’t supported for AR yet, but it works great.

Has anyone else used World to Meters scaling in the world settings for AR? I’ve been setting it as well, but haven’t been able to scale down the scene. Does it somehow get overridden when converting local-to-world transforms for world spawns, or something?