AR on PC

With ARCore & ARKit on mobile devices we can create AR experiences with object tracking, is the same possible to do with live footage either through engine or via an exported PC application.
I am trying to create a live broadcast with AR capabilities, meaning I have a freecam observer that pans on open areas in match where I would like to place some AR elements that track on to the wall. / stay on the ground as the camera moves.

I have searched a lot but could not find any relevant information. Can someone point me to a plugin or application, or guide me toward an approach?

Thanks in advance.

Yes, you can do exactly what you are describing. In film & TV, this is usually called Virtual Production. Unfortunately, Virtual Production is an umbrella term that can mean lots of things. These days when you say Virtual Production most people think you are talking about using LED screens, but that’s only one “form” of Virtual Production. If you do a YouTube search for “Unreal Virtual Production” you’ll find a decent number of videos showing how to set up something like what you are asking about.

Here is one example:
https://www.youtube.com/watch?v=OgxfvtT17CY

-e

That is more for when you have a physical camera with an attached tracking system and a chroma setup. What I am talking about is, say, a video game like Valorant or Counter-Strike, where I want to place objects the way we do with ARKit/ARCore: it maps the visible area and attaches anchor points so we can place 3D objects. In my case there is no physical camera or tracking system, but the footage is also in a virtual 3D space.

I’m not sure I understand. But if you are just trying to place objects in a pure CG/virtual scene by clicking on the screen, then you can just do a LineTrace and feed the hit location into SpawnActorFromClass.
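
In C++ that would look something like this (a minimal sketch; `SpawnAtCursorHit` and `ActorClass` are just illustrative names, not engine API):

```cpp
// Minimal sketch: click-to-place in a pure CG scene.
// SpawnAtCursorHit is a hypothetical helper; ActorClass is whatever
// Blueprint/C++ class you want to place.
#include "CoreMinimal.h"
#include "Engine/World.h"
#include "GameFramework/PlayerController.h"
#include "Templates/SubclassOf.h"

static void SpawnAtCursorHit(APlayerController* PC, TSubclassOf<AActor> ActorClass)
{
    if (!PC || !ActorClass)
    {
        return;
    }

    FHitResult Hit;
    // Line trace from the mouse cursor into the scene against visible geometry.
    if (PC->GetHitResultUnderCursor(ECC_Visibility, /*bTraceComplex=*/true, Hit))
    {
        // Feed the hit into SpawnActor: place the actor at the impact point,
        // with its forward axis aligned to the surface normal so it sits
        // against walls/floors.
        PC->GetWorld()->SpawnActor<AActor>(
            ActorClass,
            Hit.ImpactPoint,
            Hit.ImpactNormal.Rotation());
    }
}
```

In Blueprint, the equivalent nodes are Get Hit Result Under Cursor by Channel and SpawnActorFromClass.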

-e

Sorry if my explanation wasn’t clear enough. Here’s an example of what I am trying to say: https://serwer2115171.home.pl/lhm_videos/Augmented_Reality/1.%20ar.mp4

Basically, I am getting a clean feed from Valorant via cinematic observers that do flybys and can pan and orbit smoothly in open areas, and I am trying to track that in real time and place graphics on it. With the release of 5.4 and the addition of Motion Design inside the engine, I feel this should be even more achievable. Also, can you link some resources/docs for line traces in the engine? I could not find anything useful. Thanks.

With the new Motion Design tools you can certainly create content in the style that you’re describing. You can find cool examples from various folks on YouTube.

I think one key point you’re trying to get at is the tracking of the environment. As was said earlier, in traditional broadcast they are truly tracking the locations of the cameras and the field, making it easy to anchor content. On mobile, ARKit and ARCore use the device's cameras and sensors to do the same thing. These are very specific to the hardware and OS; Unreal Engine is not doing the tracking itself, just calling APIs on the device.

So if your goal is to take a generic video feed and auto-detect ground planes to anchor content to, there’s nothing built in that can really help you. The best-case scenario is having predetermined camera paths that you can build your content around, to essentially comp two shots with the same camera. Ideally you can come up with a flyby, pan, and orbit camera movement that you can replicate every time in Valorant. But tracking an FPS video feed in real time and projecting content onto walls and floors is not very feasible.
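
To illustrate the predetermined-camera-path idea, here is a hypothetical sketch of an actor that replays a fixed set of camera keys in Unreal, so the CG camera repeats the same move every time and the render can be comped over footage captured with the matching in-game flyby. In practice you would keyframe a CineCamera in Sequencer instead; the class and field names here are made up for illustration:

```cpp
// Hypothetical sketch: replay a predetermined camera move so the CG render
// lines up with a prerecorded in-game flyby. In a real project you would
// author this in Sequencer rather than by hand.
#include "CoreMinimal.h"
#include "Camera/CameraActor.h"
#include "GameFramework/Actor.h"
#include "ReplayCameraPath.generated.h"

USTRUCT()
struct FCameraKey
{
    GENERATED_BODY()

    UPROPERTY(EditAnywhere) float Time = 0.f;   // Seconds from flyby start
    UPROPERTY(EditAnywhere) FVector Location = FVector::ZeroVector;
    UPROPERTY(EditAnywhere) FRotator Rotation = FRotator::ZeroRotator;
};

UCLASS()
class AReplayCameraPath : public AActor
{
    GENERATED_BODY()

public:
    AReplayCameraPath() { PrimaryActorTick.bCanEverTick = true; }

    UPROPERTY(EditAnywhere) TArray<FCameraKey> Keys;  // Authored to match the game flyby
    UPROPERTY(EditAnywhere) ACameraActor* Camera = nullptr;

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);
        Elapsed += DeltaSeconds;
        if (!Camera || Keys.Num() < 2)
        {
            return;
        }

        // Find the pair of keys bracketing the current time...
        int32 i = 1;
        while (i < Keys.Num() - 1 && Keys[i].Time < Elapsed) { ++i; }
        const FCameraKey& A = Keys[i - 1];
        const FCameraKey& B = Keys[i];

        // ...and interpolate position/rotation between them.
        const float Span  = FMath::Max(B.Time - A.Time, KINDA_SMALL_NUMBER);
        const float Alpha = FMath::Clamp((Elapsed - A.Time) / Span, 0.f, 1.f);
        Camera->SetActorLocation(FMath::Lerp(A.Location, B.Location, Alpha));
        Camera->SetActorRotation(FQuat::Slerp(
            A.Rotation.Quaternion(), B.Rotation.Quaternion(), Alpha));
    }

private:
    float Elapsed = 0.f;
};
```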

I had assumed that you were trying to do this in your own game, not with live video footage from someone else’s game. If all you have is a video feed/footage then there isn’t much you can do in real-time.

The phone AR stuff only works because it’s running on a phone, and the phone itself is figuring out how it is being moved.

If you don’t need to do it in real time, then you can probably run your video footage through VFX camera-tracking (matchmove) software to produce an animated camera that you can then use in Unreal to render the graphics and composite them over your footage. There are lots of different software packages and ways to do this. The basic idea is the same as if you had filmed the footage with a camcorder.

-e

Thanks for your input. I guess for now I will prerecord some flyby pans and orbits, track them, and bring them into Unreal just for the graphics, then send out a clean feed to overlay. As for recreating the paths, that is not yet possible in Valorant; some other games support it, but it’s simpler to just prerecord multiple movements in advance. Again, thanks for your answers. I will update the post if I find anything useful.

Use Kinect

Isn’t Kinect for human movement tracking? How would that help track a live video feed of a game? And wouldn’t that require the Kinect hardware, plus another software layer to process the feed and send the data back to UE?

Getting live video into UE with Kinect is possible; USB 3 bandwidth is sufficient. You can also take the skeleton-tracking coordinates and map them to the features you need. I haven’t had time to test it out even though I own the hardware, but many people have achieved things with this.