Manual Stereo Rendering for Mobile (Android) VR

Hey everyone,

I’m working on a mobile AR project and I’m trying to set up a manual VR split-screen view, since Mobile Multi-View doesn’t work on my phone (LG Wing with an Adreno GPU - it apparently needs Mali).

What I’m using:

  • Unreal Engine 5.4.4
  • Android ARCore
  • Handheld AR Template
  • LG Wing phone (Adreno 620)

I want to create two UMG Image widgets side by side - one for the left eye and one for the right eye, with a slight horizontal offset between them. Basically manual stereo rendering, since the built-in VR stuff doesn’t work on my hardware (rough sketch of the layout below).
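
To make the layout concrete, here’s roughly what I mean, sketched in C++ (just a sketch under my own assumptions - the class/widget names and the 8 px offset are placeholders, and in the actual project this is all set up in the UMG designer):

```cpp
// Sketch: two Image widgets, one per eye, each anchored to half the screen,
// with a small horizontal render offset on the right eye. Names are placeholders.
#include "Blueprint/UserWidget.h"
#include "Components/Image.h"
#include "Components/CanvasPanelSlot.h"
#include "StereoViewWidget.generated.h"

UCLASS()
class UStereoViewWidget : public UUserWidget
{
    GENERATED_BODY()

protected:
    // Bound to two Image widgets placed on a Canvas Panel in the UMG designer
    UPROPERTY(meta = (BindWidget)) UImage* LeftEyeImage;
    UPROPERTY(meta = (BindWidget)) UImage* RightEyeImage;

    virtual void NativeConstruct() override
    {
        Super::NativeConstruct();

        // Left eye fills the left half of the screen, right eye the right half
        if (UCanvasPanelSlot* LeftSlot = Cast<UCanvasPanelSlot>(LeftEyeImage->Slot))
        {
            LeftSlot->SetAnchors(FAnchors(0.f, 0.f, 0.5f, 1.f));
            LeftSlot->SetOffsets(FMargin(0.f));
        }
        if (UCanvasPanelSlot* RightSlot = Cast<UCanvasPanelSlot>(RightEyeImage->Slot))
        {
            RightSlot->SetAnchors(FAnchors(0.5f, 0.f, 1.f, 1.f));
            RightSlot->SetOffsets(FMargin(0.f));
        }

        // Slight horizontal offset between the two eyes (placeholder value)
        RightEyeImage->SetRenderTranslation(FVector2D(8.f, 0.f));
    }
};
```
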
The problem is I can’t find the blueprint nodes to actually get the AR camera feed into my materials. I’ve got:

  • UMG widget with two image widgets
  • UI material with Texture Sample Parameter 2D
  • Dynamic material instance created in the level blueprint (see the sketch after this list)
  • Toggle AR Capture working
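
In C++ terms, that hookup is roughly the sketch below (the “CameraTexture” parameter name, the function name, and the variables are placeholders I made up; the commented-out lines at the bottom are exactly the part I can’t fill in):

```cpp
// Sketch of the material hookup I already have, minus the missing camera texture.
// "CameraTexture" is the Texture Sample Parameter 2D name in my UI material (placeholder).
#include "Materials/MaterialInstanceDynamic.h"
#include "Components/Image.h"
#include "ARBlueprintLibrary.h"

void SetupEyeMaterials(UMaterialInterface* BaseUIMaterial, UImage* LeftEyeImage,
                       UImage* RightEyeImage, UObject* Outer)
{
    // One dynamic material instance per eye, so each eye can get its own offset later
    UMaterialInstanceDynamic* LeftMID  = UMaterialInstanceDynamic::Create(BaseUIMaterial, Outer);
    UMaterialInstanceDynamic* RightMID = UMaterialInstanceDynamic::Create(BaseUIMaterial, Outer);

    LeftEyeImage->SetBrushFromMaterial(LeftMID);
    RightEyeImage->SetBrushFromMaterial(RightMID);

    // Equivalent of the Toggle AR Capture node I'm calling
    UARBlueprintLibrary::ToggleARCapture(true, EARCaptureType::Camera);

    // This is where I'm stuck: there's nothing I can find to pass in here.
    // LeftMID->SetTextureParameterValue(TEXT("CameraTexture"), /* AR camera texture? */);
    // RightMID->SetTextureParameterValue(TEXT("CameraTexture"), /* AR camera texture? */);
}
```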

But there’s no “Get AR Camera Image” node anywhere - I’ve searched with Context Sensitive turned off and everything. There’s a “Camera Image” category, but it only has Get Height, Get Width, etc., with no actual texture getter.
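
In other words, what I’m hoping exists is something like the sketch below. Big caveat: UARBlueprintLibrary::GetARTexture / EARTextureType::CameraImage are just my guess at the relevant C++ API in the AugmentedReality module - I can’t find a Blueprint node that corresponds to it, and I have no idea whether it returns a valid texture on ARCore:

```cpp
// Hoped-for flow (unverified): ask the AR system for the current camera image
// texture and push it into both eye materials every frame.
#include "ARBlueprintLibrary.h"
#include "Materials/MaterialInstanceDynamic.h"

void UpdateCameraTexture(UMaterialInstanceDynamic* LeftMID, UMaterialInstanceDynamic* RightMID)
{
    // GetARTexture is my guess at the getter; CameraImage may not be populated on Android
    if (UARTexture* CameraTex = UARBlueprintLibrary::GetARTexture(EARTextureType::CameraImage))
    {
        // Same texture for both eyes; the per-eye offset is handled in the widget/material
        LeftMID->SetTextureParameterValue(TEXT("CameraTexture"), CameraTex);
        RightMID->SetTextureParameterValue(TEXT("CameraTexture"), CameraTex);
    }
}
```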

I’ve tried:

  • Engine content passthrough materials (they just render transparent and don’t actually show the camera feed)
  • Searching for every variation of AR camera, camera image, etc.
  • Toggle AR Capture works, but it doesn’t unlock any new camera texture nodes

Anyone know how to actually get the AR camera texture in UE 5.4? Or is there a completely different approach I should be using for manual stereo rendering on mobile?

tl;dr

Trying to manually set up stereo rendering in an Android app since the built-in VR features aren’t working on my hardware. Anyone know the best way to do this? Any sample Blueprints I can study? Any and all help is immensely appreciated.