Mixed Reality: relighting both the game and the real world with UE's lighting system?

I'm considering developing AR for the Quest 3, but what bothers me is the poor lighting that all the current passthrough apps show. In passthrough I expect the application's lighting system to interact with the real world in both directions: the game shines on real walls, and a real lamp lights the game.

https://twitter.com/i/status/1747357038861222337

I've seen relighting technology in quite a few demos lately, but nothing Unreal Engine related.
Is anyone using a UE plugin for this kind of relighting?

With the Quest you can scan your surroundings, which are then classified into furniture types and triangulated into a 3D mesh. The headset knows what a window is, what a wall is, and where they are. Other objects, like mirrors and fire, could be detected (if they aren't already) to estimate what to render or how to treat lighting, say to fake the flicker of a fire or to render a mirror image. Rendering shadows is easy. Relighting, that's the magic word. Even better if UE's lighting system can be used as it is, with the passthrough feed injected early in the pipeline.
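For the "real lamp lights the game" direction, the most direct route I can picture is mapping classified scan anchors onto ordinary UE lights, so the engine's standard lighting path does the rest. A minimal sketch; FScannedAnchor and the label strings are made-up stand-ins for whatever the headset's scene API actually returns, and the intensity/color values are pure guesses:

```cpp
#include "CoreMinimal.h"
#include "Components/PointLightComponent.h"
#include "Engine/World.h"
#include "GameFramework/Actor.h"

// Hypothetical stand-in for what the headset's scene API returns per object.
struct FScannedAnchor
{
    FString Label;        // e.g. "LAMP", "WINDOW", "WALL"
    FTransform Transform; // pose of the object in tracking space
};

// Turn every detected lamp into a movable UE point light, so the engine's
// normal lighting path lights virtual objects "from" the real room.
void SpawnLightsFromRoomScan(UWorld* World, const TArray<FScannedAnchor>& Anchors)
{
    for (const FScannedAnchor& Anchor : Anchors)
    {
        if (Anchor.Label != TEXT("LAMP"))
        {
            continue; // only light-emitting objects become UE lights
        }

        AActor* Holder = World->SpawnActor<AActor>(AActor::StaticClass(), Anchor.Transform);

        UPointLightComponent* Light = NewObject<UPointLightComponent>(Holder);
        Light->SetMobility(EComponentMobility::Movable);
        Light->SetIntensity(1000.f);                          // guess; needs calibration
        Light->SetLightColor(FLinearColor(1.f, 0.9f, 0.8f)); // warm-bulb assumption
        Holder->SetRootComponent(Light);
        Light->RegisterComponent();
        Light->SetWorldTransform(Anchor.Transform); // place it at the scanned pose
    }
}
```

Windows would map to rect lights the same way; the hard part is presumably calibrating intensity and color against the passthrough camera's auto-exposure.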

This looks like a cheap Photoshop job where the lighting doesn't match; not worthy of being called AR:

Handheld AR Template Quickstart in Unreal Engine | Unreal Engine 5.0 Documentation

The ARCore docs specifically point to the UE docs after talking about realistic lighting, but either ARCore or whoever wrote the UE docs didn't quite deliver on that.

This, this is what AR should be like:
[image]

So what is the full picture in UE? Is it a non-existent / half-baked feature there, or is there a working implementation?

Bump :slight_smile:

Spitballing: have you tried implementing screen traces to get the IRL light into the rendered image? I've never worn or developed for a headset, but I reckon you have a depth map to derive world positions from? I could see this working pretty cheaply. I'm not aware of any plugins for reversing this. I mean, you'd have to generate a voxel or polygon scan of the IRL environment, like a distance field, then do the lighting math with a translucent blend and reproject it onto the scan. You're a veteran; how good is your C++? :slight_smile:
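To make that concrete, here is the core of the lighting math as a standalone sketch; the scan traversal, blending, and reprojection are left out, and nothing here is a real engine API:

```cpp
#include <algorithm>
#include <cmath>

// Minimal vector math so the sketch stands alone.
struct Vec3 { float x, y, z; };
static Vec3  Sub(Vec3 a, Vec3 b)    { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float Dot(Vec3 a, Vec3 b)    { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3  Scale(Vec3 v, float s) { return {v.x*s, v.y*s, v.z*s}; }

// Lambert contribution of one virtual point light on one point of the
// real-world scan. The returned RGB is what you'd composite additively
// (translucent blend) over the passthrough image at that point's screen position.
Vec3 RelightContribution(Vec3 surfacePos, Vec3 surfaceNormal,
                         Vec3 lightPos, Vec3 lightColor, float intensity)
{
    Vec3  toLight = Sub(lightPos, surfacePos);
    float dist2   = std::max(Dot(toLight, toLight), 1e-6f);
    Vec3  dir     = Scale(toLight, 1.0f / std::sqrt(dist2));
    float ndotl   = std::max(0.0f, Dot(surfaceNormal, dir));
    float atten   = intensity / dist2; // inverse-square falloff
    return Scale(lightColor, ndotl * atten);
}
```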


My C++ is good. Developing it is no problem; I'm just asking whether someone has done this before I spend a week on it and find out it looks as bad as the documentation shows :'). The room data is scanned by the headset (Quest 3), and from that you can retrieve a room mesh and see what type of object stands where. Pulling that data and doing some cheap post effects can certainly be done, but what about integration with Unreal's lighting / shader system? To what level does the following work out of the box, without dirty hacks or workarounds?

  • Realtime performance.
  • Lighting the real world from the game world.
  • Lighting the game world from the real world.
  • Proper dynamic lighting, color blending, shadows, transparency, and so on.
  • Realtime reflections, mirrors.
  • Depth estimation (realtime normal estimation) of the real world (faces) for relighting (a cheap normals-from-depth trick is sketched right after this list).
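For that last item: with only a depth map, one cheap way to get normals is to back-project neighboring pixels and cross the position deltas. A standalone sketch, assuming pinhole intrinsics (fx, fy, cx, cy) for the depth camera are known; none of this is a headset API, just the raw math:

```cpp
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };
static Vec3 Sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 Cross(Vec3 a, Vec3 b)
{
    return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x};
}
static Vec3 Normalize(Vec3 v)
{
    float len = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
    return len > 1e-6f ? Vec3{v.x/len, v.y/len, v.z/len} : Vec3{0.f, 0.f, 1.f};
}

// Back-project pixel (u, v) with depth d into camera space, pinhole model.
static Vec3 Unproject(float u, float v, float d,
                      float fx, float fy, float cx, float cy)
{
    return {(u - cx) * d / fx, (v - cy) * d / fy, d};
}

// Estimate the surface normal at (u, v) by crossing the horizontal and
// vertical position deltas of the back-projected neighbors. Caller must
// keep 1 <= u < width-1 and 1 <= v < rows-1; flip the sign if your
// convention wants normals facing the camera.
Vec3 NormalFromDepth(const std::vector<float>& depth, int width,
                     int u, int v, float fx, float fy, float cx, float cy)
{
    auto D = [&](int x, int y) { return depth[y * width + x]; };
    Vec3 right = Unproject(u + 1.f, v,       D(u + 1, v), fx, fy, cx, cy);
    Vec3 left  = Unproject(u - 1.f, v,       D(u - 1, v), fx, fy, cx, cy);
    Vec3 down  = Unproject(u,       v + 1.f, D(u, v + 1), fx, fy, cx, cy);
    Vec3 up    = Unproject(u,       v - 1.f, D(u, v - 1), fx, fy, cx, cy);
    return Normalize(Cross(Sub(right, left), Sub(down, up)));
}
```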

Object occlusion (a game character hiding behind a real object) should be present in newer versions of ARCore from what I quickly read, but from forum posts it seems Epic isn't updating or supporting any of the new stuff past 4.26. Is it dead?
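Worth noting that the occlusion half is, at least conceptually, just a per-pixel depth test between the rendered depth and the sensor depth. A trivial sketch of the mask you'd feed into compositing (plain C++, no engine API, buffers assumed to be the same resolution and in the same metric space):

```cpp
#include <vector>

// Per-pixel occlusion mask: show the passthrough camera pixel wherever the
// real world (sensor depth) is closer to the eye than the rendered scene.
std::vector<bool> BuildOcclusionMask(const std::vector<float>& virtualDepth,
                                     const std::vector<float>& sensorDepth)
{
    std::vector<bool> showReal(virtualDepth.size());
    for (size_t i = 0; i < virtualDepth.size(); ++i)
    {
        showReal[i] = sensorDepth[i] < virtualDepth[i];
    }
    return showReal;
}
```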

Anyone developing VR / AR here?

:recycle:

:rocket:

Bump once more :). I'm a bit surprised there isn't as much activity on VR / AR here as I expected.

:headstone:
