Project World to SceneCapture2D & make interactable

I’ve projected my game world to an orthographic SceneCapture2D and displayed it as a UI widget so that it can be scaled / dragged around the player’s monitor. However, I need it to still pass mouseover and click events through the render texture into the game world. For example, highlighting an actor on mouseover, triggering a UI popup when said actor is clicked, etc. Essentially becoming interactable in the same way as the standard game world.

Can anyone suggest ways to do this in Blueprint? I’ve tracked down a couple of suggestions for similar issues, but can’t get anything to work.

Cheers.

To click stuff behind a widget, I think you just need to set the Behavior → Visibility option to Non Hit-Testable in the widget BP. Or if you’re trying to interact directly with the rendered version of actors in the capture, you’d have to do a bunch of math to figure out where pixels on the capture were in world space, and probably do line traces to those locations. Which would really only work if the capture runs every frame, otherwise actors might not still be where they were when the texture was captured.

Messed around with it a bit, and my first idea was to do this:

Did this in a side scroller project so I could just map the range of the x/y to the extents of the map (and even then, the math is probably still off). If your scene capture isn’t fixed to a side or top view, you’ll probably need to pass the normalized-ish vector2d to the capture actor or something and have it trace/spawn/whatever in local space.

Cheers for the response and suggestion!

The plan is indeed to be able to directly interact with elements within the SceneCapture, which will be updated every frame. The player’s camera would be ortho, pointing down on the game world at a 45-degree angle.

I’ll give your suggestion a go and see what the results are like. Much appreciated.

With the capture cam being at an angle, you’ll probably want to send the click results from the widget to the actor that has the cam (probably want to do this anyway instead of doing all the logic in a widget lol). Since it’s ortho, you could get its up vector and right vector and add the results from the widget to get a start location for line traces against the real world. It will lose a bit of accuracy in translation, depending on the resolution of the capture though. In my test I could only spawn the health items about a meter away from each other, since the map renders at about 1 pixel per 100 units.