[REQUEST] Finding the Edge of the Surface the Player Is Standing On

Hello! This is my first post on the Unreal Engine forums, so please tell me if I’m doing something wrong :)

I want to make a system that determines the player’s distance in real time (in the direction they’re facing) from the edge of the surface they’re currently standing on, as well as the distance of the gap between that surface and the next surface in front of them.

Here’s a picture to demonstrate:

DynamicEdgeDetection.jpg

Essentially, I want to find d1 and d2 as they appear in this diagram relative to the yaw direction the player is facing. I would like to have the values of d1 and d2 update each tick (or as necessary to make them change as the player, camera, and/or world moves).

I’m very new to using Blueprints in Unreal Engine and rather unfamiliar with the language’s documentation and general workflow, so I don’t know how easy or hard this would be to implement. If it would be unfeasible to implement, then I could probably experiment with a different system that accomplishes a similar goal.

If anyone would be willing to provide some code that accomplishes this, or at least some nodes I could try fiddling with, or even to tell me it’s impractical, that would be much appreciated.

Thank you, and have a nice day!

I have to tell you that looks like a bit of a nightmare, especially with the number of surfaces you’d have to trace in a real game situation. Can I ask what you want to achieve?

Sure thing. It’s for an end-of-year school project where I want to try making an accessibility system that allows someone to play a simple platformer game without visuals; basically a game that could be played by a blind person.

What I wanted to do with this system was tell the player how far away from a gap they were and how wide that gap is, so they would know when and how far to jump. In practice, this would probably take the form of a constant sound that would increase in volume the closer the player was to the gap and would be played with a pitch based on the gap’s distance.
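That audio mapping could be sketched as a small function. This is only an illustrative sketch, not working Blueprint code: the ranges (`max_hear_dist`, `max_gap`) are made-up assumptions, and I’m assuming pitch encodes the gap’s width (d2) while volume encodes the distance to the edge (d1).

```python
def gap_audio_cue(d1, d2, max_hear_dist=1000.0, max_gap=500.0):
    """Map edge distance d1 to volume (louder when closer) and gap
    width d2 to a pitch multiplier (higher for wider gaps).

    All ranges here are illustrative assumptions, not engine defaults.
    Returns (volume in 0..1, pitch multiplier in 1..2).
    """
    volume = max(0.0, 1.0 - d1 / max_hear_dist)
    pitch = 1.0 + min(d2, max_gap) / max_gap
    return volume, pitch

# Standing right at the edge of a maximally wide gap:
# full volume, doubled pitch.
vol, pitch = gap_audio_cue(0.0, 500.0)
```

In Blueprints this would just be a couple of Map Range Clamped nodes feeding an audio component’s volume and pitch multipliers each tick.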

I don’t know if these would be feasible, but I have a few solutions in mind. One is that I could sweep a capsule from slightly below the player to a certain distance in front of them at a set interval, and report back the distance if/when it stops detecting ground and if/when it starts detecting it again. Another is a more discrete form of detection, where I’d have roughly ten hitboxes, spaced based on the player’s speed, that provide rough information about where gaps in the floor start and end.
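The first idea (sweeping a probe forward and noting where ground stops and starts) can be sketched as a simple forward march. In Blueprints this would be a loop of downward LineTraceByChannel nodes; here is a hedged Python sketch where `has_ground` is a hypothetical stand-in for such a downward trace:

```python
def find_edge_and_gap(player_pos, facing_dir, has_ground,
                      step=10.0, max_dist=2000.0):
    """Walk a probe forward from the player in fixed steps.

    has_ground(x, y) stands in for a downward line trace (in Unreal,
    a LineTraceByChannel from above the point straight down).
    Returns (d1, d2): distance to the edge and width of the gap.
    d1 is None if no edge is found within max_dist; d2 is None if
    the gap extends past max_dist.
    """
    px, py = player_pos
    dx, dy = facing_dir  # assumed normalized, taken from the player's yaw
    d1 = None
    d = 0.0
    while d <= max_dist:
        x, y = px + dx * d, py + dy * d
        if d1 is None:
            if not has_ground(x, y):   # ground just vanished: edge found
                d1 = d
        elif has_ground(x, y):         # ground reappeared: gap ends here
            return d1, d - d1
        d += step
    return d1, None

# Toy world: ground for x in [0, 100), a gap, then ground from x = 150 on.
ground = lambda x, y: 0 <= x < 100 or x >= 150
d1, d2 = find_edge_and_gap((0, 0), (1, 0), ground)  # → (100.0, 50.0)
```

The step size trades accuracy for trace count; running this on tick with a 10-unit step and a 2000-unit range is ~200 traces per tick, which is why the replies below suggest narrowing what the traces can hit.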

I think if you try to do this with ‘normal’ objects you’ll be there forever. One approach would be to make all the ‘platforms’ etc. Blueprints with special collision volumes, so when you’re doing a scan you’re only looking for a finite set of things.

You can have components in the BPs that mean ‘edge’ etc, much easier to search for.

GetActorBounds, GetComponentBounds, etc. These return the origin and box extent of an actor/component. You can use the extent to calculate distances to edges.

Okay, thanks for the suggestion. Would too many blueprint actors in one level start to cause performance issues?

Alright, I’ll try fiddling with those and see if I can get something to work. Thanks!