Box Trace does not register the impact location correctly

I am casting a BoxTraceForObjects and I am getting some weird results.

Let’s assume I have a vector AB, where:

  • A is a 3D point and the start location of the trace, coinciding with the actor location.
  • B is a 3D point and the end location of the trace, obtained by going along the actor’s direction for a fixed length that is kept unchanged throughout the tests (specifically, double the capsule component radius).

The box half size is left at the default 0 on X, while Y and Z are arbitrarily given some values (specifically, Y is half the capsule component radius and Z is the half height of the capsule component).
The box orientation is set to match the direction of the vector AB.
The object types are everything except Pawn.

What I expect to happen when I trace using the above described parameters:

  • The box is traced along the AB vector, using it as the center.

  • When the box hits an object, it recognizes it and correctly registers the following:

      • The Out Impact Point should be at the first point hit on the object (NOT ALWAYS HAPPENING).

      • The Out Hit Point should correspond to the hit point on the AB vector (this one is correctly calculated).

      • The angle between the Out Hit Point and the Out Impact Point should be 90° (NOT ALWAYS HAPPENING).

The issue is that, although the Out Hit Point correctly represents the BoxTrace hit location, the Out Impact Point does not.
To illustrate the issue, I have drawn debug lines from the starting point to the Out Hit Point and to the Out Impact Point, and also between the Out Hit Point and the Out Impact Point, which can be seen in the attached pictures.

This issue seems to happen when the box hits the object at certain angles; however, I can’t figure out the exact details or why it is happening.

I am not using “Trace Complex”, which does not seem to exhibit this behavior.

The expected behavior is shown in the following picture, which happens most of the time.

The unexpected behavior is shown in the following picture:

The object the trace is supposed to collide with is a typical box scaled on the x, y and z axes. The original collision for the object looks as below:

This behavior is the same regardless of the type of box trace I am using (object, channel, profile) and does not seem to be present with line and sphere traces.
Is this normal? If so, can someone explain to me what is happening, and whether I can get the behavior I expect without using complex collision?
Thank you in advance

Try giving the box some X dimension and see if it begins to work, even a dimension of 1. Keep in mind that you aren’t really making a true “box” if it only has dimensions on two axes, so the flat shape may be throwing off the intersection calculations.

That was something I had not considered, but it really does seem to be the root of the issue.
Changing the X of the box’s Half Size to a non-zero value fixes the problem in every situation where I noticed it.
Thank you