Projecting touch into world space is terrible

Sorry in advance, I only post when things are terrible. And this is surprisingly bad.

So here’s a bit of logic:

And here’s the result:

The cursor is near the middle and the debug mark is in the bottom right. That is way, way off where it should be.

I’ve had lots of advice about how to do hit detection under the cursor by channel, and so on. But none of these things work in the abstract: the touch interface events don’t fire if you’re not touching a game object, and the hit detection routines don’t work when your cursor isn’t over your object, such as when dragging with lag.

There are zero reliable ways to convert screen space into world space.

I’ve sat here and tried to solve it in different ways with line/plane intersection and other tricks, but ultimately, without a good way to convert screen space into world space, it’s a mess.

How is everyone else solving this?

Also, I’d just like to ***** real hard for a moment about how it’s impossible to convert a transform from local to world space in blueprints. Why is this not a thing?

To have it work properly, you also need to use the World Direction output of the Convert node, the same way you would in a Line Trace By Channel. In fact, you may need to use a Line Trace By Channel to detect the intersection point with your cards; it depends on how your world and camera are set up.

Right now I’m assuming my world and camera are at their defaults. I’ll try a line trace by direction, though, and see if I can get an accurate hit under the mouse. Thanks!

Update: you were absolutely right. Although the start point is ridiculously off, using the direction to trace a hit is much more accurate, at least at close range. Here’s the answer for anyone else running into this: