3D Widget/System to drag objects in a running game

I have a space game where the player needs to create some waypoints.
Now, being a 3D game, placing a waypoint in space and changing its position at runtime is quite tricky.

I would like to give my waypoint Actors the same translation arrows we have in the editor viewport (see image below).
Has anyone ever tried to recreate that system inside a game?


Thank you!

The initial placing is tricky. Moving is pretty straightforward, though. I iterated through several systems before settling on the most intuitive and hassle-free one I found, revolving around a modifier key.

Without the key, you move the marker on a horizontal virtual plane via a line-plane intersection. A modifier key changes the normal of said plane.

Instead of a modifier key, you can use the mouse wheel for vertical offsets. Works pretty well.

The movement speed multiplier can be based on distance from the camera.
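If it helps, the whole drag boils down to one ray-plane intersection per mouse move. A minimal, engine-agnostic sketch in plain C++ (`Vec3` is a made-up stand-in; in UE you’d deproject the cursor to get `origin`/`dir` and use `FVector` instead):

```cpp
#include <cmath>

// Engine-agnostic stand-in for FVector (names are made up).
struct Vec3 { double x, y, z; };

Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 operator*(Vec3 a, double s) { return {a.x * s, a.y * s, a.z * s}; }
double Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Intersect the mouse ray (origin + t * dir) with the drag plane.
// Returns false when the ray is parallel to the plane or the hit is behind us.
bool RayPlane(Vec3 origin, Vec3 dir, Vec3 planePoint, Vec3 planeNormal, Vec3& hit)
{
    const double denom = Dot(dir, planeNormal);
    if (std::fabs(denom) < 1e-8)
        return false;                       // ray parallel to the plane
    const double t = Dot(planePoint - origin, planeNormal) / denom;
    if (t < 0.0)
        return false;                       // plane is behind the camera
    hit = origin + dir * t;
    return true;
}
```

Holding the modifier simply swaps which `planeNormal` you pass in; the mouse-wheel vertical offset is where the distance-based multiplier comes in, scaling each step so far-away markers don’t crawl.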

I know the coordinate gizmo is present in every 3D editing application, but think about the last game that actually used one.

I don’t think trying to hit those arrows is much fun. I may be biased.

But you can always implement more than one method.

Hi Everynone,
thanks for the input!
In the meantime I actually found a ready-made package on the Marketplace: Coordinate Axis in Blueprints - UE Marketplace

It basically adds a clone of the editor’s Move Tool, including rotation and scaling (which I don’t need).
The price is a little steep and it requires quite a lot of work to adapt to my needs, but it’s okay.

In particular, I now need to keep it at a constant size on screen.
Since I’m using very zoomed-out visuals, with a variable zoom set by the user (by scrolling the mouse wheel), I’m inclined to try re-scaling it every Tick.
But this also sounds like the worst idea :slight_smile:

Any idea how to keep the Move Tool at a constant on-screen size? :thinking:
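If I end up re-scaling every Tick, I suppose the per-frame math would be: take the gizmo’s distance from the camera, compute the view-frustum height at that distance, and scale the gizmo so it covers a fixed fraction of the viewport. A quick stand-alone C++ sketch (nothing UE-specific; the function and the `screenFraction` parameter are made up):

```cpp
#include <cmath>

// World-space scale that keeps an object covering a fixed fraction of the
// viewport height, regardless of its distance from the camera.
// fovY is the camera's vertical field of view in radians;
// screenFraction is e.g. 0.1 for 10% of the viewport height.
double GizmoWorldScale(double distanceToCamera, double fovY, double screenFraction)
{
    // Height of the view frustum at the gizmo's distance:
    const double frustumHeight = 2.0 * distanceToCamera * std::tan(fovY * 0.5);
    return frustumHeight * screenFraction;
}
```

The scale grows linearly with distance, so doubling the camera distance doubles the world scale and the gizmo looks identical on screen.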

Alright, I’m having a few funny fails that might be nice to share :slight_smile:

So, first I tried re-scaling the ‘custom-made move tool’ every Tick.
It works fine and I can use the tool… up to a certain distance.
In fact, to check whether my pointer should interact with the Move Tool, I’m using the node below. It apparently only checks up to a certain distance.
For “normal” games this is surely enough, but I’m making a very, very large space game, and there it’s not sufficient :confused:


The final result is that the move tool is displayed and rescaled all the time, but I cannot interact with it because it constantly ends up too far from my camera, which I believe is the starting point of the ray trace.

I then thought of keeping the Move Tool always in front of the camera, so I constantly re-positioned the root of the tool there.
It does what it should, but there’s an evident delay, and the move tool ends up floating funnily in front of the camera, creating only confusion :smiley:

What I really wonder about is: how is the Move Tool rendered by Unreal Engine in the viewport? I’d love to replicate the fact that the move tool always looks identical and never changes size, regardless of where the object it’s attached to is located.
Any ideas? :thinking:

The distance is dictated by the player controller:


Try to stay below ~2.5M units

Aha! I didn’t know about that.
Of course I immediately entered several billion units :smirk: :rofl:

The issue seems to be that beyond a certain distance it has difficulty getting a hit result: at range it starts ‘flickering’ between hit and no hit.
For the scale of gameplay I need, it’s unfortunately not sufficient :confused:
I have anyway learned quite a lot by attempting this implementation :smile:

I still wonder how exactly the Epic folks implemented the Move Tool: how do you make an object stay perfectly identical in the viewport, regardless of where it’s located?
Is it placed on a billboard? Is it actually a 3D UI widget? :thinking:

That’s floating-point error - a.k.a. computers are bad at math. Beyond certain ranges, you run into accuracy problems. Physics simulation becomes unstable. You can’t even spawn actors beyond roughly a million units from the origin - at 1 uu = 1 cm, that’s about 10 km. You get a cube roughly 20 km on a side to play in. Take it or leave it. :smiley: But…
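To make the “bad at math” part concrete, here’s a tiny stand-alone C++ illustration (nothing UE-specific, function names made up): far from the origin, a 32-bit float can no longer even represent a small position nudge, while a 64-bit double shrugs it off.

```cpp
// A float has 24 bits of mantissa, so between 2^23 (~8.4M) and 2^24 (~16.7M)
// its resolution is a whole unit: sub-unit position updates are rounded away.
bool NudgeSurvives(float position, float nudge)
{
    const float moved = position + nudge;   // force rounding to float
    return moved != position;
}

bool NudgeSurvivesDouble(double position, double nudge)
{
    const double moved = position + nudge;
    return moved != position;
}
```

At ten million units out, a quarter-unit nudge to a float position simply vanishes; near the origin, or with doubles, it survives.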

…do have a look at World Composition (especially origin shifting):

Also, perhaps the spaceships don’t need to be real-sized. Maybe your 1 km long star cruiser can be only 1000 uu (10 meters). If everything is scaled down, no one will notice, and you suddenly have much more space to play with.

And that star that is 15ly away is only that far because you tell the player it is. When they lightfold-warp-hyperspace jump there, you decrease the number. It’s smoke and mirrors 99% of the time.

This will be mitigated in UE5 thanks to Large World Coordinates - 64-bit double precision. Or so the rumours say:

Is it located on a billboard? Is it actually a 3D UI widget?

Not sure how it’s made. Sorry.

Hey, thanks a lot for the very detailed answer! :wink:
Yes, I’m familiar with the limitation and I imagined this was the issue.
I decided not to use world composition, but to use each level separately.
I also plan to move to UE5 precisely due to the double precision :slight_smile:
Hopefully it will work better and certain “hitches” will disappear.

A note regarding scaling the game down to smaller units: I have both the ships and the command bridges that host the cameras. So far, the only way I’ve found to make this work is to use a consistent scale for everything; otherwise I see the floating-point precision hitches on the command bridge, where the characters are small :smiley:

I’ll keep trying to figure out the Move Tool as it’s implemented in the editor. It’s really clever and would be the perfect tool here.
Otherwise I’ll fall back to plan B, which is to make a UI widget and move the waypoints around with buttons. Very clunky, but it would surely work :laughing:

Consider opening a new thread - maybe in Rendering (?) - asking how this particular gizmo is made and how to replicate it. Tag me, I’d like to know, too!