Can I access the grabbed Actor on the HoloLens?

For my HoloLens app, I am trying to access the grabbed/selected actor. I know I can get it from the Hit Actor output of a Line Trace By Channel node. However, I cannot get the line trace to follow my hand's aim (the far beam). How can I do this? Or is there a simpler way to access the other actor?

Thanks in advance.

PS: I found an example of a line trace in the documentation (Hand tracking in Unreal - Mixed Reality | Microsoft Docs), but it was created for interacting with a widget, and I want to interact with all kinds of objects, so I don't know how to adapt it. Furthermore, the AimPosition gives a world location that is not the same as the current location of the left hand controller.
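
For reference, a minimal Unreal C++ sketch of the line-trace-by-channel approach described above. The helper name is made up, and the start position and direction are left as parameters, since deriving them from the hand's far beam is exactly the open question:

```cpp
#include "GameFramework/Actor.h"
#include "Engine/World.h"
#include "CollisionQueryParams.h"

// Hypothetical helper: the C++ equivalent of the Line Trace By Channel
// node followed by Break Hit Result -> Hit Actor.
static AActor* TraceForActor(UWorld* World, AActor* ActorToIgnore,
                             const FVector& Start, const FVector& Direction,
                             float Distance = 1000.f)
{
    FCollisionQueryParams Params;
    Params.AddIgnoredActor(ActorToIgnore); // don't hit the tracing actor itself

    FHitResult Hit;
    const FVector End = Start + Direction.GetSafeNormal() * Distance;
    if (World->LineTraceSingleByChannel(Hit, Start, End, ECC_Visibility, Params))
    {
        return Hit.GetActor(); // the pointed-at / grabbed actor
    }
    return nullptr;
}
```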

Silly question, but have you set the Draw Debug Type to something like One Frame and verified that the line trace is not doing what you want? Have you tried printing the AimPosition? Does it look reasonable relative to the world location of the actual motion controller?

Barring that, rather than using “Get Motion Controller Data”, I would use the actual motion controller component. Hook it up just like you have it: take the component's world location as the trace start, and add its forward vector multiplied by some distance for the end. I’ve had issues with “Get Motion Controller Data” in the past; it’s unclear whether it is available for all setups.
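
A rough C++ equivalent of that setup, assuming a UMotionControllerComponent already exists on the pawn (the function name and the 1000-unit trace distance are placeholders):

```cpp
#include "MotionControllerComponent.h"
#include "DrawDebugHelpers.h"
#include "Engine/World.h"

// Trace straight out of an actual UMotionControllerComponent instead of
// relying on "Get Motion Controller Data". The caller passes the component
// (e.g. one added to the pawn in its constructor).
static AActor* TraceFromController(UMotionControllerComponent* Controller,
                                   float Distance = 1000.f)
{
    const FVector Start = Controller->GetComponentLocation();
    const FVector End = Start + Controller->GetForwardVector() * Distance;

    // One-frame debug line, like setting Draw Debug Type on the Blueprint node.
    DrawDebugLine(Controller->GetWorld(), Start, End, FColor::Green);

    FCollisionQueryParams Params;
    Params.AddIgnoredActor(Controller->GetOwner()); // ignore the owning pawn

    FHitResult Hit;
    if (Controller->GetWorld()->LineTraceSingleByChannel(
            Hit, Start, End, ECC_Visibility, Params))
    {
        return Hit.GetActor();
    }
    return nullptr;
}
```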


Thanks for your reply! I have set the Draw Debug Type to Duration and I do see the line trace while playing the scene, but if I print the location and rotation, the values are always 0. I also don’t see the transform values change in the Details panel. Furthermore, I don’t think the HandInteractionActors (which show the XR pointers and cursor ring) have a motion controller component, because I don’t see one in the Details panel. Maybe that is why it always returns 0? And I cannot add the component myself, because the actor is implemented in C++.

I can get the transform data of my XR simulation hands, but their rotation is always different from the pointers' rotation, so I don’t think that is usable.

Since I don’t know how to access the grabbed object directly, or how to spawn a line trace at the exact position and rotation of the hand pointers, I decided to make all the necessary actors interactable via the Generic Manipulator component. From there I can create a Blueprint that performs my functions (see the sketch below). I hope this will not have too large an impact on performance.
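
As a rough illustration of that setup, a minimal sketch of a custom actor combining a static mesh with UX Tools' Generic Manipulator component. The class name is hypothetical, and the include path may differ between UX Tools versions:

```cpp
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/StaticMeshComponent.h"
// UX Tools plugin header; the exact path may vary per UX Tools version.
#include "Interactions/UxtGenericManipulatorComponent.h"
#include "ManipulableActor.generated.h"

// Hypothetical replacement actor: a static mesh plus a Generic Manipulator,
// so the hand rays can grab it and Blueprint logic can react to it.
UCLASS()
class AManipulableActor : public AActor
{
    GENERATED_BODY()

public:
    AManipulableActor()
    {
        Mesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("Mesh"));
        RootComponent = Mesh;

        // Makes the actor grabbable/movable via the UXT hand interaction rays.
        Manipulator = CreateDefaultSubobject<UUxtGenericManipulatorComponent>(TEXT("Manipulator"));
    }

    UPROPERTY(VisibleAnywhere)
    UStaticMeshComponent* Mesh;

    UPROPERTY(VisibleAnywhere)
    UUxtGenericManipulatorComponent* Manipulator;
};
```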

The only downside for now is that I have to confirm deleting a reference for every actor (hundreds of actors) when I convert the actor type from Static Mesh Actor to a custom actor that contains the static mesh and the manipulator component. (EDIT: Using “Replace Selected Actors with” keeps the references, so no confirmations are necessary anymore.)

There are two positions and rotations that the OpenXR runtime will return per hand. GripPosition is equal to the motion controller location, while AimPosition is a point somewhere around the top of each controller and a good origin for logic such as a teleport trace. Both poses are used in the VR Template (VRPawn) if you want something to reference.
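
A sketch of tracing from that aim pose via “Get Motion Controller Data” (the helper name and trace distance are assumptions; the fields come from the engine's FXRMotionControllerData struct):

```cpp
#include "HeadMountedDisplayFunctionLibrary.h"
#include "Engine/World.h"

// Trace from the OpenXR aim pose returned by "Get Motion Controller Data".
// AimPosition/AimRotation point along the far beam; GripPosition matches
// the motion controller location.
static AActor* TraceFromAimPose(UObject* WorldContext, EControllerHand Hand,
                                float Distance = 1000.f)
{
    FXRMotionControllerData Data;
    UHeadMountedDisplayFunctionLibrary::GetMotionControllerData(WorldContext, Hand, Data);
    if (!Data.bValid)
    {
        return nullptr; // hand/controller not tracked this frame
    }

    const FVector Start = Data.AimPosition;
    const FVector End = Start + Data.AimRotation.GetForwardVector() * Distance;

    FHitResult Hit;
    UWorld* World = WorldContext->GetWorld();
    if (World && World->LineTraceSingleByChannel(Hit, Start, End, ECC_Visibility))
    {
        return Hit.GetActor();
    }
    return nullptr;
}
```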