This thread was helpful; in it, Epic points out the fix you need to make to the engine code to get your controller code to compile and link.
We made a custom actor class for the controller, with IDs set for left and right. Then, in its update, it reads the position and orientation of the controller and sets the actor's position.
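A minimal sketch of that per-controller actor pattern. This uses simple stand-in types defined here (`Vec3`, `ControllerActor`) rather than UE's `AActor`/`FVector`, and a hypothetical `GetTrackedPose` helper standing in for `USteamVRFunctionLibrary::GetTrackedDevicePositionAndOrientation`, so it is a self-contained illustration rather than engine code:

```cpp
#include <cassert>

// Stand-in for UE4's FVector; in engine code you would use FVector itself.
struct Vec3 { float X = 0, Y = 0, Z = 0; };

// Hypothetical stand-in for USteamVRFunctionLibrary::GetTrackedDevicePositionAndOrientation.
// Returns false when the device with this id is not currently tracked.
bool GetTrackedPose(int ControllerId, Vec3& OutPos) {
    // Fake data for the sketch: device 1 is the left controller.
    if (ControllerId != 1) return false;
    OutPos = {10.f, 20.f, 30.f};
    return true;
}

// One actor per physical controller, distinguished by its device id.
struct ControllerActor {
    int ControllerId;  // set to the left or right device id at spawn
    Vec3 Position;     // actor location, updated every tick

    void Tick() {
        Vec3 TrackedPos;
        if (GetTrackedPose(ControllerId, TrackedPos))
            Position = TrackedPos;  // follow the physical controller
    }
};
```

The point of the design is that left and right are the same class; only the stored device id differs.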
We can read the button events for the left controller via the Input system. I am still waiting on a response about how to get the right controller's.
You can get the actual Vive controller mesh from the Steam resources at SteamLibrary\steamapps\common\SteamVR\resources\rendermodels\vr_controller_05_wireless_b. We render those meshes where the controllers are (though we stylized our own models) to show the controllers in space.
Has this been resolved? I tried putting print statements on the translation output of GetTrackedDevicePositionAndOrientation, and I didn't get anything back. Is there a more detailed explanation of how to get the Vive controllers working (Epic?), both button- and position/rotation-wise, in Blueprints or code?
(Working with 4.8.3)
If you allow translation or rotation of the initial starting point, so that the Vive headset's origin moves away from world (0,0,0), you need to compensate for that in your controller code.
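A sketch of that compensation, assuming the pawn is only ever yawed about the Z axis (a simplification; a full solution would apply the pawn's whole rotation, e.g. via its quaternion). The stand-in `Vec3` type and the function names here are illustrative, not UE API:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float X = 0, Y = 0, Z = 0; };

// Rotate a point about the Z axis by YawDegrees (UE yaw is a rotation about Z).
Vec3 RotateYaw(const Vec3& P, float YawDegrees) {
    const float R = YawDegrees * 3.14159265f / 180.f;
    const float C = std::cos(R), S = std::sin(R);
    return { P.X * C - P.Y * S, P.X * S + P.Y * C, P.Z };
}

// Tracked-space controller position -> world space, compensating for a pawn
// that has been both moved and yawed away from the world origin.
Vec3 TrackedToWorld(const Vec3& TrackedPos, const Vec3& PawnLoc, float PawnYaw) {
    const Vec3 Rotated = RotateYaw(TrackedPos, PawnYaw);
    return { Rotated.X + PawnLoc.X, Rotated.Y + PawnLoc.Y, Rotated.Z + PawnLoc.Z };
}
```

Rotating first and then translating matters: a controller one unit in front of a pawn that is yawed 90 degrees ends up one unit to the pawn's new forward side, not one unit along the world X axis.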
At one point we were letting the user rotate the world start position via the spawn point, but that proved not very useful, so we ripped it out. Some bits of that code are still there, though.
We haven't had any problems with it since we did this, however.
This is the Tick code we use for each Vive controller. It sets the controller in world space relative to the Player Pawn.
(Note: as I said, this does not currently compensate for the initial pawn being rotated. You can support that; we just aren't right now.)
    FTransform playerTrans = FTransform::Identity;
    if (UWorld* World = GetWorld())
    {
        if (APlayerController* PlayerController = World->GetFirstPlayerController())
        {
            if (APawn* Pawn = PlayerController->GetPawn())
            {
                playerTrans = Pawn->GetTransform();
            }
        }
    }

    FVector resPos;
    FRotator resRot;
    if (USteamVRFunctionLibrary::GetTrackedDevicePositionAndOrientation(ControllerId, resPos, resRot))
    {
        // Offset the tracked position by the pawn's world location.
        Position = resPos + playerTrans.GetLocation();
        Orientation = resRot;
    }
I think part of the question/problem is that I'm unsure why the calls to GetTrackedDevicePositionAndOrientation, and SteamTouch1, 2, 3, etc., aren't giving me any messages. Are they broken, not yet implemented, or am I using them wrong?
When I look at that code, I see a boolean coming back from the GetTrackedDevicePositionAndOrientation call; you then add resPos to the translation, set the rotation to resRot, and set the actor (presumably the controller) to that position. So is the controller a separate actor class? I thought I had to have mine in the Character actor, so that I could communicate functionality back to the Character class and have it interact. I was thinking it would be similar to the Leap integration…
We created the controllers as separate custom actors. We want to send messages to them, and they have custom animation, collision spheres, and such, so it made sense for us. The controllers do register with the player pawn so it is aware of them, but they are separate actors.
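That registration pattern might look something like the following sketch. The types and the `RegisterController` method are hypothetical stand-ins for the poster's classes (in UE these would be `AActor`/`APawn` subclasses, likely using `TWeakObjectPtr` rather than raw pointers):

```cpp
#include <cassert>
#include <cstddef>

// Stand-in for the poster's per-controller actor class.
struct ControllerActor { int ControllerId = 0; };

// The pawn keeps pointers to its two controller actors so gameplay code can
// reach them, but the controllers remain independently spawned actors.
struct VRPawn {
    ControllerActor* Left  = nullptr;
    ControllerActor* Right = nullptr;

    // Called by each controller actor after it spawns.
    void RegisterController(ControllerActor* C, bool bIsLeft) {
        (bIsLeft ? Left : Right) = C;
    }
};
```

This keeps the controllers as first-class actors (their own Tick, collision, and animation) while still letting the character reach them for gameplay logic, which addresses the "do they have to live inside the Character?" question above.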
The code you see sets the actor location for the controller if it gets info from the tracked device, but it offsets this by the pawn's location. On the Vive, the pawn location is the center of the tracking space; the head position is an offset from that.