That happened to me as well at first; it depends on the orientation of the real-world tracker versus the orientation of the 3d model you’re using inside UE4.
There are two ways to solve this issue.
First, download the 3d model of the tracker from the Vive website and import it into Maya/Max/whatever. Position the 3d model so that the lower circular part sits at the origin, and orient it so you can tell which way it faces; the small LED indicator on one of the “horns” will help you.
Then import the file into UE4 and use the tracker as the mesh in VR. In short, you need to check the orientation of the virtual tracker’s 3d model while you’re holding the real tracker in your hand.
If the orientations match you’re good to go; if they don’t, you can visually work out how to rotate the virtual tracker so that they do.
The alternative method is to use a scene component as the parent of the 3d object you’re driving with the tracker: the tracker drives the scene component in VR, while you’re free to orient the child ( the mesh you use as the bat ) however you like.
If you pause the tutorial at 1:41, you can see that on the Components tab I have Tracker1 and Tracker2 ( should be L and R ) with the hands attached to each one. In your case you should add a scene component, parent the tracker mesh ( your bat ) under it, and use the child’s relative rotation to re-orient the 3d model so that it behaves according to the real-life movement you’re making.
If you have trouble, let me know.