[VIVE] Matching VR scene to physical model

Hi,
I have a scene that matches a room in my house. How do I align the Vive so that positions match in both VR and reality?

cheers,

Hi,

You need to get the room setup for the Vive as precise as possible; then you will get a good result.

Hi, is there a way to set the origin and rotation through the controller, or with the keyboard, to match the physical mock-up? The scene matches the mock-up, but I need a way to ensure that when the user sits down in VR they are actually sitting in the physical seat.

One technique I often use is to place a marker (say a Scene Component within a Pawn or a Target Point within a level) then calculate the difference between the marker world position and the VR Camera world position. Apply the difference vector to the VR Origin through an Add World Offset and that will take the VR Camera exactly where the marker is (check Teleport to just move there without other side effects). If you want to maintain the height of the user in VR, you can just project the difference vector on the XY plane and do a pure horizontal movement.
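The vector math behind that technique can be sketched in plain Python (this is only the offset calculation, not Unreal code — the actual Add World Offset call happens in Blueprint or C++, and the marker and camera positions below are made-up example values):

```python
def align_offset(marker_pos, camera_pos, keep_height=True):
    """Offset to apply to the VR Origin so the VR Camera lands on the marker.

    Positions are (x, y, z) world coordinates, z up as in Unreal.
    If keep_height is True, the difference is projected onto the XY
    plane so the user's real-world height is preserved.
    """
    dx = marker_pos[0] - camera_pos[0]
    dy = marker_pos[1] - camera_pos[1]
    dz = 0.0 if keep_height else marker_pos[2] - camera_pos[2]
    return (dx, dy, dz)

# Example: camera at (120, 40, 180), marker (seat position) at (500, 300, 100).
# Applying this offset to the VR Origin (Add World Offset with Teleport checked)
# moves the camera horizontally onto the marker.
offset = align_offset((500, 300, 100), (120, 40, 180))  # (380, 260, 0.0)
```

The key point is that the offset is applied to the VR Origin, not to the camera itself, because the camera is driven by the HMD tracking and cannot be moved directly.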

Edit: by the way, this method maintains the rotation of the VR Camera. To also alter the rotation of the camera and align it to a given direction you need a bit more math (roto-translation of the VR Origin). To first align with the play area and then move the VR Camera in place, you can call Reset Orientation and Position before the move.
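For the roto-translation, here is a minimal sketch of the math in Python (again not engine code; the function name and the yaw-only convention are my own assumptions). The idea is to rotate the VR Origin about the camera's world position around the Z axis, so the camera stays where it is but ends up facing the target yaw:

```python
import math

def rotate_origin_about_camera(origin_pos, origin_yaw_deg,
                               camera_pos, camera_yaw_deg, target_yaw_deg):
    """Rotate the VR Origin about the camera's world position (Z axis only)
    so the camera keeps its place but ends up facing target_yaw_deg."""
    delta_deg = target_yaw_deg - camera_yaw_deg
    delta = math.radians(delta_deg)
    c, s = math.cos(delta), math.sin(delta)
    # Vector from the camera (the pivot) to the origin, rotated by delta
    vx = origin_pos[0] - camera_pos[0]
    vy = origin_pos[1] - camera_pos[1]
    new_pos = (camera_pos[0] + c * vx - s * vy,
               camera_pos[1] + s * vx + c * vy,
               origin_pos[2])
    return new_pos, origin_yaw_deg + delta_deg
```

Because the camera is the pivot of the rotation, its world position is unchanged; only its facing direction (and the origin's transform) change.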

Thanks for the info. Is there a way I can point the motion controller at two or three points in the scene, one at a time, and match each point to the corresponding real object in the world? For example, take four corners of a table in Unreal and match those four points to the real edges of a table of the same dimensions, in order to rotate the scene to match the physical mock-up.

Assuming the center of the room already matches, in principle one point is enough to establish a roto-translation, and another one to also establish a scaling factor if needed. I have to think about it… Also, I don't really use room scale, so I cannot test it for you. There was a somewhat related question some time ago:

https://forums.unrealengine.com/development-discussion/vr-ar-development/1362635-4-17-vr-rotation

Look at the solution there, maybe it helps you.

Anyway, what happens now? Why do you need such a complicated system? Isn’t the regular calibration/centering enough? Or am I missing something?
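If you do want to align from measured point pairs, the underlying math is a 2D similarity transform on the XY plane. A sketch in Python, assuming two virtual points and their two measured real-world counterparts (the function name is mine, not an engine API):

```python
import math

def similarity_from_two_points(va, vb, ra, rb):
    """2D (XY-plane) similarity transform mapping two virtual points
    (va, vb) onto two measured real points (ra, rb).

    Returns (scale, yaw_radians, translation) such that
    real = scale * R(yaw) @ virtual + translation.
    """
    vdx, vdy = vb[0] - va[0], vb[1] - va[1]
    rdx, rdy = rb[0] - ra[0], rb[1] - ra[1]
    # Scale: ratio of segment lengths; yaw: difference of segment headings
    scale = math.hypot(rdx, rdy) / math.hypot(vdx, vdy)
    yaw = math.atan2(rdy, rdx) - math.atan2(vdy, vdx)
    c, s = math.cos(yaw), math.sin(yaw)
    # Translation: whatever is left after rotating and scaling the first point
    tx = ra[0] - scale * (c * va[0] - s * va[1])
    ty = ra[1] - scale * (s * va[0] + c * va[1])
    return scale, yaw, (tx, ty)
```

With more than two point pairs (e.g. four table corners) you would solve the same problem in a least-squares sense instead, which averages out measurement error in the controller positions.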

Hello,

Sorry for reviving this thread,
but I'm struggling to make the VR Pawn camera (from the VR template) end up at the exact same place every time I set the world location for it or for the VrOrigin scene component.

What I am trying to do on BeginPlay is set the world location for the VRorigin or the camera (I tried both and the result is the same): get HMD Orientation and Position, break the position vector, make a vector from it, and subtract it from the world location of the placeholder object where I want the player to be spawned or teleported every time at BeginPlay.

The problem is that I always get a strange offset because my HMD is not at the center of the room setup. Is it possible to set the camera to the exact same location every time, no matter where my HMD is positioned within the room setup? I need a more precise spawn in a tiny space; sometimes I get spawned inside a wall, for example, or far away from the actual spawn location.

I've tried what @vr_marco suggested, but I am struggling with the calculation part and with applying the difference vector to the VR Origin with Add World Offset.

Can you guys explain to me how to do it in more detail, please?

Regards!