VR Expansion Plugin

Thanks so much!
So adding a follow camera or copying the 3rdPersonCharacter class content from the template are both OK?
Is making a follow camera the better and more powerful option? Should I add one more camera, or reuse the VRReplicatedCamera?
Is a spring arm enough? If it needs a custom script, is there a tutorial for that?

Hello,

I need help setting up PrimaryGrip for a custom hand mesh. I have the Marketplace asset “Hands for VR: Basic”; the hands are working fine, but the problem appears when I change PrimaryGrip in GunBase. I swapped the hand Target Animation (I tried both the default and grip versions) and the Visualization Mesh, but the default grip animation still plays every time I pick up the weapon.

What else do I need to change?

The grip works fine on the default mannequin hands.

You also need to use an animation BP that gets the hand pose. Those hand socket components output a pose when queried, but they don’t just magically apply it to a hand; the animation blueprint has to apply it.

Thanks for the answer. I’ve spent the last few hours checking and comparing blueprints. My Animation Blueprint is almost identical to RightHand_AnimBP (I actually copied the logic and settings from it directly, based on the tutorial Hands for VR: Basic - Unreal Engine 4.13 VR Template - YouTube).
Everything about grasping objects works OK (e.g. for PickupCubes); only the poses won’t work.
The only difference is in the AnimGraph: in the example it is set to play the Right_Grab sequence, but when I use a similar sequence in my blueprint the hand stops animating entirely, and I have to go back to setting a Blendspace like this:

Imgur

I really have no idea what else I can do to get it working.

Look at the GraspingHandsAnimBP and read the relevant section of the website: VR Expansion Plugin Documentation – VR Expansion Plugin

The grasping hands in the example query the hand sockets to get the requested pose (if there is one) and then apply it in the animation BP. You are comparing against Epic’s AnimBP, not the one used by default in the template on the spawned-in hands.
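Roughly, that flow in C++ looks something like the sketch below, assuming a UAnimInstance subclass on the hand mesh. The member names (CurrentHandSocket, HandPoseSnapshot, bHasCustomPose) are ones you would add yourself, and GetBlendedPoseSnapShot plus the include path are the hand socket API as I remember it, so verify them against your plugin version:

#include "Animation/AnimInstance.h"
#include "Animation/PoseSnapshot.h"
#include "HandSocketComponent.h" // header path may differ per plugin version

void UMyHandAnimInstance::NativeUpdateAnimation(float DeltaSeconds)
{
    Super::NativeUpdateAnimation(DeltaSeconds);

    if (CurrentHandSocket) // UHandSocketComponent* stored when the grip happened
    {
        // Ask the socket for its authored pose and expose it to a Pose Snapshot
        // node in the AnimGraph, blended against the normal grab blendspace.
        bHasCustomPose = CurrentHandSocket->GetBlendedPoseSnapShot(HandPoseSnapshot, GetSkelMeshComponent());
    }
    else
    {
        bHasCustomPose = false; // no authored pose, fall back to the blendspace
    }
}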

How can I let an NPC walk right into the VR hands, with the NPC staying physical (e.g. getting pushed away when the hands are waved hard)?

I have a pawn set up for VR with the following hierarchy: SphereCollision->CamNeck(SceneComp)->Camera. They all sit on top of each other at a completely zeroed-out location. The sphere collision is just a sphere that barely surrounds the VR user’s head. It always stays on top of the camera; even when the camera moves around in VR, I adjust everything to stay together in the Pawn’s tick.

The problem is that when I launch the game in VR Preview, I can see the mirror pop-up window and everything looks good, but the moment I enter VR the VR system takes over and moves my camera away from the parent components. It gets moved exactly the distance my head is from the VR playspace floor/center. So the VR system is forcing my root component (SphereCollision) to act as the floor, and it pushes the camera up and away to mirror the real-world head-to-floor difference. Now I’ve tried all the normal stuff:

tmpTrackSys->SetTrackingOrigin(EHMDTrackingOrigin::Eye);

I’ve also looked into having a notification occur when the headset is put on, but the VRNotifications delegates are not actually hooked up under the hood of UE4; binding to them does nothing.

The plugin looks great, but I already have all the other VR interactions built so I don’t need all of that. Additionally, my pawn is very specific to the game I’m building. Can I ask how you would go about resolving this issue? It seems like somewhere in the plugin there may be a way to catch the moment of entering VR.

A tracking origin of Eye will still allow offsets upwards; the starting position of the HMD is the zero point unless you re-center everything during play. It’s never going to “lock” the camera in place.

For that you would either have to offset the camera’s root by the camera’s relative position, or turn off the bool on the camera that makes it follow the HMD (I don’t suggest doing that, but it’s an option).
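To sketch the first option against the hierarchy described above (a rough example; the component names and whether you do this in Tick or a later update callback depend on your setup):

// Counter the HMD-driven offset on the camera's parent every frame so the
// camera stays on top of the sphere collision at the root.
void AMyVRPawn::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    const FVector HMDOffset = Camera->GetRelativeLocation();
    CamNeck->SetRelativeLocation(-HMDOffset);
}

// The second option mentioned above (not suggested) is the stock camera's
// bLockToHmd / "Lock to HMD" flag, which stops the HMD from driving it at all:
Camera->bLockToHmd = false;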

What is the API in C++ & Blueprint for having the VR hands overlap objects (e.g. a Character with physics or a ragdoll)?
For example, how do I make a walking character become a ragdoll when the VR hands touch it?

I made a blueprint class that inherits from AGrippableCharacter and also copied the EventGrip logic from the GrippableManniquin blueprint. Then I set “Simulation Generates Hit Events” & “Generate Overlap Events” to true. The GrippableManniquin can be grabbed by touching the skeletal body’s surface with the fingers, BUT the AGrippableCharacter cannot… has anyone met this problem or does anyone know how to fix it? Thanks!

Don’t have the mesh collide with the capsule when it ragdolls; it has to ignore the pawn channel, or the capsule’s collision needs to be removed when it’s ragdolled. Otherwise it’s going to fling out of the grip.

Other than that, they work the same; you might want to check the grip settings.
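In C++ that suggestion could look roughly like this (a minimal sketch against a standard ACharacter-derived class; the RagdollCharacter function name is just for illustration):

#include "GameFramework/Character.h"
#include "Components/CapsuleComponent.h"
#include "Components/SkeletalMeshComponent.h"

void RagdollCharacter(ACharacter* Character)
{
    // Either stop the mesh from colliding with pawns...
    Character->GetMesh()->SetCollisionResponseToChannel(ECC_Pawn, ECR_Ignore);

    // ...or turn off the capsule's collision entirely while ragdolled,
    // so the simulated body can't be flung out of the grip by it.
    Character->GetCapsuleComponent()->SetCollisionEnabled(ECollisionEnabled::NoCollision);

    // Ragdoll the mesh itself.
    Character->GetMesh()->SetCollisionEnabled(ECollisionEnabled::QueryAndPhysics);
    Character->GetMesh()->SetSimulatePhysics(true);
}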

A list of recent fixes/changes is live on the 4.26+ repos, including some new hand socket component stuff:

https://vreue4.com/4-26-patch-notes?section=misc-fixes-and-cleanup-06-28-2021

The list is a little light, as I am iterating on an OpenXRExpansion branch and have an upcoming week off of work where I plan on handling the big tasks.

Thanks! Should I dynamically remove/add the CapsuleComponent, or just dynamically set all of the CapsuleComponent’s collision to false? For testing I’m using blueprints too… it seems harder to do in blueprint than in C++.

How do I keep the VR hands (Player Character) from being pushed back when an NPC runs into the player, while the player’s hands are physical?
Should I set gravity to a very big value, or is there a better way?

Is it possible to get the VR hands’ velocity and direction, and then convert those values into a force value?

You would have to manage the NPC’s physics collision and mass settings.

As for velocity and direction, sampling the hands’ position frame to frame or (since they are simulating) directly querying the PhysicsVelocity node will work.
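Both approaches look roughly like this in C++ (HandMesh stands for whatever simulating primitive component represents the hand in your setup, and PreviousHandLocation is a member you would add yourself):

// 1) Query the simulating body directly, the C++ equivalent of the PhysicsVelocity node:
const FVector HandVelocity = HandMesh->GetPhysicsLinearVelocity();
const FVector HandDirection = HandVelocity.GetSafeNormal();

// 2) Or sample the position frame to frame, e.g. in Tick:
const FVector NewLocation = HandMesh->GetComponentLocation();
const FVector SampledVelocity = (NewLocation - PreviousHandLocation) / DeltaSeconds;
PreviousHandLocation = NewLocation;

// The motion controller component itself also reports a velocity via
// GetComponentVelocity(), but as noted below that is separate from the
// constraint-driven hands. Turning hand velocity into a push could then be e.g.:
// OtherComp->AddImpulse(HandVelocity * HandMesh->GetMass());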

OK, thanks. So normally I shouldn’t change the VR hands’ mass setting and related settings?

OK, I thought this already existed in the VR headset’s SDK. If not, then directly querying the PhysicsVelocity seems better. Which actor does that node belong to, the Vive_Character or the hands?

Well, you asked about the physical hands’ velocity, which is entirely different from the motion controllers themselves. I fill out the motion controllers’ component velocity, but the “hands” are driven by constraint strengths and collision, so they will have different values.

Hello,

We’re using your plugin in our multiplayer VR game and we’re very happy with it. Thanks for the great job!

I have a question though:

How can I stop the capsule of the simulated proxies from rotating when the associated player moves their HMD?
I would instead move the head of the skeletal mesh to match the HMD rotation, since we handle the actor rotation ourselves.

EDIT: While I’m at it, what advice can you give us regarding the scale of the character? Right now, when I let my arms hang, the controllers rest on my legs roughly between my knees and my hips, but in the game they end up more at the level of my calves.

Thanks!