VR Expansion Plugin

Yes, there is a bug or something. But now I know what happens.

The system works as you said. My mistake was the following:

When I saw your Q&A video on YouTube where you explain Angular Stiffness and Damping, I started trying out the functionality by tweaking the values. When I set them to high values like you did, I got the opposite effect from what I wanted. But now I know what the problem is. If you set the Angular Stiffness and Damping in the BP settings, it works as I said. But if you change them during play, low values don’t apply a physical delay in movement, and high values do apply the delay.

My mistake is that when I tested, values of 100 or more didn’t change anything, and values below 20 applied a physical delay.

Nah, it was just using the passed in grip struct instead of the found one for that node.

I fixed it, and will upload today.

I modified the change log post above to include them.

Edit: Patch released with the fix.

Thanks, I love you so much! Downloading!

Tested and works as intended. Thanks again!


Recycling

I’m trying to rework one of my systems; I’m not satisfied with it.

I know that with a Secondary Grip we can smooth it with the SecondaryGripScaler. But I want much more than that. I want to smooth the rotation by about 2 or 3 times. For example, if I rotate 10 degrees, the gripped object only moves 5 degrees (multiplier 2) or 2.5 degrees (multiplier 4).

But I don’t know how you calculate the rotation between two grips. Obviously, the rotation with one grip is the controller hand rotation. But with a second grip, I don’t know how your plugin does it. I suppose it calculates the LookAt rotation between the Grip1 location and the Grip2 location, but I’m not sure about that.

How can I achieve that?

Thank you in advance.

You’ll have to override the default logic for that; that functionality is coming around the 4.20 update when I make some gripping changes. Until then it would have to be a custom grip where you run the logic yourself. Originally the smoothing worked kind of like that, but it felt really bad for most uses.

You can edit the 1 Euro Low Pass Filter that the secondary smoothing uses, though, and add a really large smooth to it, which will significantly delay it. ProjectSettings -> VRExpansionPluginSettings has the relevant settings for that.
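For intuition, here is a minimal, generic sketch of a 1 Euro low-pass filter (this is not the plugin’s code, and the parameter names are generic): the lower the minimum cutoff, the heavier the smoothing and the further the output lags behind the input, which is where the added delay comes from.

```cpp
#include <cmath>

// A single-pole low-pass filter: higher Alpha tracks the input more closely,
// lower Alpha smooths harder (and lags more).
struct LowPassFilter
{
    bool bInitialized = false;
    float Prev = 0.f;

    float Filter(float Value, float Alpha)
    {
        if (!bInitialized)
        {
            Prev = Value;
            bInitialized = true;
        }
        Prev = Alpha * Value + (1.f - Alpha) * Prev;
        return Prev;
    }
};

struct OneEuroFilter
{
    float MinCutoff = 1.f;   // lower -> heavier smoothing -> more lag (the "large smooth")
    float Beta = 0.f;        // higher -> cutoff rises with speed, reducing lag on fast moves
    float DeltaCutoff = 1.f; // cutoff used when smoothing the speed estimate
    LowPassFilter ValueFilter;
    LowPassFilter SpeedFilter;

    static float AlphaFromCutoff(float Cutoff, float DeltaTime)
    {
        const float Tau = 1.f / (2.f * 3.14159265f * Cutoff);
        return 1.f / (1.f + Tau / DeltaTime);
    }

    float Filter(float Value, float DeltaTime)
    {
        // Estimate and smooth how fast the value is changing.
        const float Speed = ValueFilter.bInitialized
            ? (Value - ValueFilter.Prev) / DeltaTime
            : 0.f;
        const float SmoothSpeed = SpeedFilter.Filter(Speed, AlphaFromCutoff(DeltaCutoff, DeltaTime));

        // Fast motion raises the cutoff (less lag); slow motion stays near MinCutoff.
        const float Cutoff = MinCutoff + Beta * std::fabs(SmoothSpeed);
        return ValueFilter.Filter(Value, AlphaFromCutoff(Cutoff, DeltaTime));
    }
};
```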

Also no, I don’t use LookAt; I don’t like how it handles the rotation. I map hand location differences around the held object and use the quaternion rotation from vector.
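As a rough illustration of that idea (a hedged sketch only, not the plugin’s actual implementation; the function and parameter names are made up), you can build the delta rotation from the hand-to-hand direction at grip time versus now and then scale it down, which is also one way to get the “rotate half as much” behaviour asked about above:

```cpp
#include "CoreMinimal.h"

// Sketch: compute a scaled two-handed delta rotation from hand positions.
FQuat GetScaledTwoHandRotation(
    const FVector& PrimaryHandLoc,         // holding hand, current frame
    const FVector& SecondaryHandLocAtGrip, // second hand when it gripped
    const FVector& SecondaryHandLocNow,    // second hand, current frame
    float RotationScale)                   // e.g. 0.5f for half the rotation
{
    // Directions from the primary (holding) hand to the secondary hand,
    // at the moment of the grip and right now.
    const FVector InitialDir = (SecondaryHandLocAtGrip - PrimaryHandLoc).GetSafeNormal();
    const FVector CurrentDir = (SecondaryHandLocNow - PrimaryHandLoc).GetSafeNormal();

    // "Quaternion rotation from vector": the shortest rotation taking the
    // initial direction onto the current one.
    const FQuat FullDelta = FQuat::FindBetweenNormals(InitialDir, CurrentDir);

    // Scale the delta by slerping from identity toward it.
    return FQuat::Slerp(FQuat::Identity, FullDelta, RotationScale);
}
```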

Yep, I know about the 1 Euro Low Pass Filter option, but my problem is that I don’t want those settings on every grippable object, only on a specific object in a specific situation. And the filter settings aren’t changeable through a BP, are they?

I’ll try a hack with the AdditionTransform, compensating the rotation with the addition transform. I think it can work. The quaternion rotation from vector isn’t exposed to BP; does VRExpansion, GripInfo, or something else expose it?

And that’s all, I won’t disturb you any more. Thanks anyway, you rock!

Hi, is there a way to move a skeletal mesh actor at the nearest bone, or at the gripping location, without enabling physics movement? I’ve made a child Blueprint class from the GrippableSkeletalMeshActor class and added the “Interactible.PerBoneGripping” gameplay tag. Changing the grip type allows gripping at the bone location with physics movement, but grip types without physics movement grip at the mesh root, with an odd rotation. Thanks for your time!

Changing it to per bone gripping passes in the transform of the bone itself, which is why you are getting a different rotation with a non-simulating grip that doesn’t actually target the specific bone.
Also no, if the bone isn’t simulating then it retains its animation pose at all times… You can’t even move a bone, then un-simulate it, and have it hold its location.

You would want a poseable mesh for that, or to feed the altered position into the animation graph via the TransformBone node.

The easiest would be to use a poseable mesh with a custom grip type where you change the position via SetBoneTransformByName. The problem with poseable meshes is that they don’t support PhysX assets / collision, and you’d have to figure out how you want to define a grip and where to get the hovered bone from.
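A minimal sketch of that approach, assuming a custom grip where you already know which bone was grabbed and which controller is holding it (the class and member names below are hypothetical, as is wherever GrabbedBoneName comes from):

```cpp
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/PoseableMeshComponent.h"
#include "MotionControllerComponent.h"
#include "MyPoseableGrippable.generated.h"

// Drives a grabbed bone of a poseable mesh toward the gripping controller
// each tick. Poseable meshes let you set bone transforms directly, but note
// they don't use the physics asset for collision.
UCLASS()
class AMyPoseableGrippable : public AActor
{
    GENERATED_BODY()

public:
    // Filled in by whatever custom grip logic you run (e.g. on grip start,
    // from a hover/overlap query that found the nearest bone).
    UPROPERTY()
    UPoseableMeshComponent* PoseableMesh = nullptr;

    UPROPERTY()
    UMotionControllerComponent* GrippingController = nullptr;

    FName GrabbedBoneName = NAME_None;

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        if (PoseableMesh && GrippingController && GrabbedBoneName != NAME_None)
        {
            // Snap the grabbed bone to the controller's world transform.
            PoseableMesh->SetBoneTransformByName(
                GrabbedBoneName,
                GrippingController->GetComponentTransform(),
                EBoneSpaces::WorldSpace);
        }
    }
};
```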

I am not personally aware of any better solutions outside of a custom skeletal component where you change the behavior manually in c++.

Also, I have some commented-out code that originally only simulated the body being picked up; however, since that is pointless currently (skeletal meshes snap back into place), the current version simulates all bones on a simulated grip.

Edit: Just in case, since you worded your question a bit weirdly… if it’s the case that you just wanted the grip to be held at a specific point of the mesh and you aren’t trying to move a single bone, turn off per bone gripping. It will always move non-simulating grips from the root (it’s the only way), but it offsets them so they hold the same position relative to the controller.

How hard would it be to make a lever that is held using the grip button instead of the trigger? I’ve been poring over it for a while now and I can’t see how it would be done.

That was what I wanted, thanks!

In the template? Change the gameplay tag for GripType to “SideGrip” instead of “Trigger” in the object’s properties.

Ahh, I was trying to edit the steering wheel and that has none of the tags set, so I was a bit confused. Works great, thanks!!

Video going over the controller profiles and a recent addition to them (I am going to be making a few videos this weekend).

Hi, your plugin is the best thing since sliced bread! I’ve been an avid user on Vive for a long time. Truly awesome work, man.
Have you tried out the AR template on an iPad? It works surprisingly well.
I’m working mainly on AR projects now (there is big demand) and it would be out of this world if it were possible to use your plugin. Interaction would work with motion controllers connected to the iPad, like the FPS_Pawn, say…
What are the issues and difficulties that would be faced in getting the plugin working in Unreal AR? It would be a great opportunity to bring the plugin to the Apple and Google AR app world. I think it would be very popular, and I don’t think the work required to support it would be too great.
The network capabilities would be useful once shared global markers are supported in ARKit/Core in future.
What do you think? I’m prepared to do the work myself (probably badly) if you could give me some general advice, but you’re the guy for the job, man! Any questions, please ask.

Thanks muchly!

Is there a suggested method of turning off the movement mode switching and the text displayed on the motion controllers? I’ve been playing around with it for a few days and got everything working as I want, but after disabling various likely bits and pieces I still haven’t found where the text is actually drawn.

Also, I have deleted the red highlighted section that sets the motion controller skin in the character event graph, but my controllers are still being drawn instead of the animated hands. Is there something else I need to do?

I don’t intend to get involved in AR too much currently as I don’t have the hardware; it is also pretty close to its infancy in implementation right now.

That being said, assuming the motion tracked controllers follow the standard interface that UE4 uses (and any made for AR should), it should work seamlessly. Obviously the HMD part of the characters is no longer needed, but the motion controllers would be fine and could be used as is.

The red highlighted area is the only part that loads them; if they are still loading, then you likely have Epic’s new loading system in 4.19 turned on (the DisplayDeviceModel option on the motion controllers). As for movement, the GripButton events toggle it, so just delete the SwitchMovementModes function from them. As for the text on the controllers, the one spot it is used is a “WriteToLog” function from back before I had actual logs implemented.

Pushed a new commit to the repository to go along with the Controller Profiles video I uploaded last night.



Adding in bOffsetByControllerProfile to the motion controller.

If true, it will offset the controllers by the currently loaded profile's OffsetTransform.

Throws the event OnControllerProfileTransformChanged to the controllers
so that things like procedural meshes can be offset to account for it (proc
meshes load already aligned to the controller type, so they need to be adjusted).


This makes it far easier to add and use controller profiles; besides the procedural
mesh re-alignment (which most don't use anyway), it will be fairly seamless.

The example template has example profiles now and implementations of all of this.

(Also, thanks to SlimeQ for the idea; I had doubts due to having to think about what special cases would need to be handled, but it was fairly straightforward.)


The HMD for AR is the tracked tablet/phone position and orientation. Lock to HMD for the Pawn camera works, so I think the VR character should be usable. I’m going to enable the plugin on an iOS AR project and see what comes up. There is no SteamVR etc.; can I easily disable anything dependent on that?

Many Thanks

Thanks, it turns out I’m an idiot and was working on the simple base VR character instead of the Vive one.

The OpenVR stuff is a separate module that can be turned off, but it also compiles out any OpenVR-specific code if on a platform that doesn’t support OpenVR anyway.

Hi! I have a problem with objects that I grip… when I grip something and move my character, the object doesn’t stay steady in my hand but has a sort of “lag”… if I shake my hand quickly, the object flies away. How can I fix this?
Thank you for all the work on the plugin, it is awesome!