VR Expansion Plugin

Hi,
I am struggling with getting buttons to work with remote objects, in my example a door in another BP. I am trying with one of the buttons that is already in the demo scene, which has the VRButtonComponent already set up. Then I have a door BP to which I added the “Electrical Object” interface under “Class Settings” -> “Implemented Interfaces”. I also added the “Event Current State Change” to the door’s Event Graph. So as far as I understand, I should now be able to add the door object (placed in the level) to the “Electrical Targets” array of the Button object (also placed in the level):


However, the slot won’t accept the door object as an “Electrical Object” for some reason.

I also checked the example in the demo level where the Button object is connected to the MirrorActor object.

I can’t see the difference between that setup and the one I have with my door, since the only thing the MirrorActor does is implement the “Electrical Object” interface, but obviously I am missing something here. Strangely, if I add a new MirrorActor to the level (which should be the exact same thing as the one you already placed in the level), the newly placed MirrorActor will also NOT be accepted as an electrical object by the Button objects. So do you have any hints on why that would be, or even better, a short explanation of how to set this up correctly?
Cheers!

It’s working fine for me: adding the interface to the gun in the level and implementing the function, then adding a new element in the button’s array and selecting the gun.

Keep in mind that the whole electrical thing is just part of that one button’s blueprint; it’s not related to the plugin’s buttons. It’s a construct added to show something useful that you could do with a button.

That is really strange. I tried it and found that I can add the gun object “as is” to the button array even without adding the interface “Electrical Object” to the class settings of the GunBase BP, nor the event “EventCurrentStateChange”, nor the function “CurrentStateChange” to its event graph…

So let’s try with an empty virgin BP:

  1. Create BP of type “Actor” (BP_Test) and add the interface

  2. Add the event to the event graph

  3. Place BP_Test in the level and try to add it to the Button array –> Fail.

Would you try to reproduce that?

Edit: In the meantime I found that most of the actors that are already in the demo scene, e.g. the Barrel, the Door, the grippable cubes, … can be added to the button’s electrical objects array “as is”. No need to add the interface or function whatsoever. However, creating my own actor, putting it into the level and trying to add that to the array won’t work. What makes those actors already placed in the demo scene different from mine?

Well yeah… it’s just a TArray of object references that it tries to call the interface on. If they don’t implement the interface then it just won’t do anything.
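For reference, a rough C++ equivalent of what that demo button’s Blueprint does might look like the sketch below; the “ElectricalObject” interface and “CurrentStateChanged” names are stand-ins for the Blueprint interface in the demo map, not plugin API:

// Stand-in for the demo's "Electrical Object" Blueprint interface.
UINTERFACE(BlueprintType)
class UElectricalObject : public UInterface
{
    GENERATED_BODY()
};

class IElectricalObject
{
    GENERATED_BODY()
public:
    UFUNCTION(BlueprintImplementableEvent, Category = "Electrical")
    void CurrentStateChanged(bool bButtonState);
};

// In the button actor: a plain array of actor references with no class
// filter, which is why the editor will let you assign almost anything.
UPROPERTY(EditAnywhere, Category = "Electrical")
TArray<AActor*> ElectricalTargets;

void NotifyElectricalTargets(bool bButtonState)
{
    for (AActor* Target : ElectricalTargets)
    {
        // This is simply a no-op for any target that doesn't implement the interface.
        if (Target && Target->Implements<UElectricalObject>())
        {
            IElectricalObject::Execute_CurrentStateChanged(Target, bButtonState);
        }
    }
}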

Try compiling and saving your edited level; there is no difference, it is just the editor’s actor selection UI.

Thx! I saw that; however, since it didn’t work I thought you might have implemented something under the hood that would make sure only an object of a certain class which also has the interface implemented would be accepted into the array slots. I’m quite new to Unreal and just made the move from Unity, so excuse my lack of knowledge. That said, I found the problem. It didn’t work because I dropped the objects on a level layer different from the button objects’. That’s why the editor wouldn’t accept a reference to one of those newly created objects… Note to self: never forget to set your working level layers after reopening the editor. So thanks again and sorry for my stupidity. Cheers.

Maybe I got it
Added to the Throttle child class…

Hi - this is probably something simple, but I am stumped after a day of trying to work it out. I have created some interactive buttons using the plugin that are working great; however, they only seem to take collision and movement interaction from the main (palm section) of the hand mesh and the grip collision sphere attached to the hand mesh. I cannot push a button with the fingers themselves - so if you index-finger push a button, the finger collides and is movement-blocked by the button, but it does not move the button itself. I have tried adjusting collision settings on the hand mesh and also tried creating a small collision sphere attached to the hand mesh on the Vive pawn character blueprint - but these seem to disappear/have no effect when playing the game.

Can someone explain how I can get some sort of collision on the fingertips/fingers of the hand between hand and button? Ideally in fairly beginner terms. Thanks in advance!

There are two options for that; since I don’t know what everyone will want, I left it open-ended.

  1. You can turn on bSkipOverlapFiltering in the properties which will make it just use collision settings for pressing.

  2. Or you can override the IsValidOverlap function and filter however you want, replacing the default functionality for things attached to the controllers or held by them (there is a sketch of such an override after the declarations below).

Default functionality is there to prevent body parts from just pressing buttons all over, generally, but obviously, in cases like yours you would want to change the defaults.
Both options can be done in C++ or BP.


// Skips filtering overlaps on the button and lets you manage it yourself; this is the alternative to overriding IsValidOverlap
UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "VRButtonComponent")
bool bSkipOverlapFiltering;

UFUNCTION(BlueprintNativeEvent, BlueprintCallable, Category = "VRButtonComponent")
bool IsValidOverlap(UPrimitiveComponent * OverlapComponent);
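As an illustration of the second option, a minimal sketch of such an override might look like this; the “FingerTip” component tag is just an example of one way to mark your fingertip colliders:

// Sketch: a button subclass that only accepts overlaps from components
// tagged "FingerTip" (the tag name is an assumption, use your own scheme).
UCLASS()
class UMyFingerButtonComponent : public UVRButtonComponent
{
    GENERATED_BODY()
public:
    virtual bool IsValidOverlap_Implementation(UPrimitiveComponent* OverlapComponent) override
    {
        // Replaces the default attachment/held-object filtering entirely.
        return OverlapComponent != nullptr && OverlapComponent->ComponentHasTag(FName("FingerTip"));
    }
};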

Hi.

I want to make a more advanced version of the VR physics body. The current plugin supports body physics and gravity, but I would like to add hands to it. For example, I want to make physics-based climbing possible with just the hands, but I am having difficulty developing it.

There is a video that shows the ideal result, I think. Can I get some advice?

[quote=“CokeKuma, post:4096, topic:68709”]
I want to make a more advanced version of the VR physics body. The current plugin supports body physics and gravity, but I would like to add hands to it. For example, I want to make physics-based climbing possible with just the hands, but I am having difficulty developing it. There is a video that shows the ideal result, I think. Can I get some advice? Physic Movement | VR Game Development | Into the Darkness | Unreal Engine 4 - YouTube
[/quote]

In tests, those guys were hiding invisible colliding bodies that are constrained to the controllers, then offsetting the pawn based on how far away the body is from its target location on collision. That is probably the easiest method to conceptualize for going about doing it; then you can move on to more stable iterations.

It’s not a difficult concept to achieve if you think about it. You can do it with locking on to surfaces and reading positional differences as well (their hands lock on to the top of the surface when starting to climb, then they read the positional difference in that video and offset the player; the arms are not physics driven), or with fully simulated arms and reading wrist-to-controller differences, or, for that matter, from held objects being stuck on things and reading where the hand is versus where it should be.

If you really wanted to get involved you could also base the player on an actually simulating body that is forced upright and use constraints from the hands to move it around directly when against objects, somewhat like Lone Echo.
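A minimal sketch of that first approach, with placeholder names, could be as simple as comparing the simulated hand body against the controller target each tick and shifting the pawn by the difference:

// Sketch of the "offset the pawn by the hand body's error" idea.
// SimulatedHandBody is an invisible physics-simulated component constrained
// to the motion controller; all names here are placeholders.
void AMyVRPawn::TickClimbOffset()
{
    const FVector TargetLocation = MotionController->GetComponentLocation();
    const FVector BodyLocation = SimulatedHandBody->GetComponentLocation();

    // When the hand body is blocked by geometry it lags behind the target,
    // so pulling down against a ledge pushes the pawn up.
    const FVector Error = BodyLocation - TargetLocation;
    if (!Error.IsNearlyZero(1.0f))
    {
        AddActorWorldOffset(Error, /*bSweep=*/ true);
    }
}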

Also, the plugin doesn’t support “physics body / gravity”; it’s a character controller, and all interactions are faked. It’s not actually simulating on the physics scene, just being moved around in it as a kinematic actor.

Hi, thanks for that - so I turned on the bSkipOverlapFiltering option in my buttons and I get some strange behaviour that I cannot work out. If I start my game and try to push a button with my fingertip, the buttons now move but freeze at what (I think) is the depress distance and will not move/interact again. However, if I start a game and hit the buttons with the Grab Sphere part of the hand mesh first, the fingertip collision and buttons work perfectly/repeatedly afterwards?

Ideally I only want the fingertips to be able to interact with any of my buttons, but that is currently beyond my Blueprint knowledge! (still learning)

EDIT: Wouldn’t you just know it - 2 minutes after posting (after a day of trying) I worked it out - I had collision events set to on for a button shroud I had underneath the pushable button object, which was interfering with the collision. Turned off collision and it works fine now. Will leave the comment in case it helps someone else! - Buttons not working as expected - check your collisions!

Sorry if this has been asked before, but does the OpenVR expansion plugin support replicating hand tracking over a network?
Either with the UE Oculus branch or the plugin from fsheffer?

It does so with the OpenVR Index tracking; currently there is no extension to that for Oculus, as I’m waiting to see if they adopt the OpenXR Microsoft hand skeleton extension, which would be ideal as all offerings would be rolled into one API then.

Hey!

I’m looking to add some grip-strength interactions to some objects, for anyone that has a controller capable of reading such values. I have no issues getting the values; my question is: what would be the best way to read the values in the object? My first inclination was to add a float variable to GripMotionControllerComponent.h and update it whenever I read an axis value - this works perfectly, and I’d imagine it would be perfectly fine to read in TickGrip, but my only issue with using TickGrip is that in order to do so it seems I’d need to always use custom grip collision types, and thus lose the lovely base interactions. This is more of a minor inconvenience than anything, but an inconvenience nonetheless.

Since I can’t rely on users having a grip axis, it’s not going to be anything gameplay-affecting, just nice little easter eggs / alternate input for world interactions - so I imagine it doesn’t need to be replicated (as long as the resulting actions, if needed, are). I’d love a flag of some kind in the grip interface, default false, that lets me signify that it should always tick the grip - or potentially an event that can be triggered against interface objects, like an FVROnUpdateGripStrength_OneParam(float, GripStrength). Of course, if there’s an effective way to do this now, please let me know!

Thanks a ton!

Thx

Your plugin saved me a lot of struggle while setting up networking for Oculus Quest with replicated HMDs.

Right now I am replicating controllers and I would like to replace them with hands, but I am a bit lost.

At least for me as a novice Unreal user, it’s pretty hard to navigate the version/feature/plugin/SDK maze - and to make the basics work.

I was hoping you (or someone else) could point me in the right direction for the least painful way to set this up.

I upgraded to UE 4.25 because I thought it would generally be better for hand tracking… But there it starts with choosing between the Oculus branch (with native support for hand tracking) or the official release + the plugin from fsheffer.

And on top of that, both 4.25 (official and Oculus branch) lead to an immediate crash when I launch the app on the Oculus Quest… (Still trying to figure out why…)

If I understand your answer correctly, it works with your plugin, but right now I am not sure what else is needed. And would it also be possible to stay on 4.24, which for me works more stably at the moment?

Thanks

The object can poll the controller itself from a tick or timer, or you could push updates to the held object from the controller; either works, no need for CustomTick at all.
If you place it in your character, your animation blueprint can sample the holding character and use it, or whatever.
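A minimal sketch of that polling approach, assuming the float you added lives behind something like a GetGripStrength() accessor (your own addition, not a stock plugin function):

// Sketch: the held actor polls the gripping controller in its own Tick.
// HoldingController is cached in the grip event and cleared on release;
// GetGripStrength() stands in for your custom float accessor.
void AMyGrippableActor::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    if (HoldingController != nullptr)
    {
        const float GripStrength = HoldingController->GetGripStrength();
        OnGripStrengthChanged(GripStrength); // e.g. drive an animation or easter egg
    }
}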

Hah, here I was trying to find something complicated, and the simple solution was the best one. I’ll enable tick on the object when it gets held, then disable it when it’s dropped, and just read in the regular old tick. Thank you!

The answer was very helpful. Some of the features shown in the video have been reproduced.
But one more feature seems to be needed: can gravity simply be added to the climbing mode provided by the plugin?

Add a constant downward force; that is all that gravity is. Your issue is that you would have to account for it to avoid wobbling.

I’ll also note that in the very latest patch, ALL movement modes accept the custom input, including falling.
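A rough sketch of faking gravity while climbing, then: accumulate a fall velocity and feed it in as the custom movement input each tick (verify the exact custom-input function name in your plugin version; the state flags here are placeholders):

// Sketch: add gravity as custom input while climbing and not holding on.
void AMyVRCharacter::TickClimbGravity(float DeltaTime)
{
    if (bIsClimbing && !bIsHoldingSurface) // placeholder state flags
    {
        ClimbFallVelocity.Z += GetWorld()->GetGravityZ() * DeltaTime;
        VRMovementReference->AddCustomReplicatedMovement(ClimbFallVelocity * DeltaTime);
    }
    else
    {
        ClimbFallVelocity = FVector::ZeroVector; // reset while gripping a surface
    }
}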

Not sure what’s wrong here: I set up a VR pawn with replicated motion controllers, two Quest controllers, and the Oculus HMD represented by the UE Matinee camera.

  1. I added a static mesh as a child of the replicated camera component and copy/pasted the Matinee camera from the camera component.
  2. When I change the static mesh to another mesh, ALL visual representations of my motion controllers disappear in my own VR view and also for the other player in the network.

Am I doing something totally wrong which breaks everything, or is this a bug?

Instead of the static mesh component I also tried an “optional rep static mesh”, but that didn’t help…