VR Expansion Plugin

I answered your direct visitor message about this already if you want to go read the response.

Technically the sessions stuff is totally separate from the VR Plugin, which is why there is no mention of it in the plugin documentation. I just package my Advanced Sessions plugin in with the template to handle the Steam sessions testing in the example project.

Ah yep. I knew the sessions stuff was handled by your other plugin, forgot that the example project is really separate from the plugin though.

Pushed a fix for throwing with SetAndGripAt center of mass setting with physics grips (the default one)

Patch Notes Here: 4.22 Patch Notes – VR Expansion Plugin

Hey man!
Would you know how to do an Echo Arena grip style on a disc with your grip scripts?

Looked through the docs, the install, etc. Added in a VRPlayer (tried Base, Character, Simple) and nothing happens. No controls, no tracked controllers, no movement, no nothing. 4.22, using the 4.22 binaries, compiled in VS just fine, auto-possessing player 0, getting nothing.

With grip scripts? No? That is an animation procedure. You could make a grip script that traces the held object and passes the information out to an animation instance and that would work, but you won’t be directly posing bones in a grip script using a standard skeletal mesh.

It also wouldn’t be best in a grip script in the first place, you can bind to OnGrippedObject on the controller itself and handle logic based on the position of the hand relative to the object if you want a fully dynamic grip. If you HAD to have it in a grip script I suppose that you could make an “animation Pose” script, get the list of scripts OnGrip on the controller side and find the first animation Pose one, then retrieve the animation or data from it.

You could also run default poses and blend between them and handle it with an interface or any other method.

Is it possible to broadcast a video stream from the CameraRecorder blueprint to Twitch?

like record your screen? yes?

No, in the VR game my character will have a camera on his arm. From this camera I want to broadcast video to Twitch when I choose to, and stop broadcasting accordingly.

Then you would have to hook up with an OBS plugin or a twitch streaming plugin or hotkeys on OBS to manage it…

Not entirely sure what you want from me here, I don’t think you understand what you are requesting.

Sorry, probably I don’t understand what I am requesting. But I need what I said: the player in VR must be able to use a camera in his hand to broadcast the virtual world to Twitch. If you can suggest a way to do it, it would be very helpful. If your previous post is a solution, please give me some links to read.

By the Echo Arena grip style I just mean how the disc snaps to the location where you grab it rather than to a set point, yet still keeps its rotation correct, like what is shown in the attached clip.

Yes, in the GetClosestSocketInRange function, pass out a transform that is based on the hand’s current transform relative to the object, but is centered on the object.

Where might I find “GetClosestSocketInRange”?

In the grippable object, it is a GripInterface function that you can override and pass out anything from. It lets you override the default behavior of finding the closest correct prefix socket and instead pass out any transform that you want.

Would you be talking of “Closest Grip Slot in Range” or am I still looking in the wrong place?

Yeah that one, you can return “true” for found one, and any transform that you like (in world space). You can override it per object to return different results (like component locations instead of sockets, etc.).

In this case it would be a projected location to your object based on the hand position relative to it.
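To illustrate the projection idea described above, here is a minimal, standalone C++ sketch (not the plugin’s actual API; the actual override would happen in the Closest Grip Slot in Range interface function). It assumes the disc lies in its local XY plane, and uses a small `Vec3` struct as a stand-in for UE4’s `FVector`:

```cpp
#include <cassert>
#include <cmath>

// Minimal 3D vector to keep the sketch self-contained (stand-in for FVector).
struct Vec3 {
    float X, Y, Z;
};

static Vec3 Sub(const Vec3& A, const Vec3& B) { return {A.X - B.X, A.Y - B.Y, A.Z - B.Z}; }
static float Length(const Vec3& V) { return std::sqrt(V.X * V.X + V.Y * V.Y + V.Z * V.Z); }

// Project the hand position onto the disc's plane (assumed here to be the
// plane through the object's center at constant Z), then clamp it to the
// disc radius. The returned point is the grip location: it follows the hand
// around the disc instead of snapping to a fixed socket, while the object
// keeps its own rotation.
Vec3 ProjectGripPoint(const Vec3& ObjectCenter, const Vec3& HandLocation, float DiscRadius)
{
    // Drop the hand onto the disc plane by discarding the height offset.
    Vec3 OnPlane = {HandLocation.X, HandLocation.Y, ObjectCenter.Z};

    Vec3 Offset = Sub(OnPlane, ObjectCenter);
    float Dist = Length(Offset);
    if (Dist <= DiscRadius || Dist == 0.0f)
        return OnPlane; // Hand is already over the disc; grip right there.

    // Otherwise clamp the point to the rim along the same direction.
    float Scale = DiscRadius / Dist;
    return {ObjectCenter.X + Offset.X * Scale,
            ObjectCenter.Y + Offset.Y * Scale,
            ObjectCenter.Z};
}
```

In the actual override you would build the returned world-space transform from this projected location plus the object’s current rotation, which is what produces the “snaps position, keeps rotation” feel.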

You could also do all of this in the character itself, but it sounds like it is a one off thing?

First of all, amazing, amazing work you’re doing. Like you’ve mentioned in your videos, I am having a tougher time getting started than I did with the SteamVR plugin for Unity, but I know it will be worth it once I get through this. This is being touched on right now so I wanted to chime in to also mention that I would love to implement something similar to the skeleton poser in the SteamVR plugin on the Unity side, as I find it a major immersion booster and a lot of fun to do.

So this could be a combination of looking at the current transform of the object you’re trying to grab and passing that on to the hand socket to make it transition more smoothly? In the Echo Arena video it seems like it’s snapping to the hand but maintaining its own rotation on the horizontal axis while taking over some rotation from the hand. And then using an animated pose to match the transform of the object? I’m not exactly sure how I’ll make those two play nice together, but I’m just trying to understand the basic theory behind implementing this.

I’ll be donating to the Patreon as soon as I have a little spare cash, as I really think it’s amazing what you’ve done over these last few years.

The SteamVR plugin over at unity has a bunch of default poses that it blends between on the animation side. IE: a default hand pose for gripping each object, then it blends in the current finger Curl values of the Index controllers with an open hand pose. When no object is held I believe that it just uses the hard finger skeletal transforms instead.

Their poser doesn’t “live solve” geometry or deal with any positional poses (as far as I know it’s all snap grip based), it just blends pose states. Though I do think they have a manual poser in it so you can create poses in engine and save them out for use.

In UE4 that is fairly simple to achieve, you can pass out a “grip pose” animation from the object and apply it in the AnimBP. The reason I don’t have an example of this in the template is because I honestly just don’t have any hand animations to use for it. I was considering adding an interface query to get the suggested pose from the gripped object, but it’s something I have mostly felt was better handled on the user’s end, as they may end up wanting it very complex and any default implementation may not be enough information.

Letting the end user poll for the requested animation pose themselves with a custom interface is likely better in the end.
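The “poll the object through a custom interface” pattern might look roughly like the following. This is a plain C++ sketch of the idea only; in the engine you would use a Blueprint-implementable interface instead, and every name here (`IGripPoseProvider`, `GetSuggestedGripPose`, the pose strings) is hypothetical:

```cpp
#include <cassert>
#include <string>

// Stand-in for a custom interface the character could query on grip.
struct IGripPoseProvider
{
    virtual ~IGripPoseProvider() = default;
    // Return the name of the hand animation this object wants,
    // given which hand grabbed it.
    virtual std::string GetSuggestedGripPose(bool bLeftHand) const = 0;
};

// Example grippable: a disc that uses one palm-grip pose for either hand.
struct DiscProp : IGripPoseProvider
{
    std::string GetSuggestedGripPose(bool /*bLeftHand*/) const override
    {
        return "Pose_DiscPalmGrip";
    }
};

// On grip, the character polls the object and feeds the result to its
// AnimBP; objects that don't implement the interface fall back to a default.
std::string QueryPoseOnGrip(const IGripPoseProvider* Gripped, bool bLeftHand)
{
    return Gripped ? Gripped->GetSuggestedGripPose(bLeftHand) : "Pose_OpenHand";
}
```

Keeping the interface on the user side like this means each project can return whatever it needs (a pose name, an asset reference, blend weights) without the plugin dictating a format.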

Yuri I believe has a hand posing plugin available on the marketplace (VR Hands Solver), I haven’t tried it myself so I won’t vouch for it but you can look at it and see if it meets your needs for that part as well. It uses a custom skeletal mesh though which means it will be a bit incompatible with some things.