RC is “remote control”, a basic object that can be attached to and detached from areas.
As for the trace channels, my box collisions have none of those options in the template. Do I need to enable something to view them?
(Also, neat to see you over here. I remember your KSP cinematics, they were awesome. You did my B9 aerospace rocket as a picture, and I still have some of em printed out around here…)
Hmm, are you using the template project as a base for yours, or just the plugin in your own project? If it's the latter, you probably need to set the channels up yourself since they aren't showing up.
You can find and add custom collision channels and responses under "Project Settings -> Engine -> Collision".
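For reference, channels added there end up in the project's DefaultEngine.ini; a minimal example of what such an entry looks like (the channel name here is just a placeholder, not necessarily what the template uses):

```
[/Script/Engine.CollisionProfile]
; ECC_GameTraceChannel1 is the first free custom channel slot; bTraceType=True marks it as a trace channel
+DefaultChannelResponses=(Channel=ECC_GameTraceChannel1,DefaultResponse=ECR_Block,bTraceType=True,bStaticObject=False,Name="VRTraceChannel")
```

Comparing that section of your ini against the template project's is a quick way to spot which channels are missing.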
Thanks, surprising to run into you again. That was what, 6 or 7 years ago?
It’s the template project, but I’m pretty sure I somehow screwed up the project settings; a while ago I lost the ability to grab objects in non-VR mode and had to re-import the inputs. I’ll go look at my backup. EDIT: Looked at the backup, and yup, it shows up properly there. Not sure why.
One thing I’m not quite sure of is how to set up the custom tracing, though.
Rebuilding the SLN doesn’t do anything; it’s an engine issue, so unless you are on a source copy of the engine you need to get a version of the engine with the fix (i.e. 4.21+).
If you ARE on a source version then you can pull that fix commit and patch your 4.20 version.
List of recent bug fixes / changes, live on the Master branch of the repository:
Now caching the secondary attachment pointer; this lets me make sure that the server gets a valid pointer on release of secondary grips, and also lets me throw drop events when the secondary attachment changes but it is still a secondary.
Changed to sending the full transform on drop with client auth grips instead of just position / rotation.
Fixed a Z axis bug introduced with the lever angle re-write
Moved EVRInteractibleAxis to the VRInteractibleFunctionLibrary header; it belongs there instead.
Switched levers over to use the Interactible library functions
Also made the dial use the same basic logic as the lever for rotation, with an option to revert back to direct hand sampling.
Added new grip type "EventsOnly"; it will skip ALL extra grip logic and only store the grip data and fire the events for it. Whether you do anything else with it or not is up to you. It is like a CustomGrip except even more barebones, as it does not fire TickGrip or obey any of the drop simulation logic; it is also automatically excluded from late updates.
Stopped calling "StopSimulating" on Custom Gripped objects automatically. It is legal to start and keep a custom grip simulating; that call should never have been there. Anyone who needs their custom grips to automatically stop simulating on grip will need to call it in the OnGrip event themselves.
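For anyone relying on the old behaviour, here is a minimal sketch of restoring it yourself, assuming you are overriding the grip interface's OnGrip event in a C++ subclass of one of the grippable components (the class name is made up and the exact override point may differ in your setup):

```cpp
// MyGrippableMesh.cpp - hypothetical subclass of a grippable mesh component.
// Re-adds the old "stop simulating when gripped" behaviour locally.
void UMyGrippableMesh::OnGrip_Implementation(UGripMotionControllerComponent* GrippingController,
                                             const FBPActorGripInformation& GripInformation)
{
    Super::OnGrip_Implementation(GrippingController, GripInformation);

    // Only matters if the object was simulating physics before the custom grip.
    if (IsSimulatingPhysics())
    {
        SetSimulatePhysics(false);
    }
}
```

The same thing can be done in Blueprint by calling Set Simulate Physics (false) off of the OnGrip event.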
How should I go about implementing the OpenVR skybox override for a loading screen? I’ve tried using your blueprint nodes but it’s still giving me the default SteamVR skybox between levels. Are there any further instructions for setting it up?
I thought to ask here, since all the VR junkies are in this thread ;).
I’m having some trouble creating a smooth elevator in VR. Basically, the environment ‘shakes’ a lot for the user as soon as the elevator is moving. I think it is caused by the elevator floor moving and the VR character’s ‘floor offset’ not updating every frame (or being delayed by the NavMesh updating?).
Does anyone have ideas on how to improve this? Should I adjust the actor location together with the elevator maybe?
You may want to move the elevator before the character moves; it’s likely that they are in the same tick group and the ordering is random, so sometimes it moves before the character and sometimes it moves after the character.
Adding a tick pre-req from the player to the elevator should work… though normally a dynamic base like that should already be doing that.
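For reference, a minimal sketch of that tick pre-req setup in C++ (the class and function names are placeholders; AddTickPrerequisiteActor is also callable from Blueprint):

```cpp
// Called when the character steps onto (or is based on) the elevator.
// Makes the elevator tick before the character every frame, so the base
// has already moved by the time the character movement runs.
void AMyVRCharacter::RegisterElevatorBase(AActor* ElevatorActor)
{
    if (ElevatorActor)
    {
        AddTickPrerequisiteActor(ElevatorActor);
    }
}

// Called when the character leaves the elevator again.
void AMyVRCharacter::UnregisterElevatorBase(AActor* ElevatorActor)
{
    if (ElevatorActor)
    {
        RemoveTickPrerequisiteActor(ElevatorActor);
    }
}
```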
I put it inside of Epic’s MotionController_BP actor since I had moved the controllers out of there. I didn’t want to put all of those components into the primary character blueprint.
I didn’t manage to get an improvement out of adding the pre-req… But eventually I did find the cause: I was using ‘MoveComponentTo’ (and moving the root component so that the rest would follow). It turns out that doesn’t play nicely. Using a TimelineComponent and a Lerp into SetActorLocation solved my issue.
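For reference, a rough C++ equivalent of that Timeline + Lerp + SetActorLocation setup (class, property, and function names are made up for the sketch; the float curve is assumed to run from 0 to 1 over the ride duration):

```cpp
// MyElevator.h (sketch)
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/TimelineComponent.h"
#include "MyElevator.generated.h"

UCLASS()
class AMyElevator : public AActor
{
    GENERATED_BODY()

public:
    AMyElevator() { PrimaryActorTick.bCanEverTick = true; }

    // 0 -> 1 float curve assigned in the editor; shapes the ride easing.
    UPROPERTY(EditAnywhere, Category = "Elevator")
    UCurveFloat* MoveCurve = nullptr;

    // Where the elevator should end up, in world space.
    UPROPERTY(EditAnywhere, Category = "Elevator")
    FVector TargetLocation = FVector::ZeroVector;

    void StartMove();

protected:
    virtual void BeginPlay() override;
    virtual void Tick(float DeltaTime) override;

    UFUNCTION()
    void HandleMoveProgress(float Alpha);

private:
    FTimeline MoveTimeline;
    FVector StartLocation = FVector::ZeroVector;
};

// MyElevator.cpp (sketch)
void AMyElevator::BeginPlay()
{
    Super::BeginPlay();
    if (MoveCurve)
    {
        FOnTimelineFloat Progress;
        Progress.BindUFunction(this, FName("HandleMoveProgress"));
        MoveTimeline.AddInterpFloat(MoveCurve, Progress);
    }
}

void AMyElevator::Tick(float DeltaTime)
{
    Super::Tick(DeltaTime);
    MoveTimeline.TickTimeline(DeltaTime);
}

void AMyElevator::StartMove()
{
    StartLocation = GetActorLocation();
    MoveTimeline.PlayFromStart();
}

void AMyElevator::HandleMoveProgress(float Alpha)
{
    // Drive the location explicitly every frame instead of relying on MoveComponentTo.
    SetActorLocation(FMath::Lerp(StartLocation, TargetLocation, Alpha));
}
```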
So, another day, another question: Using IK on a standard SkeletalMesh, I can grip the different parts of the mesh and move them around. When using the GrippableSkeletonMesh instead (to get better control over the gripping logic), I don’t manage to grab anything but the root bone of the skeleton. Simulation works fine. Is this maybe a bug in the plugin? Or am I missing a setting here?
In the template there is a “PerBoneGripping” gameplay tag that switches to using the traced bone name for the object. It is an option because normally per-bone gripping is undesired. It requires a simulating skeletal mesh though (ragdoll or bone chain).
If you are manually gripping then passing in the bone name / transform of the bone to the grip command is the same thing.
Be warned that overlaps cannot return a bone name, so either you would have to iterate over the closest physics bodies from an overlap or not use them.
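For the manual route, here is a small sketch of pulling the bone info out of a trace hit before gripping (channel, variables, and surrounding context are placeholders; it assumes the skeletal mesh is simulating so the hit fills in a bone name):

```cpp
// Sketch: find which bone of a simulating skeletal mesh the hand is aimed at.
FHitResult Hit;
const FVector TraceStart = HandTransform.GetLocation();
const FVector TraceEnd = TraceStart + HandTransform.GetRotation().GetForwardVector() * 25.0f;

FCollisionQueryParams Params(FName(TEXT("BoneGripTrace")), /*bTraceComplex=*/ false, OwningCharacter);

if (GetWorld()->LineTraceSingleByChannel(Hit, TraceStart, TraceEnd, ECC_PhysicsBody, Params))
{
    const FName GripBoneName = Hit.BoneName; // only filled in for physics-simulating bodies
    if (USkeletalMeshComponent* SkelMesh = Cast<USkeletalMeshComponent>(Hit.GetComponent()))
    {
        // World-space transform of that bone, to hand to the grip call alongside the bone name.
        const FTransform BoneTransform = SkelMesh->GetSocketTransform(GripBoneName);
        // ... pass SkelMesh, GripBoneName, and BoneTransform into your grip command here.
    }
}
```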
If you are talking about posing a skeletal mesh though (as you seem to imply from IKing to match the hand), then you are out of luck here: that logic HAS to be custom, as you cannot simulate and then unsimulate a bone chain and have it retain position without snapshotting the pose in the AnimBP (the pose returns to the animgraph one on unsimulation).
You can run that manually with a custom grip type and performing your logic in the TickGrip function.