Not really; simple pickups can be done well in five minutes or so. I just get bothered by the “simple” part of that: I want interactive held items that feel attached to you, and that is a tougher cookie in UE4.
For basic pickups you have 4 easy to attain results:
Attaching without collision
Physics handle and accept the lag
Physics constraint and accept the lag
Forcing the object to follow the hand and accept that it will wobble.
For many, any one of those is probably fine; AltspaceVR uses the 2nd or 3rd for its Vive tavern room and people seem happy enough with it. With the first you could treat it like the melee combat systems others have made and run traces every frame along the length of the object to get collision, and that would work somewhat too.
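To illustrate the trace idea on approach 1, here’s a minimal sketch (plain C++, not UE4 API; the struct, function name and sample count are all mine) of generating evenly spaced points along the held object each frame, which you would then sweep/trace between pairwise:

```cpp
#include <vector>

struct Vec3 { double x, y, z; };

// Sample n points along the held object, from its grip point to its tip.
// Each consecutive pair of points would become one trace segment per frame.
std::vector<Vec3> SampleTracePoints(const Vec3& base, const Vec3& tip, int n) {
    std::vector<Vec3> pts;
    for (int i = 0; i < n; ++i) {
        double t = (n > 1) ? static_cast<double>(i) / (n - 1) : 0.0;
        pts.push_back({ base.x + (tip.x - base.x) * t,
                        base.y + (tip.y - base.y) * t,
                        base.z + (tip.z - base.z) * t });
    }
    return pts;
}
```

More samples catch fast swings better at the cost of more traces per frame.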
I tested many platform solutions and here’s what works:
1) For Matinee: use the same Matinee to move both the platform and the Vive_pawn. Uncheck dynamic shadows.
2) For a simple moving platform: uncheck dynamic shadows.
3) Control the platform directly with the PlayerController.
4) Control the platform via an AIController.
Damnation. I should have thought to remove dynamic shadows before, my bad.
All other means of platform transportation induce frame-skip problems. I’ll put an example of each in 1.7.
@AlexKlwn If you’re using a pawn, everything should be at 0,0,0 in your pawn:
In the pawn, moving/rotating the Camera has no effect; only CameraRoot will change the POV. If everything is at 0,0,0 and the camera is set to Lock to HMD, the engine should automatically pick up the location of the HMD and use it as the camera.
If you’re using a character, I personally found that it works if you put the capsule half-height and radius at 1 and everything else at 0. I don’t know if I’m wrong, but this way both pawn and character end up at the correct height.
@AlexKlwn The pawn is already in the templates found in the first post. Better to use the last one, although forget about the grab function. The next update will include working platforms, a free-moving pawn and (possibly) some grabbing solutions. Again, the goal of this open-source project is to provide as many well-working examples as possible, and therefore give devs the ability to rapidly test and build their pawn to drop into their own experience.
Only had a little time to work on grabbing more today; ended up being sidetracked playing with IK and some added attachment points on your controllers. It’s really hackish, and I’m not sure it could be incorporated in a sensible manner, but it is a really freaky feeling.
Strange thing happening here. I use some sprites in my project. They get rendered in my left eye, but not in the right eye. Any thoughts on this? Everything else renders in the right eye, just not the sprites.
Hey Proteus, great template by the way, has been extremely useful.
Slight issue with static meshes attached to the VR controller and world (Ultra) scaling, wondered if anyone might be able to help.
If I attach a static mesh to one of the VR controllers in the Vive Pawn, when I activate ‘Ultra Mode’ the meshes that are parents of other sub-meshes do not seem to scale correctly. I have other static meshes attached that do scale correctly; it seems to be something to do with a mesh that is parented. Can you think of any reason why, or a solution? I am guessing this might be a bug.
@dreetjem I have 0 experience with sprites, but from past experience I can tell you this eye discrepancy has been encountered before with the DK2. As an example, with the “Paris Demo DK2” the AO on the ceiling was showing in one eye and not the other. Note that this demo was not built for VR but instead was “hacked” to work. I can also tell you that particles sometimes (often) show in one eye and not the other (this was the case with the DK2, and still is on the Vive). I don’t know about sprites, but it’s maybe the same thing. I would qualify it as a bug unless someone very knowledgeable about stereo rendering shows up. Open a ticket, or am I misleading you? @Fnordcorps I’ll try that tomorrow.
Been playing around with it a bit so far, was hoping to get some more stuff put together but got some other projects on this week.
I think I’m going to do a different version that’s more event-based for different item pickups, so I’ll try to do that and merge it back in if it works alright, but I may need to fork it and do some more significant changes. I intend to hook up pickupable items with a data table to store some information about them and cast that back into the player controller. I quickly put together a little lightsaber demo and a bit of a bow-and-arrow one, and will merge them in once I get those other changes in there too.
@Fnordcorps yes, I think I only scale the Vive controller static meshes and the world-to-meters parameter when I send to Proteus. So anything parented under those 2 controllers will get scaled properly.
If you need something to scale properly, you need to either parent it under the controller static mesh, or wire the scale into the Ultra mode timeline’s update node; shouldn’t be too hard.
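As a rough sketch of that wiring (all names are mine, and I’m assuming Ultra mode simply multiplies the world-to-meters value by a constant factor), the idea is just to apply the same factor to the mesh’s relative scale whenever the timeline updates:

```cpp
// In UE4 the default is 100 units per meter. This sketch assumes "Ultra
// mode" multiplies WorldToMeters by some factor; any mesh NOT parented
// under the controller then needs its own relative scale multiplied by
// the same factor to stay in sync.
struct ScaleState {
    double WorldToMeters; // e.g. 100.0 at normal size
    double MeshScale;     // relative scale of an attached mesh
};

ScaleState ApplyUltraMode(ScaleState s, double factor) {
    s.WorldToMeters *= factor; // scales the player relative to the world
    s.MeshScale     *= factor; // keeps the attached mesh matched
    return s;
}
```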
Ok got it set up so that gripped items will track with the controller, collide/trigger on hit as is normal for Move() commands, and will get the late update on position in the rendering thread so that they match the controller without wobble. I need to get some things set up for offset rotations and positions and then I guess I can package the custom MotionController.
Sorry guys, after much frustration trying to stream properly, it seems that I need to get a small hub to set everything up properly. A USB Wi-Fi adapter just won’t cut it (not while my USB capture card is also working, that is; not enough bandwidth).
On the other hand, I might want to save my pawn to a different blueprint or even a folder, because my wiring style is pretty different compared to Proteus’s, so it feels weird that after each update I have to go through a lot of rewiring and then it’s time for the next version update. I will still update from time to time, so Proteus, feel free to integrate, and I might just copy something I need from your template (like the scalability settings) and see how that goes.
I probably won’t be able to make it to this week’s update (because of the Rocket League free weekend; I’ll go smurf and have lots of fun, so not really any dev going on this weekend). However, I think disabling dynamic shadows is not a good fix for platforms, and platform animation should have linear curves (so you don’t have the speed-up/slow-down that makes people sick). I will see what I can contribute here as well.
And I need to learn how to work with Git better; for the life of me I can’t figure out a way to update my forked branch to Proteus’s latest using the web interface. That’s my status update for this weekend. Now, before sleep, I have to do some relaxing Tilt Brush to release my stress from dealing with streaming failures.
Hi, all my static meshes are attached to the Vive controllers. The problem is that attached meshes that have a child are not scaling; meshes with no child scale with no problem. I tried hooking up the non-scaling meshes to the same nodes in the blueprint as the controllers, but they do not scale correctly.
On another note, does anyone else find there is a sort of tiny ‘micro-judder’ on the controllers’ tracking and with meshes/text attached to the controllers? Very odd, but some (though not all) of my static meshes and text renders attached to the controller have a very small amount of lag, as in they don’t seem like they are quite attached to the controllers. This makes text slightly blurry when moving the controller, etc. The text renders with no lag are crystal clear. If I bring up the Vive menu, the tracking on the controllers is perfect. This happens on the basic/blank 1.5 template for me.
Well, it defaults to enabled, but you might want to check that “vr.EnableMotionControllerLateUpdate” is set to 1. If you somehow turned that off, it won’t be performing the late update on the rendering thread.
I get these errors when creating a new project with the template 1.6:
Info Failed to load /Game/Vive/Controller/Vive_Control_Skeleton_AnimBlueprint.Vive_Control_Skeleton_AnimBlueprint_C Referenced by Vive_Control_L_GEN_VARIABLE
Error /Game/Vive/Controller/Controller_Vive_MAT : Can’t find file for asset. /Game/Vive/Controller/controller_vive_substanceambient_occlusion
Error /Game/Vive/Controller/Controller_Vive_MAT : Can’t find file for asset. /Game/Vive/Controller/controller_vive_substancenormal
Info Failed to load /Game/Vive/Controller/controller_vive_substancenormal.controller_vive_substancenormal Referenced by MaterialExpressionTextureSample_3
Info Failed to load /Game/Vive/Controller/controller_vive_substanceambient_occlusion.controller_vive_substanceambient_occlusion Referenced by MaterialExpressionTextureSample_4
There is an error in Controller_Vive_MAT where a texture is not found:
Error [SM5] (Node TextureSample) TextureSample> Missing input texture
Hi, I am having a slight problem with a teleport offset when in ‘Ultra’ mode (in the 1.5 template). Basically, I have it set up so that you can be in Ultra mode (world scale x 30), then using the teleport system you can point down at the map below you and simultaneously teleport to the selected spot and shrink the world scale back down. The shrinking back down works fine; however, the location I teleport to always ends up off by roughly the same amount every time. This varies depending on where within the Vive bounds I am standing when I make the teleport action. At normal size all the teleport locations work perfectly.
Basically I assume I need to offset from something else when teleporting in ‘Ultra’ mode due to the world scale, but I can’t work out what. Anyone have any idea what exactly I should be offsetting?
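I don’t know the template’s exact teleport setup, so this is only a guess at the math: the HMD’s offset inside the play area is in tracking space, so when compensating for it during an Ultra-mode teleport it would need to be multiplied by the current world scale before being subtracted from the world-space target. A toy 2D version (all names hypothetical):

```cpp
// Guess at the Ultra-mode teleport fix: the pawn origin must land so the
// HMD, not the pawn, ends up on target, and the HMD's room-space offset
// covers worldScale times as many world units when the world is scaled.
struct Vec2 { double x, y; };

Vec2 TeleportDestination(Vec2 target, Vec2 hmdRoomOffset, double worldScale) {
    return { target.x - hmdRoomOffset.x * worldScale,
             target.y - hmdRoomOffset.y * worldScale };
}
```

At normal size worldScale is 1 and this reduces to the plain subtraction, which matches the case that already works for you.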
I note that the Level Blueprint is calling:
r.setres 2560x1440
r.ScreenPercentage 150
(8 294 400 rendered pixels per frame, before downsampling)
I’m new to Unreal, but if I understand correctly, r.setres should be the actual resolution of the display panels,
and r.ScreenPercentage is used to set the offscreen render resolution which is then downsampled to the ‘setres’ resolution for output.
According to Valve
http://.vlachos.com/graphics/Alex_Vlachos_Advanced_VR_Rendering_GDC2015.pdf
panel resolution is 2160x1200, and oversampling is recommended to be 1.4x.
So, shouldn’t these be:
r.setres 2160x1200
r.ScreenPercentage 140
(5 080 320 rendered pixels per frame, before downsampling)
With this change, we are only rendering 61% as many pixels, which has got to be good for performance!
Even if you wanted to set r.ScreenPercentage to higher than 140, r.setres should always match the panel, shouldn’t it?
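For what it’s worth, the pixel counts quoted above check out if you scale each axis by ScreenPercentage/100 (a sketch; the engine’s actual per-axis rounding may differ slightly):

```cpp
// Pixels rendered per frame before downsampling: each axis of the output
// resolution is multiplied by ScreenPercentage / 100.
long long RenderedPixels(int width, int height, int screenPercent) {
    double s = screenPercent / 100.0;
    long long w = static_cast<long long>(width * s + 0.5);
    long long h = static_cast<long long>(height * s + 0.5);
    return w * h;
}
```

2560x1440 at 150% gives 8,294,400 pixels and 2160x1200 at 140% gives 5,080,320, i.e. about 61% of the former.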