The MainMenu map is a single-player map, so you can use any pawn, or even skip VR entirely and use mouse and keyboard for the menus.
The pawn spawning in the MainMenu is found within MainMenuGM under “MainMenuPawn”.
In the latest version I kept just one “Avatar_Master” pawn and changed its color depending on the Avatar choice, since there was no point in having four identical pawns.
So in brief, you could keep your pawns in the MainMenu, because everything is reset in the next level, except the GameInfoInstance, the GameState, and everything saved under PlayerSaveGame.
I think I’ll now keep three versions that will be identical with one exception: version 1 will be single-player, version 2 will be multiplayer with LAN/SteamVR, and version 3 will be multiplayer with OculusNetwork and Avatars. Because there are fundamental differences in session logins, and because you can’t have any references to SteamVR in an Oculus game, it is simply not possible to have a single template that works with both SteamVR and Oculus from the same .exe.
Hello,
The template is awesome and already works very well. We did hit one issue: the second Vive player has problems with teleportation. It looks like the player who joins the session uses the chaperone from the host’s Vive system, and the teleport does not match the teleport preview of the second player. Our Vives have very different play area sizes, which is why the difference is around 3 meters. Is it a bug, or did I mess up the configuration somewhere?
We would also enjoy a network mode where you can join by just entering the IP address in the main menu or a config file. In times of VPNs and network simulation tools, this might be handy for some people.
Thank you a lot for this nice piece of work. Best wishes.
Update: I added a screenshot and a description:
Screenshot from the Vive viewpoint of the client
The chaperone and avatar you see are from the host
The client’s Vive area is less than half the size of the host’s -> wrong area shown in the teleportation preview
The zero point does not match - this is actually pretty bad, because it is very hard to tell where you will end up after teleporting. This is probably only this bad when the Vive areas are quite different; with areas of more or less the same size you might not even notice the difference between teleportation target and result.
Depending on which player uses the view-chaperone button, the chaperone of that Vive station is used and visible for both players
The controller rotation for teleport direction is totally awesome!
Of course I cannot try every scenario, so it’s with feedback like this that everything ends up working well.
I’m currently rebuilding the Login menus to be able to select sessions. I’m presently cursing the Oculus Subsystem processes by every name I know.
Also, a small thing I forgot in 2.2: if you select the Vive or Touch controllers as the mesh, the incorrect material will be applied. The quick fix is to add a Controller Mesh Switcher in MotionControllerBP/Controller Initialization:
Also, I know the Vive Tracker mesh has some holes. That’s because I did a rapid STL (taken from the HTC Vive Tracker website) to FBX conversion without fine-tuning it in Maya. I’ll give it an hour of love in Maya and everything will be fine.
Do you know of a good way to clamp the trigger input for the hand animations to, say, 99%? I’ve noticed that on Vive controllers with a “sticky” trigger, the hand animation will stay closed unless you give the trigger a light flick. I’ve seen this with one of my home controllers and a few around the office. It’s certainly a hardware defect, but a common one. VRTK had a similar issue early in its development, but someone must have sorted it out later on, because that issue went away.
Maybe use exclusively the axis value coming from ViveTrigger (or, in fact, MotionController R/L Trigger Axis) rather than the pressed/released values found in MotionController R/L Trigger.
What I would do is clamp the axis value between 0.1 and 0.9, then rescale it between 0 and 1.
If it’s a common occurrence, I’ll put it in the next version.
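The clamp-and-rescale suggestion above can be sketched engine-free. The function name and the 0.1–0.9 band are illustrative, not taken from the template:

```cpp
#include <algorithm>

// Illustrative helper (not from the template): treat anything below 0.1 as
// fully released and anything above 0.9 as fully pressed, then rescale the
// remaining band back to [0, 1]. A "sticky" trigger that never returns all
// the way to 0 (or never quite reaches 1) still drives the hand animation
// to its fully open/closed poses.
float RemapTriggerAxis(float Raw)
{
    const float Lo = 0.1f;
    const float Hi = 0.9f;
    const float Clamped = std::clamp(Raw, Lo, Hi);
    return (Clamped - Lo) / (Hi - Lo);
}
```

Something like this could sit between reading the Trigger Axis value and feeding it to the hand animation blend.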
2.2 is awesome and fixed all the problems I had previously, amazing work! I now have multiple Vives connected without much of an issue. One thing I stumbled upon and could not figure out yet: you cannot teleport while you’re holding the trigger down. Any idea how to make it so you can teleport regardless of whether you’re using the trigger?
Still having issues with stereoscopy in packaged versions. Running the MainMenu map in VR Preview works fine, but after packaging the stereoscopy is wrong (very visible on the hands - the image seems to be the same in both eyes).
I also tried setting a delay of 1 second before calling the scalability settings inside the PC, as you suggested (I could not put the delay just before “stereo on”). This worked in the past version, but it no longer seems to fix the problem in 2.2.
Any advice? Anybody else experiencing the same issue?
@xN31 Avatars will be brought in via a plugin, since the Oculus plugin included in the 4.15 launcher build doesn’t have any Avatar-related functions, and you also need the “find friends” function, which isn’t exposed either. It’s mostly up to Oculus, from whom I’m waiting for the green light, because Avatars for UE4 have not been publicly released yet. It should be within a month max.
For now, VOIP doesn’t work out of the box via Steam or Oculus. I’m looking into it.
I will say I don’t experience the issues xN31 is having; I have everything packaging and working fine, even after heavily modifying parts of it. The only problem I can’t figure out so far is the teleport arc getting stuck when holding the trigger (even if you’re not grabbing something).
For anyone who cares, I figured it out. The hand is of the WorldStatic object type and interferes with the arc calculation when it is closed (gripping). I ended up adding the hand to the “Actors to Ignore” pins on the “Predict Projectile Path by Object Type” and “Line Trace For Objects” nodes responsible for the teleportation arc.
Has anyone been able to figure out where the trace for the debug pointer is called? I swear I’ve looked through everything and can’t find it. The only thing I was able to find was the bool for turning it on and off, but not the actual nodes that draw the trace.
I’ve added my own custom pointer, but I figured it’d be best if it leaned on the trace you can turn on from VRSettings.
The debug trace is called from the WidgetInteraction component inside MotionControllerBP. It is driven by “Show Debug” under “Debugging”. As you found in the blueprints, this boolean is exposed to blueprints and can be toggled.
There are no nodes drawing it (it’s not a trace line as found in the BP trace functions; it’s a different trace found in WidgetInteractionComponent.h).
What I’m thinking of doing is adding an option in VRSettings to turn the debug line off outside of the MainMenu, or only enabling it when a widget-based menu is opened. Either way, it’s always a matter of turning “Show Debug” on or off in the Widget Interaction Component.
You could also link it to a standard trace, but I don’t see the point.
Thanks, great. Any idea about the stereo issue I mentioned? Both eyes seem to render the same image. The issue is not very visible on far-away objects, but it is on hands and other close objects. I’m currently testing the template on the Rift.
Hey, finally got around to giving multiplayer a test. Server-side, everything felt buttery smooth; I could even hold the client’s controller and it pretty much felt like my own. Client-side, the teleportation was extremely laggy, and the other player’s movement was equally laggy. I tested on both LAN and over Steam, letting each computer have a try as the host in both settings, but got the same results either way.
Completely fresh project, running two Vives over a wired network. Hope this helps.
Quick update about the stereoscopy issue. As I said, the problem occurs only in packaged builds (including the binary one you packaged). The cause seems to be the IPD being set to zero (if you are experiencing this issue, you can check with the console command “stereo show”). Apparently, the Get User Profile node inside “Scalability Settings” in the PC is returning 0 as the IPD value. Forcing a value (e.g. “stereo e=0.064”) fixes the issue.
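The workaround described above boils down to not trusting a zero IPD from the profile query. A minimal engine-free sketch of that fallback, with hypothetical names (the template’s actual logic lives in Blueprint nodes, not C++):

```cpp
// Illustrative fallback (names are hypothetical): if the profile query
// returns 0 for the IPD, substitute a sane default instead of applying it,
// which is what the manual console command "stereo e=0.064" achieves.
float ResolveIPD(float ProfileIPD)
{
    const float DefaultIPD = 0.064f; // metres, roughly the average adult IPD
    return (ProfileIPD > 0.0f) ? ProfileIPD : DefaultIPD;
}
```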
I am sorry, I just came back. Good job, man, and thanks for all your hard work; this template is very, very helpful. I can’t wait to test v2.2 tomorrow.
@pixelvspixel OK. I’ve tested a bit less with Vives, so I’ll do my homework. @xN31 I’ll remove the automatic IPD setting; it’s superfluous anyway, so to avoid problems it’s better to remove it.
For the rest, I’ve taken note of all the problems. I have some important university duties in the next few days, and I’ll issue another version addressing these problems ASAP.
Finally had time over the weekend to play a little.
I’m very new to the UE4 engine, so I’m not really sure of the best way of doing things. But if you get 5 minutes: www.lamb.uk.net/ue4/ChangedFiles.zip
Note: I really dislike teleportation, so I disabled the teleport links and added a grab-type move using the triggers (the old trigger function moved to the Grab function on the Vive). Use the trigger on either wand to “grab” the world and pull yourself forward.
I don’t have access to an Oculus, so I haven’t altered those functions; if you get a chance, please have a look on the Vive.
I’ve added a return node to your MotionController BP to return the actor reference to the Avatar Master when it tries to grab an object.
Inside the Avatar Master actor I’m now passing around the relative locations of the Vive wands.
I’ve added a movement function to the Pickup Actor Interface to pass the latest offset movement of the wands to any actor using the interface.
This then lets me, via the added InteractiveDrawDoor actor, have doors and drawers whose movement is replicated on the server, so everyone sees the changes.
I’ve added an example of sliding/rotating doors and drawers to your Multimap01 level to show the end result. Note: with a proper mesh used as a door, with the pivot point on the hinge side, it rotates correctly. I haven’t embarrassed myself by including the rubbish door mesh I used in testing, and the cube static mesh’s centered pivot point makes the cube rotate about its middle.
One question: I now seem to have a lot more numbers being replicated around, because I need offset readings relative to the last reading I took. You’re already passing the final positions of the wands around the network. Is there a better way of replicating the recent relative movement of the wands?
It seems to work and is replicated across the network, but I have a gut feeling I’m overworking the task.
The files in the ZIP should be dragged on top of your existing raw 2.2 template.
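For what it’s worth, since the final wand positions are already replicated, each machine could derive the per-frame offset locally from the last position it saw, instead of replicating the offsets as extra data. An engine-free sketch of that idea (Vec3 and the names are placeholders standing in for the template’s types, e.g. FVector):

```cpp
// Placeholder vector type standing in for UE4's FVector.
struct Vec3 { float X, Y, Z; };

// Derive the offset locally from two successive replicated positions,
// so only absolute positions ever cross the network.
Vec3 ComputeDelta(const Vec3& Prev, const Vec3& Curr)
{
    return { Curr.X - Prev.X, Curr.Y - Prev.Y, Curr.Z - Prev.Z };
}
```

Each client would keep the previous frame’s position and call this per tick, which avoids replicating a second stream of numbers.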