VR Game Template

[=;188482]
Hello, this is actually really simple to achieve (and I had seen it attempted back with the DK1 and the Hydra). I haven’t integrated it into the template yet (although I might soon), but if you want a working example, here’s a project I threw together. It’s bare bones, but it has everything working just like EvilMech Runner.
[/]

Thank you so much for this! This is great! The only problem (which I have also encountered in my testing) is that when you tilt your head up and down (pitch-wise) while standing still, you also move. I haven’t been able to compensate for it through orientation tracking. It seems there is a seepage of data from the Rift’s LED markers that is being interpreted as positional change instead of rotational. I thought about isolating one LED and tracking it only for position, to get narrower and more accurate data for positional tracking on the Y axis, but that would require dabbling with the SDK, which isn’t within my capabilities. Do you have any idea how to limit the tracking so that only perpendicular movement is translated into movement?

Hi ArieCrow, I had a similar problem to yours in my own project… there is no seepage of data from the Rift’s markers; what is happening is that the “Device Position” provided by the SDK is the location of the front part of the Rift, which is NOT the center of rotation of your neck.

You have to get “Device Rotation”->“Get Axes”->“Break Vector” and subtract the change in height caused by the pitch rotation of your head from “Device Position”->“Break Vector”->“Z”. It’s not perfect though, since you have to approximate using a factor which actually changes depending on the size of the head (I guess you could also approximate by using the IPD, but I didn’t go that far). Still, it’s better than no correction at all!
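For anyone who wants to prototype the correction outside of Blueprint, here’s a minimal sketch of the idea in Python. The neck-to-HMD distance is the per-head fudge factor mentioned above; the 18 cm default here is just an assumed placeholder, not a measured value.

```python
import math

def corrected_device_height(device_z, pitch_deg, neck_to_hmd_cm=18.0):
    """Remove the apparent height change caused purely by head pitch.

    neck_to_hmd_cm is the assumed distance from the neck's pivot point
    to the front of the HMD -- it varies per user, as noted above.
    """
    # Pitching the head down swings the HMD front downward by roughly
    # neck_to_hmd_cm * sin(pitch); add that back so only genuine
    # vertical translation of the player remains.
    return device_z + neck_to_hmd_cm * math.sin(math.radians(pitch_deg))
```

With zero pitch the correction vanishes; at 30° of downward pitch and an 18 cm offset it adds back about 9 cm. Whether you add or subtract depends on your engine’s pitch sign convention.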

Hello again.

After some work we finally found a way to make the highlight work without the Oculus; we just changed an element in one of your blueprints, and it works both with and without the Oculus. I can take some screenshots if you want to see what we did.

[=PatimPatam;189776]
Hi ArieCrow, I had a similar problem to yours in my own project… there is no seepage of data from the Rift’s markers; what is happening is that the “Device Position” provided by the SDK is the location of the front part of the Rift, which is NOT the center of rotation of your neck.

You have to get “Device Rotation”->“Get Axes”->“Break Vector” and subtract the change in height caused by the pitch rotation of your head from “Device Position”->“Break Vector”->“Z”. It’s not perfect though, since you have to approximate using a factor which actually changes depending on the size of the head (I guess you could also approximate by using the IPD, but I didn’t go that far). Still, it’s better than no correction at all!
[/]

Thanks PatimPatam :slight_smile: That’s a whole lot better (even with the micro slip-ups). I’ll try getting into the IPD too. It’s really fun to walk freely and use your hands with the Leap Motion to interact with objects. It really adds to the immersion.

[=;190559]
Hello again.

After some work we finally found a way to make the highlight work without the Oculus; we just changed an element in one of your blueprints, and it works both with and without the Oculus. I can take some screenshots if you want to see what we did.
[/]

Yes, please do this.

[=veiovis;192210]
Yes, please do this.
[/]

Here is what you have in the template (VRPlayerController->CheckForInteractive): http://i.imgur.com/BbIRwWr.png and we simply changed it to this: http://i.imgur.com/322U0vN.png
I would recommend you check that it works, but it should (like I said, it works for us).

Has nobody else had a blackout when entering a trigger volume? I just can’t use the blueprint because of this bug!

Hi. Thanks a lot for making this, mate. Instead of getting swamped trying to figure out Unreal, I’m now getting swamped trying to figure out how you made what you made. Trust me, that’s a huge improvement!

I’m trying to figure out how to edit the character’s skeleton behavior. More specifically, I think the way the exploration character moves his torso based on head motion (DK2) works very well. I would like to apply that to a seated character (so with immobile legs), combined with the hands moving corresponding to how the controls are manipulated (like in the cockpit game). I have no idea where to begin; just what file should I edit in what menu? Help would be much appreciated.

How to adjust body positions

Hi , thanks a lot for making this; it’s a great help for someone like me, struggling to come to grips with Unreal Engine 4 with limited programming experience, for the purpose of experimenting with the DK2.

I’m trying to make a seated experience. To do this, I aim to have the hand/arm/shoulder move based on the controller input (like in your space shooter). However, I would also like to have the torso move along with the DK2’s head position (like in your first-person explorer).

I’m trying to figure out just where in the blueprint those two things happen, so that I can learn from/integrate/steal them. Also, I’m trying to figure out how to find/change the skeleton’s base position in the first place.
In the spaceshooterpawn eventgraph I see the ‘update IK’ bit, which sets the hand positions to those determined by the controls. But I don’t see how it uses that info; it feels like I’m missing something vital.

Any help would be much appreciated.

I don’t know if this is supposed to be here or not, but there is a low-opacity overlay of concentric circles on my viewport, whether in game mode or not. Tested on both 4.5.1 and 4.6. Is there a way to change this?
[/]

Yup, I also noticed this issue; don’t know why. Anyone know?

[=;193206]
Has nobody else had a blackout when entering a trigger volume? I just can’t use the blueprint because of this bug!
[/]

I had the same problem, and I noticed that if I lower my trigger box so that the player’s head does not hit the box and only the player’s body triggers it, the blackout disappears.

I’ve been having trouble launching my Rift projects to show full screen properly: forums.unrealengine.com/showthread.php?54935-Full-screen-not-working-properly-(Image-posted)&p=196176#post196176

So I came across this template and tried it, and I get full screen! What part of the template do I need to be looking at to understand how you achieved this?

Also, I’m currently finding that my FPS goes to either about 75 or 35, nowhere in between really. I have to drop the screen percentage to under 100 to get it to 75. I have an AMD Radeon R9 200 graphics card and 16 GB of RAM. That’s pretty good, right? Is it strange that I’m struggling to get to 75 at 100%, or is it more to do with Unreal 4 not adapting to the Rift right now? You mention in the video that Unity handles it better? Just wondering whether to upgrade my gear or just continue developing and hope that Unreal catches up.
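On the 75-or-35 split: that pattern is characteristic of vsync. If a frame misses the 75 Hz refresh deadline, the display waits for the next refresh, so the rate snaps to 75, 37.5 (often reported as ~35), 25, and so on. A rough illustration, assuming a fixed 75 Hz refresh:

```python
import math

def effective_fps(frame_ms, refresh_hz=75.0):
    """With vsync on, a frame that misses the refresh deadline waits for
    the next refresh, so the displayed rate snaps to refresh/1, refresh/2, ...
    """
    budget_ms = 1000.0 / refresh_hz                      # ~13.3 ms at 75 Hz
    intervals = max(1, math.ceil(frame_ms / budget_ms))  # refreshes consumed
    return refresh_hz / intervals
```

So lowering the screen percentage isn’t a hack; it’s the standard way to pull the frame time back under the ~13.3 ms budget.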

Another question (so sorry!): does it really make sense for anyone to use these templates in their projects? For example, there is a tonne of instructions going on; is the idea for people to take the template and delete what they don’t need, to save on loading times and performance? As you can probably tell, I’m very new to developing and wondering what people generally do with these templates. My instinct is to use a blank canvas, because then I’m only putting in what is needed. Do people 1) use them just as a learning tool, or 2) build on top of them and add extra content, or 3) tweak them to suit their needs?

I don’t know if this is supposed to be here or not, but there is a low-opacity overlay of concentric circles on my viewport, whether in game mode or not. Tested on both 4.5.1 and 4.6. Is there a way to change this?
[/]

Yup, I also noticed this issue; don’t know why. Anyone know?
[/]

Remove the Atmospheric Fog actor from the level and/or the skysphere, and the circles will disappear. I do not know what it is in the template that causes this effect.

okay, thx, will try it

This was bugging me too. It’s actually a very simple fix, and the answer was in the demo scene all along. If you look at the volume triggers inside the blueprints of the percentage changer, you will see that the fade to black doesn’t occur in those volumes, but it will occur if you collide with either of the demo interactive objects. Now, go inside the VR Player Controller and, in the Event Graph, look at the section labeled “Show Camera View If Out of Camera Range or Colliding with Object”. Right after the first branch, you see it making a call to the Post Processing Volume. Shortly after, you also see it set up an array of collision types (World Static, World Dynamic…); notice that Project isn’t listed in that array.

Anyhow, it was a really simple fix. It just needed a little investigation.

I asked a lot of questions before, so now I will try to make my confusion simpler to understand. All I’m after is a template with decent-quality graphics running properly on a Rift. Are there any tutorials on how to cut the template down to its bare basics?

This is probably a stupid question, but I’m pretty new to UE4 and still trying to figure out the best way to do things. How would I go about accessing the position of the HMD from another actor?

I’m trying to do head tracking on a character, and while I can get the position of the player pawn or the player character, neither of these seem to follow the user’s head location when using the Rift.

I’ve tried accessing the “HMDWorldLocation” vector in the VRPlayerController, but it doesn’t seem to do anything for me. Should this work for what I want to do? If so, could someone be so kind as to tell/show me the correct way? I’d really appreciate it.

This is what I’ve got currently:

However that doesn’t seem to track anything. Here’s what I had originally:

This tracks the camera correctly, but doesn’t track the Rift as it moves around.

Edit: I see that the “Get Orientation and Position” node returns the Rift’s relative position from head tracking, but it seems that adding it straight to the camera’s location doesn’t work. Anyone played around with this before?

Edit 2: Solved. While I wasn’t able to access the HMDWorldLocation variable helpfully implemented in the template, duplicating the method used to calculate it worked perfectly within the actor that needed to do the tracking. For reference, this is the calculation:

The last vector addition shouldn’t normally be needed, but as I was using it for head tracking I found it gave a better visible result.
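I don’t have the template’s exact graph to hand, but the general shape of such a calculation is: rotate the HMD’s tracked offset by the pawn’s yaw, then add the pawn’s world location. A minimal sketch in Python (function and parameter names are my own, not the template’s, and a yaw-only rotation is assumed):

```python
import math

def hmd_world_location(pawn_loc, pawn_yaw_deg, hmd_offset):
    """Convert the HMD's tracked offset (pawn-local) into world space.

    pawn_loc:     (x, y, z) world location of the pawn
    pawn_yaw_deg: pawn's yaw in degrees (yaw-only rotation assumed)
    hmd_offset:   (x, y, z) offset reported by head tracking
    """
    yaw = math.radians(pawn_yaw_deg)
    ox, oy, oz = hmd_offset
    # Rotate the offset around the vertical axis by the pawn's yaw.
    wx = ox * math.cos(yaw) - oy * math.sin(yaw)
    wy = ox * math.sin(yaw) + oy * math.cos(yaw)
    return (pawn_loc[0] + wx, pawn_loc[1] + wy, pawn_loc[2] + oz)
```

Without the yaw rotation, the tracked offset is applied in the wrong frame whenever the pawn turns, which is a common reason “adding it straight to the camera’s location” fails.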

Big thanks to for the awesome template!

Anyone know how to enter the context menu to change the settings? In the 0.3 video it appears when looking at the two spheres, but that isn’t happening for me. It sounds like a key is being pressed?

[= Capone;199732]
Anyone know how to enter the context menu to change the settings? In the 0.3 video it appears when looking at the two spheres, but that isn’t happening for me. It sounds like a key is being pressed?
[/]

Approach the text on the wall over to the left of the two spheres (it tells you the corresponding buttons; I think it’s Q and E).
The spheres react to head orientation, not to buttons.

Q and E are for the screen percentage adjustments; I’m asking how to get to the specific menu right here: