[Gear VR] Engine Changes Requested (for Mobile VR Jam)

Has anyone else noticed this ‘effect’?:

If you turn your head very fast you can catch the ‘edge’ of the render - it’s hard to explain, but it’s as if the engine is a frame or two behind your movement, so you get a reduced field of view in the direction you’re turning until it catches up. I can see this effect on the GearVRLighting template as well at 60 FPS. In my prototype, where there is more going on and the player is often required to turn quickly, it becomes more noticeable :frowning: It’s only about 5% of the side of the screen missing and only for a split second, but once you notice it you can reproduce it …

Yes, I noticed the same thing earlier. It’s like the outer edge of the field of view sticks in place for a moment, in the direction you’re turning.

Huh, the outer edge should only bring in black via async timewarp. My guess is that you’re actually seeing timewarp tearing, which shouldn’t happen, but occasionally can if the GPU is stuck on a very time-consuming draw call. If you have a single draw call with a ton of time-consuming geometry in it, try breaking it up into two or more draw calls - this is totally counterintuitive, but it can sometimes be enough to fix any tearing issues.

I haven’t had a chance to clean it up and check it in, but here’s a change to CalculateStereoViewOffset() that should give you a basic head/neck model that matches what we were using for DK1:



	if( StereoPassType != eSSP_FULL )
	{
		check(WorldToMeters != 0.f);

		const int idx = (StereoPassType == eSSP_LEFT_EYE) ? 0 : 1;	// eye index (0 = left, 1 = right)
		const float EyeMul = (StereoPassType == eSSP_LEFT_EYE) ? -0.5f : 0.5f;
		const FVector HeadModel(0.12f, 0.0f, 0.17f);	// head/neck offset in meters

		// Per-eye offset along view-space Y, scaled by the IPD.
		FVector TotalOffset = FVector(0, InterpupillaryDistance * EyeMul, 0);

		if ( 1 ) //bUseHeadModel)
		{
			TotalOffset += HeadModel;
		}

		TotalOffset *= WorldToMeters;

		ViewLocation += ViewRotation.Quaternion().RotateVector(TotalOffset);

		// The HMDPosition already has HMD orientation applied.
		// Apply rotational difference between HMD orientation and ViewRotation
		// to the HMDPosition vector.
		const FVector vHMDPosition = DeltaControlOrientation.RotateVector(CurHmdPosition);
		ViewLocation += vHMDPosition;
		LastHmdPosition = CurHmdPosition;
	}


I somehow read this in the voice of Geordi LaForge and have been laughing about it softly ever since.

Based on what you said, I think I know what might be causing it.

Thank you for sharing the bleeding edge stuff and being there with and for us, JJ.

:slight_smile:

Of course it’s also possible that it’s something different about 0.5.0 that I haven’t tested yet. I’ll take a look tomorrow.

I pushed a build to my device and wanted to test the global menu, but it didn’t work. I went into Input and set up the global menu under Action Mappings, but I assume I’m doing something wrong. I also saw that you mentioned an OVRGLOBALMENU command in your first post - is this a console command? I wasn’t able to find it when I did a dump of the commands. I tried to set up swipes too, but maybe that’s for the controller only? Any direction would be most helpful if you wouldn’t mind. I’m pretty much restricted to blueprint as I can’t code C++, so hopefully Epic will give us some more as we go, especially with VR Jam having started. Thanks.

Currently, the global menu is implemented via an exec command. So you would detect the 0.75-second press in a blueprint and fire off the OVRGLOBALMENU exec command at the end. The blueprint could also display UI feedback during the press.

At some point we need to add a sample blueprint to demonstrate this, but it should be fairly simple.

Still looking into the gestures. If we add that, we might automate the global menu a bit more as well.
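
In the meantime, for anyone who ends up doing this in code rather than blueprint, here’s a rough C++ sketch of that hold-then-fire logic (the class, handlers, members, and the “GlobalMenu” action name below are made up for illustration; the only real pieces are the OVRGLOBALMENU exec command and the 0.75 second hold discussed above):

	// A minimal sketch, not engine code: AGlobalMenuPawn, its handlers, and the
	// "GlobalMenu" action mapping are hypothetical names.
	#include "GameFramework/Pawn.h"
	#include "GameFramework/PlayerController.h"
	#include "Components/InputComponent.h"

	// Assumed members in the class declaration (and PrimaryActorTick.bCanEverTick = true
	// in the constructor so Tick actually runs):
	//   bool  bGlobalMenuKeyDown;
	//   float GlobalMenuHeldTime;

	void AGlobalMenuPawn::SetupPlayerInputComponent(UInputComponent* InInputComponent)
	{
		Super::SetupPlayerInputComponent(InInputComponent);

		// "GlobalMenu" is whatever action you mapped to the back key under Project Settings -> Input.
		InInputComponent->BindAction("GlobalMenu", IE_Pressed, this, &AGlobalMenuPawn::OnGlobalMenuPressed);
		InInputComponent->BindAction("GlobalMenu", IE_Released, this, &AGlobalMenuPawn::OnGlobalMenuReleased);
	}

	void AGlobalMenuPawn::OnGlobalMenuPressed()
	{
		bGlobalMenuKeyDown = true;
		GlobalMenuHeldTime = 0.0f;
		// This is also where you would start showing UI feedback for the press.
	}

	void AGlobalMenuPawn::OnGlobalMenuReleased()
	{
		bGlobalMenuKeyDown = false;
	}

	void AGlobalMenuPawn::Tick(float DeltaSeconds)
	{
		Super::Tick(DeltaSeconds);

		if (bGlobalMenuKeyDown)
		{
			GlobalMenuHeldTime += DeltaSeconds;
			if (GlobalMenuHeldTime >= 0.75f)
			{
				bGlobalMenuKeyDown = false;	// fire only once per press

				// Equivalent to the Execute Console Command node in blueprint.
				if (APlayerController* PC = Cast<APlayerController>(GetController()))
				{
					PC->ConsoleCommand(TEXT("OVRGLOBALMENU"));
				}
			}
		}
	}

A timer would work just as well as the Tick-based counter; the counter just keeps the sketch independent of timer API differences between engine versions.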

@a.carter1182:

Yes, OVRGLOBALMENU is a command that you call with an Execute Console Command node. You may find this command’s definition, along with others for the Gear VR, in Engine/Plugins/Runtime/GearVR/Source/GearVR/Private/GearVR.cpp within FGearVR::Exec(). You don’t want to use the Global Menu gamepad event, as I believe that is for the Xbox One.

As for swipes, that is item number 4 in my original post, and they are not implemented yet. I haven’t yet looked into what it would take to simulate swipes with your own code. I’ve purposefully designed my Jam entry to not require swipes, knowing that they may not make it in before the deadline. Sorry.

Edit: Looks like JJ beat me to it. Thanks, JJ!

And I believe the key you want to map in input is either Escape or Back. I don’t have it in front of me right now.
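
For reference, once the action mapping is set up it lands in Config/DefaultInput.ini as something like the following (the “GlobalMenu” action name is just an example, and as noted the key may turn out to be Back rather than Escape):

	[/Script/Engine.InputSettings]
	+ActionMappings=(ActionName="GlobalMenu",Key=Escape,bShift=False,bCtrl=False,bAlt=False,bCmd=False)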

Heya VR Peeps.

I’ve just checked JJ’s virtual head model hack into my VRJam_4.7 branch for you all to grab:

https://github.com/DavidWyand/UnrealEngine-Fork/tree/VRJam_4.7

In my opinion, this really improves the user’s experience. A simple head roll now allows you to look around corners.

Enjoy!

Yeah, sorry for the poor description, but that is it - there’s a vertical black bar on the side you’re turning toward - so it must be this timewarp tearing. By draw calls do you mean the HUD blueprint node “Draw HUD”? I do have a rather large translucent crosshair texture in there - maybe I’ll try pulling it out and putting it in 3D space in front of the camera - it would probably look better anyway …

Black creeping in on the edges when turning your head rapidly is normal and expected. There are steps we can take to reduce it further, but it’s expected to be there. The async timewarp is the Oculus code waiting until the very last moment to sample the headset orientation one last time and slide the rendered frame into the very latest position on the screen. This brings in black on the edges where there isn’t information because it was off screen at render time. We could, for example, render with a wider FOV when turning rapidly in order to have a wider frame to sample from in the timewarp, but there are pretty severe perf implications to that which can just make things worse.

What I was referring to as ‘timewarp tearing’ is when you see a vertical shear line in one eye where you see one frame on one side of the line and a following frame on the other. That shouldn’t happen, and is usually caused by taking up too much GPU time in one draw call.

Large translucent draws are usually bad for GearVR perf in general, since overdraw will tend to kill perf. Draw HUD isn’t inherently bad, it just depends on what you do with it. On Note 4 class hardware, you want to touch pixels as few times as possible. Large translucent areas are bad for that.

Great, thanks for the explanation and description, JJ!

Thanks, JJ, for the help. I’ll make those changes after work today.

Hey, thanks for the great work! Just a question: I’ve got a blueprint project I want to deploy on Gear VR, so I added an empty code class to the project. Unfortunately, in 4.7.5 on Mac this seems to break the Android packaging.

Am I doing something wrong?

I’m using your branch and I’m not able to deploy. I’m getting a security error message saying it isn’t signed for VR, which seems like an ossig issue. I put the ossig file in the following folder: Documents\Visual Studio 2013\Projects\UnrealEngine\Engine\Build\Android\Java\assets, which is equivalent to where I had it in the default installation from the Launcher (and that was working fine). I’m launching to the device with “By the Book” and “Shipping”, and I’ve also tried every other setting. JJ, is this the canonical location for the ossig file? I tried putting it in the Build folder in the project as well, but with no success. I’ve restarted both the editor and the phone.

Did you make sure to enable the Configure the Android Manifest for deployment to GearVR checkbox in the Project Settings under the Android section? I don’t have my Gear here, but I am pretty sure that will fix your issue. You can read more about Gear VR setup here in the Gear VR Quick Start Guide (I linked to page two because that has all the Android setup steps in it).
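
If it’s easier to double-check outside the editor, that checkbox is stored in your project’s Config/DefaultEngine.ini. In the engine builds I’ve looked at it shows up as the flag below, though the exact variable name may differ in your version, so treat this as a hint rather than gospel:

	[/Script/AndroidRuntimeSettings.AndroidRuntimeSettings]
	bPackageForGearVR=True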

Yes, that checkbox is checked. I have narrowed it down to this: using either the batch file or the Project Launcher, it works fine in the ‘Shipping’ configuration but not in the ‘Development’ configuration. However, the ‘Shipping’ configuration does not show me my log messages.

Hi, if you have time could you elaborate on this a bit? I didn’t experience the DK1 and am trying to visualize the mechanic and understand it before incorporating the code. I’m interpreting that last sentence to mean that if you roll or tilt your head left or right, it causes the view to move forward and around an object? In my mind that doesn’t sound appealing or desirable, so I think I’m misunderstanding.

I just read through your thread here ([Gear VR] Virtual Head Model Support?) and the referenced document (https://docs.unrealengine.com/latest/INT/Platforms/Oculus/BestPractices/index.html#virtualheadmodel), and unfortunately I’m still not able to visualize what this does exactly. I’ve read a few times now that it’s needed and important, but why?

Never mind, see the following post.