You definitely need a camera to use the Oculus. If you want to be able to move said camera, it will have to be on a pawn, and if you want to be able to walk around, you need a Character (which is a specific type of pawn). If you don't have your Character Blueprint dragged into your scene, it will be created wherever the PlayerStart actor is located. And yes, you need a GameState, GameMode, and PlayerController to play it. If you make your own and want to use them, go to Edit > Project Settings > Maps & Modes and switch them under the Default Modes tab. If you want to test your scene on your Oculus, all you have to do is press the drop-down arrow on the Play button and select VR Preview (your Oculus needs to be plugged in and on before you open your project). If you want to save changes to speed and whatnot, you HAVE to make those changes while you are not in Play mode (and press Ctrl+S to save them). Hope that helps!
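For reference, the choices you make under Maps & Modes are saved into your project's Config/DefaultEngine.ini, so you can sanity-check them there too. A minimal sketch of what that section looks like, assuming a hypothetical GameMode Blueprint at /Game/Blueprints/MyGameMode (substitute your own asset path):

```ini
; Config/DefaultEngine.ini -- written by Edit > Project Settings > Maps & Modes.
; The asset path below is an example, not a real project asset.
[/Script/EngineSettings.GameMapsSettings]
GlobalDefaultGameMode=/Game/Blueprints/MyGameMode.MyGameMode_C
```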
Do I need to set up a camera, pawn, player controller, etc. if I want to package content to use on the Oculus Rift?
I’ve been learning Unreal for the past two months and now have a setting I am happy with that I would like to start testing using an Oculus Rift DK2. I’ve noticed that when I play my level within the editor, certain objects(?) are automatically generated in the World Outliner, such as a CameraActor, DefaultPawn, GameState, PlayerController, etc. I am wondering whether I need to create or define these “objects”/settings prior to compiling and packaging for use on an Oculus, and if so, how to create them and set them up so that they are linked together.
Related: I would like to know how to keep the settings that are auto-generated when I click Play while also keeping certain changes I’ve made. For example, how do I keep the changes I made to the DefaultPawn’s max speed during a play session without having them reset when I stop Play mode and start another session?
Possibly relevant: I am working on an arch-viz project where the player walks around a floor with a few animations and triggered events.
That was very helpful! But it resulted in more problems. I am essentially trying to recreate a first-person game without a visible mesh, starting from a blank template. I was able to create a Character Blueprint with a camera that I can control. However, when I enter Play mode I can use my mouse to orbit the camera around, but I cannot get the character/camera to move when pressing the keys (I’ve defined them in Edit > Project Settings > Input and set up a script following the first-person template). What happens instead is that about three seconds after pressing Play, the character seems to slide off my ground plane and begins to bounce within my ground floor (think cube floor: it looks like it bounces inside it, going up and down) in what seems like some kind of glitch. I’m not sure how else to approach the settings and Blueprint to get the character to move successfully without “sliding off”.
I don’t know why I typed all of that out.
I’m assuming you’re using the default mesh that is created when you make a new Character Blueprint. That thing is more useless than employees at RadioShack. What you need to do is add the camera as a child of the capsule component, then add a second skeletal mesh component as a child of the camera. In the Event Graph for that player, add an InputAxis MoveForward event (or whatever you called yours). Its white execution pin should be connected to an AddMovementInput function node, and the Axis Value from MoveForward should be connected to the Scale Value. Then a Get Actor Forward Vector node should be connected to the World Direction pin of the AddMovementInput node.
Now, assuming you are using the W key to move forward, make sure that in your project settings under Input, in Axis Mappings, your MoveForward axis mapping has the Key set to W and the Scale set to 1.
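Those mappings are persisted in your project’s Config/DefaultInput.ini, so you can double-check them there as well. A sketch, assuming an axis named MoveForward bound to W and S (adjust names and keys to match your own project):

```ini
; Config/DefaultInput.ini -- mirrors Edit > Project Settings > Input.
[/Script/Engine.InputSettings]
+AxisMappings=(AxisName="MoveForward",Key=W,Scale=1.0)
+AxisMappings=(AxisName="MoveForward",Key=S,Scale=-1.0)
```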
Ahh, so helpful! Okay, so I added the skeletal mesh as a child of the camera and looked over all my character settings. I had changed some settings within the physics of the Character Blueprint, which I think were affecting how the character behaved. I believe it was a combination of those changes or all of them, but things seem to be working now! The only issue now is the Oculus, which doesn’t seem to have any regard for my collision meshes and just crashes through walls. Is this because I am testing it as a standalone game in the editor? (VR Preview seemed to register movement from the Oculus, but the Oculus received no video feed, so I went into the standalone game and Alt+Entered my way into stereo mode.)
Also, to package for use on the Oculus, should I be packaging it for Windows 32/64-bit (.exe) and then Alt+Entering into stereo as well? I’ve been trying to package and open one of the templates (as is, with no changes other than packaging settings) for use on my computer, but I keep getting an error…see image.
That is how I’ve always used it with Unreal. I know Unity automatically produces a DirectToRift.exe when you build; I’m sure there’s something like that for Unreal, but I’m unaware of it. As for the Oculus crashing through walls, can you go into more detail? I didn’t really understand what that means.
I was able to package it successfully into an .exe file. What I meant is that when using the Oculus, the application seems to shift the collision meshes I set up, so we end up partially walking into a wall until we hit the shifted collision mesh. This only happens when Alt+Enter is pressed for Oculus Rift mode. If the same .exe is used on Windows normally, the meshes are honored and in their correct locations. Any idea why?
I have not seen that one before, but I’m assuming the Oculus camera sits slightly more forward than the default. Try moving the camera back a tiny bit in your character’s viewport and that should fix the problem.
Yeaahh…that doesn’t quite work, because the collision meshes are shifted quite a bit to the left, and I want a packaged game that doesn’t shift in either stereo or mono mode (playable in both). Moving the camera to the left would mess up the desktop version…