Hello UE4 devs,
I have an Oculus project that works, and now I would like to show it to someone who only has a Vive. Is there an easy way to convert, or do I have to repopulate my scenes with the Vive controller?
Awaiting your knowledge,
No one has an idea? Might Revive do this for me?
Hello,
I made a training tool for a module change on a machine with the Oculus and the Touch controllers (using the thumbstick, trigger, and grip buttons for interaction). I tested the resulting .exe on the Vive without changing anything and it worked the same way without problems (including the buttons). The only difference was the view height: on the Vive it was about 20 cm lower.
Thomas
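The view-height difference mentioned above usually comes down to the tracking origin: SteamVR reports positions relative to the floor, while the Rift can default to an eye-level origin. Here is a minimal sketch (assuming a pawn class named AMyVRPawn, which is just a placeholder name) that forces a floor-level origin so both headsets measure height the same way:

```cpp
#include "HeadMountedDisplayFunctionLibrary.h"

// Sketch only: AMyVRPawn is a hypothetical VR pawn in your project.
void AMyVRPawn::BeginPlay()
{
    Super::BeginPlay();

    // The Vive defaults to a floor-level origin while the Rift can default to
    // eye level, which shows up as an offset in view height between headsets.
    UHeadMountedDisplayFunctionLibrary::SetTrackingOrigin(EHMDTrackingOrigin::Floor);
}
```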
Yes, you can update your main spawning character and adjust the settings for the Vive. There are a few templates that already have two versions, one for the Vive and one for the Oculus; you can use something from there.
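If you do want per-headset tweaks on the spawned pawn, one possible approach is to branch on the name of the active HMD plugin. This is only a sketch: the device-name strings below are assumptions and vary by engine version, so verify them with a log print first:

```cpp
#include "HeadMountedDisplayFunctionLibrary.h"

// Sketch only: called from a hypothetical pawn after BeginPlay.
void AMyVRPawn::ApplyHeadsetSpecificSettings()
{
    const FName DeviceName = UHeadMountedDisplayFunctionLibrary::GetHMDDeviceName();

    if (DeviceName == FName(TEXT("SteamVR")))
    {
        // e.g. apply camera/controller offsets tuned for the Vive.
    }
    else if (DeviceName == FName(TEXT("OculusHMD")))
    {
        // e.g. keep the offsets the project was originally built with.
    }
}
```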
Tomorrow I will have some time to test this.
I hope it will be as easy as you are saying.
Thank you for the information.
Yes, switch to the Vive. It has better performance and is a great match with Unreal Engine; the Oculus style is more for simple games from Unity.
You don't need to change anything, but there are 3 things to watch:
I don't seem to get it working. I have a working build for the Oculus Rift on the Vive PC, but I do not get the game running on the Vive.
Can I get a link to some of those tools or templates you're talking about?
This is not true at all. There are advantages to the Vive, but if one of them has a performance advantage it is the Rift, due to some additional tricks in the driver that keep the experience smooth during occasional frame-rate drops.
And Unreal vs. Unity has nothing to do with it.
This is strange. It should at least start with no modifications.
I did not stray from the VR template in UE4.
Are you crazy? Oculus has huge problems with its tracking system; even with the 3 cameras they use, its tracking is still worse than the Vive's. Please do not push the Oculus as a better device than the Vive; that is not true at all.
I never said the Rift is better; I think we are misunderstanding each other. The Vive has better tracking performance for room scale, that is true. I was talking about rendering performance: the Rift performs better on the same hardware.
Still, writing “Oculus style more for simple games from Unity” is ignorant about both the Rift and Unity.
I have designed my VR game to work on both the Vive and the Rift. The only thing you have to do to switch between hardware systems is physically unplug one and plug the other in. No code recompiles or reconfigurations are necessary.
The key is to abstract away the VR hardware via a hardware abstraction layer (HAL). UE4 already does this to a large degree, but there are still going to be some hardware differences (such as the play space origin, HMD height, etc.). The best practice is to create your own HAL. Adding support for additional hardware then becomes very straightforward: just add support for it in the HAL and expose access to it through an API. The design principle should be, “use the feature if it’s available, but otherwise fail gracefully.” So, if your user is using the Oculus Touch, Vive Knuckles, or Leap Motion, your game shouldn’t really care where it gets input from; it should just say, “set the thumb bone rotation to this value on whatever character we’re controlling.” Likewise with haptics support, motion controller support, and HMD support. Adding support for the PSVR, Rift, Vive, or whatever else should all be done in the HAL, and the API would remain unchanged.
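To make that concrete, here is a minimal sketch of what such a HAL interface could look like. All names are made up for illustration, not an engine API; the point is only that game code talks to one interface and each device gets its own implementation behind it:

```cpp
#include "CoreMinimal.h"

// Hypothetical per-hand input interface: one concrete backend per device
// (Touch, Knuckles, Leap Motion, ...) implements it.
class IVRHandInput
{
public:
    virtual ~IVRHandInput() = default;

    // True if this backend can report per-finger data at all.
    virtual bool SupportsFingerTracking() const = 0;

    // "Use the feature if it's available, otherwise fail gracefully."
    virtual FRotator GetThumbBoneRotation() const { return FRotator::ZeroRotator; }

    virtual float GetTriggerAxis() const = 0;
    virtual void PlayHapticPulse(float Intensity, float Duration) {}
};

// Game code only ever sees the interface and never asks which device it is.
void UpdateHandPose(const IVRHandInput& Input, FRotator& OutThumbRotation)
{
    OutThumbRotation = Input.SupportsFingerTracking()
        ? Input.GetThumbBoneRotation()
        : FRotator::ZeroRotator;
}
```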
So there is no quick and dirty way?
Think about it like this: you are at a UN assembly. There are a ton of foreign speakers. You are listening to a speaker who is speaking in Chinese. You don’t understand Chinese, so you need a translator to translate what he’s saying from Chinese into English. You find a translator who speaks into an earpiece and can understand what the speaker is saying. However, the translator only understands English and Chinese. The next speaker comes up and begins speaking in Japanese. You can’t simply take the Chinese translator and tell him to translate Japanese into English; he doesn’t speak that language. Sure, Japanese and Chinese may seem similar to the outside observer, but they’re nothing alike. You need another translator.
The trick is to take all of your translators, put them into a box, give it audio inputs and outputs, and when a speaker of any language starts speaking, the right translator picks up the microphone and translates from whatever language is spoken into English. English is the common baseline you operate off of. This system is your hardware abstraction layer. Now, let’s say that a new speaker comes in and speaks in Swahili. None of your current translators understand Swahili. What do you need to do? Find a translator who can translate Swahili into English, your baseline language, and put them into the box. As long as the rest of your systems understand English and operate off of it, nothing else needs to be changed.
When you design your VR hardware HAL like this, you’re ready to support whatever hardware the VR market wants to throw at you. Oculus Rift? HTC Vive? PSVR? Oculus Touch? Leap Motion? Knuckles controllers? Haptic suits? Gloves? Full-body motion tracking? Kinect? You can support it all with minimal extra effort. Unreal Engine may try to support some of the hardware platforms out there, but all it is really doing is giving you access to the speaker and trying to be helpful; you still have to translate the data into something meaningful to you.
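Continuing the sketch above, supporting a new device then just means adding one more “translator” behind the interface; the game code that consumes it stays untouched. The SDK calls are left as placeholder comments since the real ones depend on the device:

```cpp
// Hypothetical backend for a new device, mapping its SDK onto the common interface.
class FKnucklesHandInput : public IVRHandInput
{
public:
    virtual bool SupportsFingerTracking() const override { return true; }

    virtual FRotator GetThumbBoneRotation() const override
    {
        // Placeholder: read the thumb curl/splay from the device SDK here and
        // convert it into the rotation the game expects.
        return FRotator::ZeroRotator;
    }

    virtual float GetTriggerAxis() const override { return 0.f; }
};
// Nothing in UpdateHandPose (or anywhere else in the game) has to change.
```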
It is rather quick no matter how you handle it. You can do it dirty, but there is little need to in this case. That your project doesn’t work at all is an anomaly.