WHAT
Ray, Whiting, and Donaldson join us on the livestream to discuss the creation of the new Bullet Train VR experience. We’ll be digging into what it takes to develop an interactive and immersive VR project, so come join us!
Great, been looking forward to this.
I have a couple of questions if I may:
Other than teleportation, did you find any other locomotion solutions that could have been used while still mostly avoiding sim sickness?
What was the reason for removing the rendered full body? Do you find that people feel more immersed without a full body than with one when their hands are tracked, or was this for other reasons?
This looks visually better than the Showdown demo. Did you make more “smart tweaks” (blob shadows, shader “compression”, and so on) to make it look better without reducing the visual quality too much?
The reflections on the floor look good. Are they realtime, or is there a trick there as well?
A specific hypothetical question: let’s say you’re not using the Oculus controllers (very precise and realtime) but a mocap suit that streams realtime movement into the scene (with a bit of delay). Since you can’t have 100% accurate aiming because of the delay, what would be the best method to compensate for the inaccuracy of the aiming? The mocap suit is the one from Perception Neuron; you can find some info and videos here
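The best idea I’ve come up with so far is simple dead-reckoning: extrapolate the aim direction forward by the known delay using the latest angular velocity. A minimal sketch of what I mean in plain C++ (all names are mine, nothing here is engine API):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Rotate `v` around unit `axis` by `angle` radians (Rodrigues' formula).
Vec3 RotateAroundAxis(const Vec3& v, const Vec3& axis, float angle)
{
    float c = std::cos(angle), s = std::sin(angle);
    float dot = v.x * axis.x + v.y * axis.y + v.z * axis.z;
    Vec3 cross = { axis.y * v.z - axis.z * v.y,
                   axis.z * v.x - axis.x * v.z,
                   axis.x * v.y - axis.y * v.x };
    return { v.x * c + cross.x * s + axis.x * dot * (1.0f - c),
             v.y * c + cross.y * s + axis.y * dot * (1.0f - c),
             v.z * c + cross.z * s + axis.z * dot * (1.0f - c) };
}

// Predict the aim direction `latencySeconds` into the future, assuming the
// angular velocity (unit axis + radians/sec) stays roughly constant over
// the mocap stream's delay.
Vec3 PredictAim(const Vec3& currentAim, const Vec3& angularAxis,
                float angularSpeed, float latencySeconds)
{
    return RotateAroundAxis(currentAim, angularAxis,
                            angularSpeed * latencySeconds);
}
```

No idea if that would feel right with the Perception Neuron’s delay, though, which is why I’m asking.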
Would it be possible to create blob shadows based on the light angle so that they seem realtime? Could you also add multiple blob shadows from different light sources that fade away based on the light position?
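To clarify what I mean, here is a rough sketch in plain C++ (everything here is hypothetical, just illustrating the idea): project the blob along the light direction onto the ground plane, and fade it with distance so several lights can each contribute a blob.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

struct BlobShadow { Vec3 position; float opacity; };

// Cast a ray from the light through the character and intersect it with the
// ground plane z = groundZ; the blob is drawn at the hit point.
BlobShadow ProjectBlob(const Vec3& character, const Vec3& lightPos,
                       float groundZ, float lightRadius)
{
    Vec3 dir = { character.x - lightPos.x,
                 character.y - lightPos.y,
                 character.z - lightPos.z };

    // Solve character.z + t * dir.z == groundZ for t (assumes dir.z < 0,
    // i.e. the light sits above the character).
    float t = (groundZ - character.z) / dir.z;

    BlobShadow blob;
    blob.position = { character.x + t * dir.x,
                      character.y + t * dir.y,
                      groundZ };

    // Fade the blob out as the character moves away from the light, so
    // blobs from multiple light sources blend plausibly.
    float dist = std::sqrt(dir.x * dir.x + dir.y * dir.y + dir.z * dir.z);
    blob.opacity = std::fmax(0.0f, 1.0f - dist / lightRadius);
    return blob;
}
```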
Security camera displays, portals, and fake reflections can be done with a Scene Capture 2D. How would you recommend making this technique work in VR without looking flat? Any alternatives? Thanks!
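For reference, the flat-screen version of the trick I’m asking about is mirroring the camera across the reflection plane each frame; my guess is that in VR you would have to do this once per eye to avoid the flatness. A minimal sketch of the mirroring math in plain C++ (names are mine, not engine API):

```cpp
struct Vec3 { float x, y, z; };

// Reflect point `p` across the plane with unit normal `n` passing through
// `planePoint`:  p' = p - 2 * ((p - planePoint) . n) * n.
// Rendering the scene capture from the reflected eye position (per eye)
// should give the reflection correct parallax instead of looking flat.
Vec3 ReflectAcrossPlane(const Vec3& p, const Vec3& planePoint, const Vec3& n)
{
    float d = (p.x - planePoint.x) * n.x
            + (p.y - planePoint.y) * n.y
            + (p.z - planePoint.z) * n.z;
    return { p.x - 2.0f * d * n.x,
             p.y - 2.0f * d * n.y,
             p.z - 2.0f * d * n.z };
}
```

The cost of two extra scene captures per frame is what worries me, hence the question about alternatives.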
Someone raised a question about the jitter of the motion controllers. I think Whiting mentioned a Kalman filter, but what’s the name of the other filter he mentioned? And I don’t quite understand Donaldson’s solution. I’d appreciate it if somebody could give me a hint.
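For anyone else digging into this: I don’t know if it’s what was meant on stream, but the simplest jitter filter I’m aware of is an exponential low-pass, i.e. lerping the filtered value toward the raw tracked value each frame. A sketch in plain C++:

```cpp
struct Vec3 { float x, y, z; };

// Blend the previous filtered position toward the raw tracked position.
// `alpha` in (0, 1]: smaller values smooth more jitter but add more lag;
// alpha = 1 means no filtering at all.
Vec3 LowPassFilter(const Vec3& previous, const Vec3& raw, float alpha)
{
    return { previous.x + alpha * (raw.x - previous.x),
             previous.y + alpha * (raw.y - previous.y),
             previous.z + alpha * (raw.z - previous.z) };
}
```

The tradeoff between smoothness and lag is why fancier filters like the Kalman filter come up, as far as I understand it.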