Download

The Making of Bullet Train - Oct. 8 - Live at Epic HQ

WHAT
Ray Davis, Nick Whiting and Nick Donaldson join Chance and Alexander on the livestream to discuss the creation of the new Bullet Train VR experience. We’ll be digging into what it takes to develop an interactive and immersive VR project, so come join us!

WHEN
Thursday, Oct. 8 @ 2:00PM ET

WHERE
www.twitch.tv/unrealengine

WHO
Chance Ivey - Community Manager - [@iveytron](https://twitter.com/iveytron)
Nick Donaldson - Sr. Designer
Alexander Paschall - Engine Support Tech - [@UnrealAlexander](http://twitter.com/UnrealAlexander)
Nick Whiting - Lead Programmer

Questions about Bullet Train? Let’s hear ’em!

Edit: The YouTube archive is now available [here](https://www.youtube.com/watch?v=lJ1RwTnRvf8)

Awesome, looking forward to it.

I have a question. When do we get to play with it? :inf:

Megajam…hype hype!!!

what color is ur mum

Great, been looking forward to this.
I have a couple of questions if I may:

  • Other than teleportation, did you find any other locomotion solutions that could have been used that also mostly avoided sim sickness?
  • What was the reason for removing the rendered full body? Are you finding people feel more immersed without a full body than with when hands are tracked, or was this for other reasons?
  • The closing sequence shaders… how?

Thanks :slight_smile:

Had a lot of fun playing through this at OC2.

Questions:

  1. What method did you use to achieve the time dilation effect in the level without impacting the player’s own actions (such as shooting and throwing)?

  2. What aiming algorithm did you use to have the thrown bullets hit a target? I can’t believe that I was really that good at throwing them. :slight_smile:

  3. What was the target hardware for the demo? Does Bullet Train run on Oculus’ recommended minimum specs?

  4. Any UE4 engine changes made to get Bullet Train running, or is it stock 4.9.2?

  5. Any challenges in working with the Oculus Touch?

Thanks!

  • Dave
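To make question 1 concrete: one common approach (just the general idea, not claiming this is what Epic did) is to slow the whole world with global time dilation while counter-scaling the player’s own actors, since in UE4 an actor’s effective rate is the product of the global dilation and its per-actor `CustomTimeDilation`. The math reduces to:

```python
def effective_rate(global_dilation, custom_dilation):
    # An actor's simulated time advances at (global * custom) per real second.
    return global_dilation * custom_dilation

def counter_dilation(global_dilation):
    # Per-actor dilation that keeps the player's pawn and projectiles
    # running in real time while the rest of the world is slowed.
    return 1.0 / global_dilation
```

So at quarter-speed bullet time (`global_dilation = 0.25`), giving the player’s actors a custom dilation of 4.0 keeps their shooting and throwing at full speed.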

Questions:

  • This looks visually better than the Showdown demo. Did you make more “smart tweaks” (blob shadows, shader “compression”, and so on) to keep it performant without reducing the visual quality too much?

  • The reflections on the floor look good. Are they real-time, or is there a trick there as well?

  • Hypothetical specific question: let’s say you’re not using the Oculus “controllers” (very precise and real-time) but a mocap suit that streams movement into the scene (with a bit of delay). Since you can’t have 100% accurate aiming because of the delay, what would be the best method to compensate for the inaccuracy? The mocap suit is the one from Perception Neuron; you can find some info and videos here

  • Would it be possible to create blob shadows based on the light “angle” so that they appear real-time? And could you add multiple blob shadows from different light sources that fade away based on the light position?

Cheers
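On the blob-shadow bullet above: the projection itself is cheap vector math, so angle-aware blobs are certainly possible. A minimal sketch (my own illustration with made-up names and an arbitrary fade constant, not anything from the demo): cast a ray from the caster along the light direction to the ground plane, and fade the blob the farther it lands from the caster; with several lights you would spawn one blob per light.

```python
def blob_shadow(pos, light_dir, ground_z=0.0):
    """Project pos along light_dir onto the plane z == ground_z.

    pos, light_dir: (x, y, z) tuples; light_dir must point downward (dz < 0).
    Returns the blob position and an opacity that fades with the distance
    the shadow travels from its caster.
    """
    x, y, z = pos
    dx, dy, dz = light_dir
    t = (ground_z - z) / dz                      # ray parameter at the plane
    shadow = (x + t * dx, y + t * dy, ground_z)
    travel = t * (dx * dx + dy * dy + dz * dz) ** 0.5
    opacity = max(0.0, 1.0 - 0.1 * travel)       # arbitrary fade constant
    return shadow, opacity
```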

Can I play it on my trip there?

Question:

Security camera displays, portals, and fake reflections can be done with a Scene Capture 2D. How would you recommend making this technique work in VR without looking flat? Any alternatives? Thanks!
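One reason a Scene Capture 2D looks flat in VR is that both eyes see the same mono render. A fix I’ve seen discussed (a sketch of the idea only, with hypothetical names and translation-only portals; a full version would also apply the portals’ relative rotation) is to run two captures, one per eye, each placed at the HMD eye position mapped through the portal:

```python
def eye_positions(head, right, ipd=0.064):
    """Left/right eye positions from the head (HMD) position.

    head, right: (x, y, z) tuples; right is a unit vector toward the
    user's right; ipd is the interpupillary distance in meters.
    """
    half = tuple(0.5 * ipd * r for r in right)
    left = tuple(h - o for h, o in zip(head, half))
    rite = tuple(h + o for h, o in zip(head, half))
    return left, rite

def through_portal(point, src_origin, dst_origin):
    """Map a point from the source portal's frame to the destination's.

    Simplified to a pure translation between the two portal origins.
    """
    return tuple(p - s + d for p, s, d in zip(point, src_origin, dst_origin))
```

Each capture then renders the destination side from its own transformed eye position, restoring the parallax between the eyes.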

+1 for that, I can think of plenty of occasions to use such a transition :slight_smile:

How many real-time lights do you have in the scene? Are you using the Lightmass character indirect light volume technique like in Showdown?

Have you used monoscopic rendering for the background elements in this game, or looked into it?

What about motion sickness? This is a very serious issue :S

In the Q&A session of this video, some guy raised a question about the jitter of the motion controllers. I think Whiting mentioned a Kalman filter, but what’s the name of the other filter he mentioned? I also don’t quite understand Donaldson’s solution; I’d appreciate it if somebody could give me a hint.
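For anyone else digging into this: I can’t say which second filter was named, but a scalar Kalman filter of the kind Whiting mentioned is small enough to sketch (my own toy version for a single axis, not the code from the stream). It trades a little latency for jitter rejection, which is why tuning it for motion controllers is tricky:

```python
class ScalarKalman:
    """Minimal 1-D Kalman filter for a slowly varying signal.

    q: process noise (how fast the true value may drift)
    r: measurement noise (how jittery the sensor is)
    """

    def __init__(self, q=0.01, r=1.0, x0=0.0, p0=1.0):
        self.q, self.r = q, r
        self.x, self.p = x0, p0   # state estimate and its variance

    def update(self, z):
        self.p += self.q                  # predict: uncertainty grows
        k = self.p / (self.p + self.r)    # Kalman gain
        self.x += k * (z - self.x)        # correct toward the measurement
        self.p *= 1.0 - k                 # uncertainty shrinks
        return self.x
```

Raising `q` (or lowering `r`) makes the output follow the raw controller pose more tightly, at the cost of letting more jitter through.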