Slayemin's Working Theory on VR Motion Sickness

I am able to experience motion sickness within VR. As a developer, this is good because it allows me to isolate and identify its causes. I have developed a working model of precisely what causes it and why, along with some tricks and techniques to reduce its effects. I'll share a few of the things I've learned so that we can all build better VR experiences for our players.

My hardware: Oculus Rift DK2, NVidia GTX 560, Intel i7 (8 cores)

Who gets motion sick?
After testing my VR game on over 100 people, I've concluded that some people are more susceptible to motion sickness than others. Some people experience it immediately and get nauseous, while others never experience any sickness at all. I think nausea susceptibility falls along a bell curve. I'd place myself in the center, such that 50% of people are more susceptible to motion sickness than me and 50% are less susceptible. My rule of thumb is that if I experience motion sickness, then something is wrong and the source of trouble needs to be identified and fixed.

**Where does motion sickness come from?**
My working theory is that it is due to a disconnect between what the eye sees and where the brain thinks it is. When the two don't match, we get sick. Our brains are inherently predictive machines; they're very good at predicting what comes next. When it comes to spatial movement in the real world, our eyes look at our surroundings and measure how fast things are moving relative to us. When we walk or run, our speeds are relatively constant, with very quick accelerations and decelerations. If we jump from a height, our acceleration is brief. Then we hit the ground and decelerate very quickly. If we land on our feet, our legs bend at the knees and cushion the deceleration. Our brain is able to handle that without getting motion sick.

When we move in the real world, our brain predicts where we will be fractions of a second before we're actually there physically. If it helps, think of your brain as being spatially ahead of your physical position by a few frames. You're constantly ghosting. As long as your physical position matches the predicted position, everything is merry and there is no sickness. In the real world, you can actually get motion sickness. When you are out at sea for a while and you haven't adjusted to it mentally, you can get seasick (particularly if you can't see the horizon). This is because your body is experiencing motion which your brain can't predict very well. When you ride in a car and it turns, accelerates, brakes, etc., and you aren't looking out the windows to orient yourself, your brain also can't predict where it will be next. The sense of motion is irregular and unpredictable. Your brain doesn't know where to ghost your future position.

When our brain's predicted spatial position doesn't match our visually reported position, our brain has to perform some sort of ugly 'linear interpolation' over time between its incorrect predicted position and the visually reported position. If you smash into an invisible brick wall, your brain will position the 'ghost' of your body inside the wall at the instant of impact. Then the eyes report a full stop and the brain has to 'rewind' its spatial position to match the visually reported position. This rewinding step is where people get motion sick. If it happens for a brief instant, it's not noticeable and the effect of nausea is next to non-existent. If it happens continuously, our brains get very spatially confused and we get motion sick.

This explains why acceleration and deceleration in VR cause motion sickness. It also explains why bad frame rates can cause it as well.

I'm sure most of you have played VR games which cause sickness by now. There are two which stick out in my mind particularly. The first is a roller coaster simulation built in UE4. It was awful for me. I got sick the first time through it, so I had to do it again to see if I could figure out exactly what caused me to get sick. Bad idea. I got sicker. The roller coaster has a lot of turns and accelerating and decelerating forces. Even though I can see the tracks and where I'm going next, this confuses my brain. It's barely noticeable, but my visually reported position is slightly off from my spatially predicted position, so my brain is constantly performing mini corrections so that the positions match. Any time a correction needs to be made, sickness accrues.
The second game was a mech game found on Oculus Share. The part which caused great sickness was when the mech used thrusters to 'jump' into the air. While in flight, you could change your flight path. My brain got very spatially confused and I could feel the inside of my head spinning around. That's not supposed to happen.

When it comes to frame rate, your eyes are constantly looking out into the virtual world and observing the scene. Then, your brain tries to validate the observed scene against its predicted spatial positioning. As we know, if the two don't match, we get sick. When frame rate drops, the time between frames increases. This frame gap has the potential to create spatial discontinuities (gaps), which our brain then has to fill in with mini adjustments. If we are not moving at all (in both orientation and position), then the drop in frame rate has no effect. As soon as we move, it becomes noticeable.
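To put rough numbers on that frame gap (a back-of-the-envelope sketch; the speed and frame rates are just illustrative):

```cpp
#include <cstdio>

// Rough illustration: the positional gap the brain must bridge between
// consecutive frames is simply speed * frame time.
int main() {
    const float walkSpeed = 1.4f; // recommended walking speed, m/s
    const float frameRates[] = { 90.0f, 75.0f, 45.0f, 30.0f };
    for (float fps : frameRates) {
        float frameGapSec = 1.0f / fps;
        float spatialGapCm = walkSpeed * frameGapSec * 100.0f;
        printf("%5.0f fps -> %5.1f ms between frames -> %.2f cm gap at walking speed\n",
               fps, frameGapSec * 1000.0f, spatialGapCm);
    }
    return 0;
}
```

At walking speed, the per-frame positional jump triples between 90 fps and 30 fps, which fits how quickly dropped frames become noticeable once you start moving.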

How do you design games around this?

  1. Generally, you want to avoid acceleration and deceleration. However, I would tentatively say that you can use acceleration and deceleration within a game if you are very careful with their use. Remember, the brain is constantly ghosting its position a few frames before it gets there. So long as the spatially predicted position exactly matches the visually reported position, you're fine. This means gradual accelerations at a constant rate (see the sketch at the end of this post). Don't strap player faces to a wildly flying rocket.

  2. Avoid falling and flying. Treat your virtual game world like a real-life world. Design your buildings and environments as if OSHA were going to come in and do a workplace safety inspection on them. Can players fall off of a ledge? Put a handrail on it! Can players climb up to a roof and fall off? What kinds of physical safety precautions would people take in real life to avoid injury and litigious lawyers? Put those into your game! In my game, we've designed a wizard tower which has a curving flight of stairs. In a normal first person game, we'd just jump right off the top stair all the way to the ground floor. In a VR game, this can cause sickness. So, we put railings on the stairs. You have to walk down the flight of curved stairs.

  3. Be on the lookout for drops in frame rate. Look at translucent objects and material complexity as the main culprits. It's better to have more polygons than more complex shaders / materials.

  4. Don't take camera control away from the player. If you must, do it only for a very brief period of time. In our wizard game, the only time we take away camera control is when the wizard gets killed by zombies. The wizard falls to the ground, so the camera must follow the wizard's head. This only lasts about 2 seconds. Afterwards, the wizard gradually loses consciousness and the screen fades to black.

Bonus:
5. VR UI - You may still use UMG / Slate for VR, but you have to be very limited in its usage. You're essentially blasting an image onto someone's retina, so be very careful. UMG is still a good way to apply general visual effects, such as fades, directional damage indicators, and maybe a crosshair. Anything else should be designed into the environment as an interactable object / environmental interface.
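Going back to point 1, here's a minimal sketch of what "gradual acceleration at a constant rate" looks like in code (the names and constants are illustrative, not engine API):

```cpp
#include <cmath>

// Accelerate toward a target speed at one constant, predictable rate,
// rather than an instantaneous or wildly varying one.
struct MoveState {
    float Speed = 0.0f; // current forward speed (m/s)
};

// Call once per frame with that frame's delta time.
void TickMovement(MoveState& State, bool bInputHeld, float DeltaTime) {
    const float MaxSpeed  = 1.4f; // comfortable walking speed (m/s)
    const float AccelRate = 2.0f; // constant ramp rate (m/s^2), never varies

    const float Target = bInputHeld ? MaxSpeed : 0.0f;
    if (State.Speed < Target)
        State.Speed = std::fmin(State.Speed + AccelRate * DeltaTime, Target);
    else
        State.Speed = std::fmax(State.Speed - AccelRate * DeltaTime, Target);
}
```

The key property is that AccelRate never changes while it's active, so the brain gets a constant, predictable force it can calibrate against.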

Very good writeup. Do you mind if I share my own limited personal findings?

I find lighting is a very important part of causing/preventing motion sickness. In all of the experiences and tests we have worked on, we find that bright scenes with limited points of contrast are the biggest culprits for causing motion sickness; particularly a cold-sweating, deeply uncomfortable, malaise-type motion sickness. Bad stuff. Take, for example, a bright white room with white furniture and white fixtures. Without anything to balance out the contrast in the scene, we've found that this leads to sickness very quickly.

However, a dark scene with limited contrast doesn’t seem to have the same effect. There’s just something deeply uncomfortable about bright scenes.

Another area I feel contributes pretty significantly to motion sickness is audio design. Loud droning noises in particular seem to trigger something in me, and in other people I have tested with. Perhaps it's something to do with disturbing the inner ear, but the effect is a sense of uncomfortable claustrophobia and heightened susceptibility to motion sickness. It could be more psychological than physiological, but it definitely seems to play a part.

Just my 2 cents.

How do you use UMG in VR? :confused:

Interesting. So, a padded white room with a constant droning hum would cause people to feel sick?

I’d be interested in testing this to see for myself.

Very carefully. Each eye can see half of the Slate workspace, so when you place your UI elements, you have to double them up – one for each eye. The placement needs to be done very carefully. You have to find the eye-center X/Y values for each eye. When placing UI elements, the Y values need to be exactly the same, but the X values must be set per eye. There's an X value which puts the UI element at the 'neutral' position (centered on each eye's pupil). Keep in mind that since you're working stereoscopically, changing the X values from the neutral base value will cause the UI element to appear forward or backward on the depth axis. You also need to keep in mind that placing UI elements at the edges of the eye lens / screen will NOT work – this causes the UI element to be seen from the corner of your eye, and if you move your eye to look at it, it disappears.
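To illustrate how the X offset maps to depth, here's a back-of-the-envelope sketch of the stereo geometry (all constants are assumed; this is plain similar-triangles math, not the actual Slate placement code):

```cpp
#include <cstdio>

// For a UI element meant to appear at distance D in front of the viewer,
// each eye's copy shifts inward from its neutral (eye-center) X position.
// Similar triangles give: shift from neutral = (IPD / 2) * (s / D),
// where s is the virtual screen plane distance.
int main() {
    const float ipdMeters  = 0.064f;  // assumed inter-pupillary distance
    const float screenDist = 1.0f;    // assumed virtual screen distance (m)
    const float pixelsPerM = 1000.0f; // assumed screen-space scale
    const float depths[]   = { 1.0f, 2.0f, 4.0f, 100.0f };

    for (float d : depths) {
        float inwardShiftPx = (ipdMeters * 0.5f) * (screenDist / d) * pixelsPerM;
        printf("depth %6.1f m -> shift each eye's copy %5.1f px toward the nose\n",
               d, inwardShiftPx);
    }
    return 0;
}
```

Note that the neutral position corresponds to infinite depth; the closer you want the element to appear, the further each copy shifts inward toward the nose.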

The other thing to keep in mind is that any UI elements you place within slate will be rendered last, such that they overlay anything else. When you have a stereoscopic UI element, such as a targeting reticle which is floating about 3-4 feet away from your face, and you then look at a wall which is 1-2 feet away from your face, the reticle will appear to be floating within the wall but still be very visible. This will appear as ‘wrong’ within the scene, so be aware of this.

Like I said, you can use UMG & Slate to do UI within VR, but you should use it very sparingly. If you’re using buttons or displaying data, you’re probably doing it wrong. Treat the slate UI as if the player is wearing a bionic eye and you’re rendering stuff onto the surface of their eyeball. Also, keep in mind that people have blind spots at certain positions of their eye, so rendering a small UI element at this location will not appear visually unless the player is looking directly at it. You can find these blind spots yourself by getting a sheet of paper, drawing an X and a dot about 2-3 inches apart, covering one eye, focusing on the X, and moving the paper forward and backward until the dot disappears.

Anyways, best practice is to put UI information into the environment instead of onto a screen surface, but it's not a fixed rule. Just be aware of the limitations and drawbacks I mentioned.

You can also use UMG in VR by using the 3D UI widget (lets you render UMG UIs on an object, basically).

On topic, the only thing that makes me feel any sim sickness at all is low framerate, but even then it’s minor. I’ve tested everything I can think of, including spinning (with analog stick) while circle strafing and whipping my head around - no sickness.

Now that I think about it, the IPD being too low (my IPD is 68.4mm, way higher than the supported 64mm) caused more discomfort than anything else, but I fixed that with lens separators.

My preferred locomotion type is 1st person, head/body-based turning and analog stick for forward/backward motion only.

I think it's been covered here, but the bulk of my VR UI is built from a mix of 3D elements and UMG nested in widgets. I attach all of this with an extra spring arm (or more) connected to the camera. What I find to be really key is the distance from the camera and the SLIGHT lag that is used on this element. It is my opinion that just the right mixture of lag on this UI can really help make things more comfortable.
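A minimal UE4 C++ sketch of that kind of setup (assuming a pawn that already has a camera component; the component names and tuning values here are illustrative guesses, not exact project code):

```cpp
// Inside a pawn's constructor. Assumes UPROPERTY members
// USpringArmComponent* UIArm and UWidgetComponent* UIWidget are declared
// in the header, and Camera is the pawn's existing UCameraComponent.
UIArm = CreateDefaultSubobject<USpringArmComponent>(TEXT("UIArm"));
UIArm->SetupAttachment(Camera);      // ride along with the HMD camera
UIArm->TargetArmLength = -120.0f;    // negative length pushes the UI out in front
UIArm->bEnableCameraLag = true;      // the "slight lag" described above
UIArm->CameraLagSpeed = 10.0f;       // tune: higher = tighter follow, less lag

UIWidget = CreateDefaultSubobject<UWidgetComponent>(TEXT("UIWidget"))); // 3D UMG widget
UIWidget->SetupAttachment(UIArm);
```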

I also wanted to share some excitement I had last night while sharing my latest build with the wife. (Note: I've made her very sick in the past on several occasions. Her first brush with VR had her on the couch for 3 hours.) Usually I tell her to close her eyes, move her around to the right spot, or show her things with little to no acceleration. For some reason, while I was demoing an environment to her, she grabbed the gamepad and started moving herself around with mostly traditional FPS controls. After a minute or two she asked if there was a run button.
A complete 180 from most of our previous encounters. All of this stuff is in its infancy at the moment. Personally, I don't want to fall into all of the assumptions that people are making and declaring as ground rules for VR. Maybe for VR today, but I hope to keep experimenting in both success and failure.

This is a great write-up on simulation sickness, and a lot of what you stated would indeed help prevent it. If you want to know more about what can cause simulation sickness, Oculus has a very detailed write-up on it.

http://static.oculus.com/sdk-downloads/documents/Oculus_Best_Practices_Guide_0.5.0.pdf

It goes into much more detail about how fooling your brain with VR can cause simulation sickness and how to address issues like this. It is fairly heavy on the science side, but more than usable by someone who does not come from a scientific background. If you are serious about developing in VR, this is a must-read IMO.

Yeah, I’ve read that PDF and integrated a lot of its research into the game we’re building. I’m glad to see that a lot of my acquired VR knowledge is verified and validated by this documentation. The exciting part is that I think our two person team has also pushed the boundaries of knowledge a bit and innovated a tad.

I know a lot of this is new and we're all pioneering the tech. With that in mind, I have a small critique. The PDF only explains the correlated empirical evidence gathered during motion sickness: they say "X" was happening when "Y" experienced motion sickness. I think its explanation of the underlying cause of motion sickness is shallow and leaves a lot of room for discoveries to be made. We all know that acceleration causes motion sickness, and the documentation hypothesizes that it has to do with our vestibular systems. Above, I propose a slightly improved hypothesis which explains all of the existing phenomena, gives additional context on underlying causes, and progresses our understanding of motion sickness while offering up ways to carefully account for these uncomfortable experiences. It's a working hypothesis which seems to be fruitful for me, but I'd be happy to get consensus or hear an even better one :slight_smile:

I think maybe I can explain this in a slightly more mathematical way if that helps people understand what I’m trying to say:

Assume that the player's HMD is initially located at position [0,0,0] with an orientation looking straight down the X axis [1,0,0], with a velocity of [0,0,0] and an acceleration of [0,0,0], at Time 0.

In real time, at some time Time 0+dt, the human brain is trying to predict its future position based on its perceived velocity. When velocity is [0,0,0], this is trivial for our brain to predict. This delta time is not a delta time within our game world; it's a delta time in the real world. We'll call it 'brain delta time', or bDt, and our game delta time is simply 'dt'. bDt != dt. bDt is the smallest perceivable increment of time we can subconsciously detect; it varies by person and may not be constant over time (i.e., tiredness and alcohol may have an effect on it).

If we change our velocity to [1,0,0], our position will be Pos += velocity * dt; after 1 second, we're positioned at [1,0,0]. Our brains aren't exact measuring devices, so they may perceive the movement speed as an approximation of our actual speed (~0.95f -> ~1.05f). At the instant of movement, our brain begins calibrating our movement speed to a high degree of precision as fast as it can (evolutionarily, it helped us run away from lions).
[brainPos] += [brainVel] * bDt;
We know where we're going to be before we get there. There is a 'time to calibrate' for our brains: the small slice of time it takes our brain to precisely nail down our velocity (it varies by person, by age, and by intoxication level).

[brainPos] != Pos
The delta between [brainPos] and [Pos] becomes imperceptibly small and unnoticeable as the brain calibrates its sense of velocity against its actual velocity. I'm sure someone could conduct a scientific study to find some population-normalized epsilon value which causes motion sickness and discomfort in people, but that's beyond my scope and resources.

When we set our acceleration to [1,0,0], we begin to get into trouble. Our brains aren't so good at determining acceleration forces. The brain can sense velocity to a pretty high precision, but under acceleration that velocity is changing over time.
Velocity += Acceleration * dt;
Position += Velocity * dt;

It takes our brains longer to calibrate our sense of velocity when acceleration is involved, but they can eventually do it. The epsilon error between [brainPos] and [Pos] is much greater and becomes noticeable.
When [brainPos] != [Pos], our brains will unconsciously create a 'correction' vector. CorrectionVector = [Pos] - [brainPos];
Epsilon is the length of this correction vector. Our brains don't do discontinuous movement, so they will 'slide' our brain position along the correction vector over bDt (similar to lerping). The greater the magnitude of CorrectionVector, the more motion sick we feel. Each individual correction is brief, but if corrections are being made constantly, we accrue nausea and the sense of motion sickness gets worse and worse over time.

If you add in 'surge', a change in acceleration over time (what physics calls 'jerk'; think of applying jump jets to a flying mech), we are almost guaranteed to get sick.
Surge = [1,0,0]
Acceleration += Surge * dt;
Velocity += Acceleration * dt;
Position += Velocity * dt;

With surge applied, our brain's 'correction vector' is going all over the place. It'll look like a compass needle hovering over a wobbly magnet. Our brain will never be able to calibrate its velocity, so it can never predict where it will be at any point in time. This is just not something our brains have evolved to handle, so accounting for surge is beyond our predictive capabilities. We'd be constantly performing positional corrections, and that causes us to get sick very quickly.
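Here's a toy 1D simulation of the whole model (every constant, the calibration rule, and the 'sickness' accumulator are illustrative assumptions, not measured data):

```cpp
#include <cstdio>
#include <cmath>

// Toy 1D simulation of the hypothesis above: the brain predicts its own
// position from a calibrated velocity estimate, while the true position
// integrates velocity, acceleration, and surge. The correction vector is
// the gap the brain must "rewind" each brain-tick.
int main() {
    const float dt  = 1.0f / 90.0f;  // game delta time (90 fps)
    const float bDt = 0.05f;         // assumed 'brain delta time' (50 ms)
    const float calibRate = 0.5f;    // how quickly the brain re-calibrates velocity

    float pos = 0, vel = 0, accel = 0;
    const float surge = 1.0f;        // change in acceleration over time (m/s^3)

    float brainPos = 0, brainVel = 0;
    float accruedSickness = 0;       // sum of correction magnitudes (arbitrary units)

    float sinceBrainTick = 0;
    for (float t = 0; t < 3.0f; t += dt) {
        accel += surge * dt;         // surge -> acceleration -> velocity -> position
        vel   += accel * dt;
        pos   += vel * dt;

        sinceBrainTick += dt;
        if (sinceBrainTick >= bDt) {
            sinceBrainTick = 0;
            brainPos += brainVel * bDt;               // predict ahead (ghosting)
            float correction = pos - brainPos;        // CorrectionVector
            accruedSickness += fabsf(correction);     // bigger/more frequent = worse
            brainPos = pos;                           // 'rewind' to reported position
            brainVel += calibRate * (vel - brainVel); // gradual velocity re-calibration
        }
    }
    printf("accrued sickness after 3s of surge: %.3f\n", accruedSickness);
    return 0;
}
```

Setting surge to zero and moving at a fixed velocity makes the accumulated correction collapse to nearly nothing after the initial calibration, which is exactly the behavior the hypothesis predicts.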

So, bottom line: to reduce motion sickness, you need to let players predict their spatial position to a high degree of accuracy. Your aim should be correction vectors which have very small lengths and occur very infrequently. This needs to account for variation within the population, such that the most susceptible players don't get motion sick. This is why the recommended 'walk' speed is 1.4 m/s. The recommendation for instantaneous constant movement attempts to minimize the correction vector duration. However, you're not restricted from using acceleration – the caveat is that players MUST be able to predict their position at all times while accelerating. Let the player control their acceleration rate. Example:

I accidentally created an interesting demo. You are a severed head floating in the air. By moving your head forward, back, or side to side, you change your head's acceleration in the respective direction. When you pitch / roll your head, you change your orientation. Because the acceleration was controlled by your own head movements and because it was gradual, it was surprisingly comfortable and usable. I didn't get sick and neither did my coworker (though more testing may be required). This was a pleasant surprise which contradicts the widely accepted "acceleration => sick" rule.
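A minimal sketch of that control scheme as described (the structure and the gain constant are assumptions, not the demo's actual code):

```cpp
// The HMD's positional offset from a calibrated rest pose drives the
// acceleration, so the player's own vestibular motion always leads the
// virtual motion.
struct Vec3 { float x, y, z; };

void TickFloatingHead(const Vec3& hmdOffset,  // HMD position relative to rest pose (m)
                      Vec3& vel, Vec3& pos, float dt) {
    const float gain = 2.0f;                  // accel per meter of head lean (1/s^2)
    // Lean forward -> accelerate forward; the player predicts the motion
    // because they are physically producing it.
    vel.x += gain * hmdOffset.x * dt;
    vel.y += gain * hmdOffset.y * dt;
    vel.z += gain * hmdOffset.z * dt;
    pos.x += vel.x * dt;
    pos.y += vel.y * dt;
    pos.z += vel.z * dt;
}
```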

Thanks a lot for sharing this. I think your theory about brain prediction of spatial position aligns quite nicely with my own theories and experiments regarding simulation sickness, which I presented about a year ago on the Oculus forums:
https://forums.oculus.com/viewtopic.php?f=32&t=18422

It seems like one of the methods of movement that I implemented (Head-Based Translation) is very similar to the experiment that you describe, and I found the same encouraging results after testing it with several people.

A different way of thinking about the same problem, the way I explained it in my post, is that if we apply in VR the time derivative of the action performed in real life (RL), then the mismatch is no longer jarring to the brain, since there is a direct correlation between RL and VR accelerations (and thus the brain can easily predict the next position in VR).



```
RL    -->  VR

S     -->  S        Head/torso stopped in RL           / Avatar stopped in VR
V↑(A) -->  A↑(Ja)   Head starts moving (accelerating)  / Acceleration increases in VR
V     -->  A        Head moves at constant velocity    / Constant acceleration in VR
V↓(D) -->  A↓(Jd)   Head motion slows (decelerating)   / Acceleration decreases in VR
S     -->  V        Head stopped in forward position   / Constant velocity in VR
```
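Read as code, that mapping might look something like this minimal sketch (my own illustration; the gain and names are assumed): unlike the offset-based sketch above, here the RL head *velocity* drives the VR acceleration, so every RL phase maps to the VR phase one derivative up.

```cpp
// Real-life head velocity -> VR acceleration.
float VrVel = 0.0f, VrPos = 0.0f;

void TickHeadBasedTranslation(float RlHeadVel /* m/s, from tracking */, float Dt) {
    const float Gain = 1.5f;                 // RL velocity -> VR acceleration gain (1/s)
    const float VrAccel = Gain * RlHeadVel;  // constant head velocity -> constant VR accel
    VrVel += VrAccel * Dt;                   // head stopped forward -> VR velocity stays constant
    VrPos += VrVel * Dt;
}
```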

One key factor, however, is that I believe users should be able to rotate their heads completely independently of their avatar movement (for instance, you want to be able to move forward and look to your right at the same time, without affecting the forward motion).

This means that we have to search for ways to control avatar movement that directly affect our inner-ear accelerations in the same direction as our VR movement, but that are also independent of head rotation relative to the rest of the body. In my experiments I found that using the base of the neck to control translation and the hips to control rotation works really well, and it's surprisingly similar to RL snowboarding.

The obvious problem is that right now we don't have a good and reliable way to obtain this information for most users, and it has to be hacked together using an inverse head model, motion controllers, or mocap suits. The good news is that it's really just a matter of time until accurate tracking of our whole bodies becomes standard for consumer VR.

“My hardware: Oculus Rift DK2, NVidia GTX 560…”

Just stop there: you'll need a better HMD and GPU, period. With 90 fps as an absolute minimum, room-scale tracking, and very accurate positional controllers, you can avoid simulator/motion sickness. For example, after testing with hundreds of people using the HTC Vive, most will not experience simulator/motion sickness. Yes, developers should follow best practices (many listed in this thread) to reduce the possibility of sickness. But these "rules" should not be absolute; exceptions can be made depending on hardware.

I had the same problem, thanks for helping, guys! :D :cool: :rolleyes:

Is there an answer to why some people get motion sickness while others don’t? Is it because some can predict better?

Have you experienced any difference in motion sickness between regular movement and teleporting movement? It seems to me that most people experience motion sickness once they start moving around the scene. So I'm wondering if using teleporting movement would help to limit their motion sickness. I personally don't experience any motion sickness, so it's hard to test. I know it wouldn't be applicable for everything, but if teleporting can limit it, maybe that would be useful to use.

I'm one of the lucky few who are completely immune to simulation sickness. I've started to develop a survival horror game, and I'm planning on having traditional stick locomotion without a teleport mechanic (because that mechanic doesn't suit the game); the only concession I'm going to make for those who suffer from sim sickness is comfort/snap turning.

It means that fewer people will be able to play the game, but it hasn't done Dreadhalls, Onward, Pavlov, Doom 3 BFG, or Alien Isolation (back in the day when it was playable with a DK2!) any harm.

Different games are more suited to different methods of locomotion at the end of the day, as long as I can keep gameplay smooth without dropping frames I think I should be okay, at least with the people that don’t like a teleport mechanic.

Still at the early stages of development at the moment, just working on the models…one of which I tested last night and was quite happy that I’ve got the scale of them spot-on. :smiley:

Thought I'd pitch in with my own experiences, since this is a super relevant and very person-specific issue. Initially, when I started with VR just playing games, I got motion sick very easily. Now, after playing a lot of games and doing VR dev myself, I have built up a tolerance to motion sickness (sadly, because it means I no longer qualify as a good test subject). I hear lots of people talk about building their VR legs, but also that you can lose that built-up tolerance if you don't stay in VR regularly.

One observation I made recently while building a wheelchair simulator for a customer: initially, I made the rotation mechanic work just by pressing the trackpad button on the left or right controller to rotate the camera. This made me quite motion sick. But once I implemented a control mechanic where you grip with both controllers and move one hand forward and one hand backward, similar to how you'd operate a real wheelchair, I stopped getting motion sick. My theory is that my body was performing a move that resembled operating a real wheelchair closely enough that it was expecting a rotation. This supports what @Slayemin says about prediction, I suppose. More interesting still, my implementation had rotational acceleration; in other words, it was not just a rotate on/off movement, it closely mimicked the hand movement as rotation. So personally, even though I hear it a lot, I don't think acceleration is bad or should be avoided. It just needs to be done in an expected way.
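A rough sketch of that grip-and-pull mechanic (all names and constants are guesses on my part, not the actual implementation):

```cpp
// Opposing hand motion along the chair's forward axis maps to yaw rate,
// mimicking pushing one wheel rim forward while pulling the other back.
float TickWheelchairYaw(float leftHandForwardVel,   // m/s along chair forward
                        float rightHandForwardVel,  // m/s along chair forward
                        bool bBothGripsHeld) {
    if (!bBothGripsHeld) return 0.0f;
    const float radPerMeter = 1.2f;   // assumed rim-motion-to-yaw conversion
    // Right hand forward + left hand back -> turn left, and vice versa.
    float diff = rightHandForwardVel - leftHandForwardVel;
    return diff * radPerMeter;        // yaw rate (rad/s), applied by the caller
}
```

Because the rotation rate follows the hands continuously, the physical motion leads the virtual one, which is the prediction effect described above.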

If you do have acceleration but it's unexpected, I'd say that's bad and should be avoided. So, as I started out with, I believe these matters are highly person-specific, and really hard, close to impossible, to solve 100% for everybody. Making an experience available to more people should, I'd say, rely more on a slow ramp into that particular game's locomotion method than on doing things strictly in just one way (currently teleportation). Knowing that some people will get motion sick from your game's locomotion mechanic, you could ease them into gradually more dynamic movement; by that I mean not just over a 2-minute tutorial, but either as an option to tweak heavily, or through gameplay that gradually, over several hours, offers more and more dynamic ways of moving around, until players get the hang of it and get used to it.