The human eye sees motion blur. It's just that it's borderline impossible to keep your gaze fixed relative to your head, so shaking your head fast is a terrible experiment (and not good for your brain either): the eye's focal point constantly anchors to static points. But if the human eye had no motion blur, you would be able to see the individual blades of a flying helicopter's rotor, or even of a common household fan, perfectly sharp at all times. Just look at any of the spinning fans inside your computer case. Or relax your wrist and swing your hand back and forth really fast. There! You are seeing motion blur.
Honestly, it surprises me that you, with 3220+ posts on this forum, miss such a basic fact about human vision that you don't realize you are seeing motion blur all day, every day.
Motion blur IS NOT bad, but it has gained a really bad reputation in games because the vast majority of game developers had little to no computer graphics background and treated motion blur as something under artistic control, with artistic licence. That's why I am shocked that even Unreal Engine, which pushes the boundaries of physical accuracy and photorealism, has repeated the same mistake.
The main idea is that if motion blur is done **right**, it's perceived as smoothness of image motion instead of blurriness. If it's done wrong, it's perceived as a blurry image and the player is inclined to turn it off. Motion blur done right is hard to spot, simply because it looks natural. But the point is that it's not about the absolute amount of blur; it's about the amount relative to the framerate.
Here's an example from my game. First, this is the WRONG default Target FPS value of 30 that Epic decided to use for some reason. At 120 FPS (the default in-editor framerate cap), you get 4x the realistic amount of motion blur:
Now here is the **CORRECT** motion blur Target FPS value of 0, which derives the motion blur amount from the actual framerate. At 120 FPS, even in motion this fast, the amount of motion blur relative to the frame time is so small that your eyes won't perceive it as blur, but as smoothness of motion, which is the entire point of motion blur:
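To make the "4x" figure concrete, here is a minimal sketch (my own illustration, not Unreal's actual implementation) of how the simulated shutter time behaves under the two Target FPS settings. The function name and parameters are hypothetical; the math is just amount divided by the framerate the blur is derived from:

```python
def shutter_time_ms(amount, actual_fps, target_fps=0):
    """Simulated exposure time in milliseconds (illustrative sketch).

    target_fps == 0 means "derive blur from the actual framerate" (the
    correct setting); a nonzero target_fps pins the blur to that framerate
    regardless of how fast the game really runs.
    """
    fps = actual_fps if target_fps == 0 else target_fps
    return amount / fps * 1000.0

# Wrong default: Target FPS pinned to 30 while actually running at 120 FPS.
wrong = shutter_time_ms(0.5, actual_fps=120, target_fps=30)
# Correct: Target FPS = 0, blur derived from the real 120 FPS frame time.
right = shutter_time_ms(0.5, actual_fps=120, target_fps=0)

print(wrong / right)  # 4.0 -- the 4x overblur described above
```

The pinned setting simulates a ~16.7 ms exposure even though each frame is only ~8.3 ms long, which is why it reads as smearing rather than smoothness.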
Motion blur is not bad. What is bad is usually the lack of experience of the engine developers who implement it (I don't mean Epic here, as they did it right aside from the wrong default value; I mean smaller developers with their own engines) and of the artists who use it. Motion blur is something that should be enabled and kept at the right values, not something to be tweaked for "the right look". And that's the core of the issue I'm talking about: if even the default values engine developers decide on are wrong, then no wonder motion blur has such a bad reputation, when the conditions are almost always set up so that almost no one uses it correctly.
In fact, here is one more experiment I would like to point out. Set Target FPS to 0, so the motion blur is correctly derived from framerate, and then use the t.MaxFPS console command to limit your game's framerate to, for example, 60, 40, 30 and then 20 FPS. At each of those framerates, enable and disable motion blur by toggling the amount between 0 and 0.5 (the default). You will see, especially at the lower framerates, that motion blur makes the motion of your game appear much smoother, even as low as 30 FPS, whereas with motion blur disabled the motion looks stuttery and choppy.
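For convenience, the whole experiment can be driven from the console. t.MaxFPS is a standard cvar; in recent UE versions there are also r.MotionBlur.Amount and r.MotionBlur.TargetFPS overrides, but if your build doesn't have them, just change the same values in your post-process volume instead:

```
t.MaxFPS 30
r.MotionBlur.TargetFPS 0
r.MotionBlur.Amount 0.5
r.MotionBlur.Amount 0
```

Run the first three, look at the motion, then run the last one and compare. Repeat with t.MaxFPS at 60, 40 and 20.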
Just keep in mind to always leave the motion blur amount at 0.5, which equals the commonly used 180° shutter angle (half a frame of exposure). Never use an amount of 1.0 unless you are going for that ugly, Argentinian soap opera look.
And if you really, really need artistic control over the MB amount, then always keep Target FPS at 0 to keep it framerate-relative, and use the MB amount value to tweak it, but stay in the range between 0 and 0.5; never cross 0.5. For example, an amount of 0.25 simulates a 90° camera shutter angle (a quarter-frame exposure).
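The mapping between the amount value, the cinematic shutter angle and the simulated exposure time is simple enough to write down. A sketch, assuming (as above) that the amount is the exposed fraction of the frame time; the function names are my own:

```python
def shutter_angle_deg(amount):
    # A full frame of exposure corresponds to a 360-degree shutter,
    # so the angle is simply the exposed fraction times 360.
    return amount * 360.0

def exposure_seconds(amount, fps):
    # Fraction of the frame time during which motion is smeared.
    return amount / fps

print(shutter_angle_deg(0.5))   # 180.0 -- the filmic default
print(shutter_angle_deg(0.25))  # 90.0  -- the artistic-control example
print(exposure_seconds(0.5, 24))  # ~1/48 s, the classic film shutter speed
```

This also shows why amount relative to framerate is what matters: 0.5 at 120 FPS is only a ~4 ms exposure, while the same 0.5 at 24 FPS is the classic ~21 ms film exposure.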