Man, I was typing out one long post nobody’d read. I’ma just shorten it.
Instead of fighting over which GPU company has worse practices, why not consider the possibility that when a game comes out heavily optimized for one card family and not the other, the developers may have simply screwed up and not optimized properly? There's only so much driver optimization can do (and believe me, it can do A LOT; look at how nVidia's DirectX 11 driver optimizations eventually outperformed AMD's Mantle in Thief (2014)). But if driver updates don't fix the problem after a couple of months, especially for a higher-profile game, then let's look at the devs for neglecting one card family, no?