This is probably a fairly rare issue: when you have an external graphics card attached to a Mac (most likely a Mac mini, since those ship with really weak built-in GPUs), the packaged game ignores it and keeps using the built-in card even if you tell it to use the external one. Is there a way, from Blueprint or the project settings, to check whether an external card is attached to the Mac and, if so, use that instead? In my case the game runs at 10 fps instead of the 60 fps it reaches in any other app that doesn't ignore the Finder request to prefer the stronger graphics card.
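For reference, a native macOS app can detect an attached eGPU through Metal's device enumeration, roughly like the Swift sketch below. This is outside of Unreal and not something Blueprint exposes as far as I know, it's just to show the kind of check I'm hoping the engine could do:

```swift
import Metal

// Enumerate every Metal device on the system and prefer a removable one (eGPUs
// report isRemovable == true). Falls back to the system default device otherwise.
// This is a native macOS sketch; wiring it into the engine is left open.
let devices = MTLCopyAllDevices()
let external = devices.first { $0.isRemovable }
let chosen = external ?? MTLCreateSystemDefaultDevice()

if let gpu = chosen {
    print("Selected GPU: \(gpu.name), removable: \(gpu.isRemovable), lowPower: \(gpu.isLowPower)")
}
```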
This would be very interesting to know, but I have never heard of the possibility of using an external card like a Blackmagic eGPU. Our experience here is that the graphics card inside the Mac mini even overheats with Unreal 4, so we switched to a Mac Pro and an iMac.
Yes, it sure is absolutely useless with Unreal on its own, but if someone does have an external card attached, then ideally the built-in graphics card would no longer matter for this issue. It's understandable in a way if someone wants a Mac that performs well without being ridiculously overpriced.