Hi all,
I’m just a noob when it comes to C++ programming, so I wanted to ask a “simple” question while bracing myself for future flames.
Given the strong OOP architecture of the engine, I was wondering how much performance we are losing on object creation/destruction/manipulation/inheritance, etc.
I gather that such complexity is not really suited to imperative languages, but thinking about the early days when engines were made in plain C got my curiosity spinning.
Thanks in advance.
I think the easiest way to answer this is “more than zero, less than a lot.” 
UE has its own mechanisms for getting around some of the more expensive areas of C++ (RTTI being one of the bigger ones, along with custom memory managers for runtime allocation), but there will always be some overhead from virtual tables, etc. However, that overhead is pretty small in the grand scheme of things (even across millions of lines of code), and organization/readability/reuse of code is, IMO, greatly increased. You are far more likely to be bitten by a bad algorithm or a poor choice of data structure than you are to have function-call overhead show up in any profiling you do.
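For a rough sense of what that vtable overhead actually amounts to, here’s a minimal toy sketch (my own example with invented names, not UE code): the dispatch is one indirection per call, which is usually dwarfed by whatever the function body does.

```cpp
#include <vector>

// Toy example (not engine code): a virtual call costs one vtable
// indirection per invocation.
struct Shape
{
    virtual ~Shape() = default;
    virtual float Area() const = 0; // dispatched through the vtable
};

struct Circle : Shape
{
    float Radius = 1.0f;
    float Area() const override { return 3.14159f * Radius * Radius; }
};

// One indirect call per element; the loop body and the memory access
// pattern matter far more than the dispatch itself.
float TotalArea(const std::vector<Shape*>& Shapes)
{
    float Sum = 0.0f;
    for (const Shape* S : Shapes)
    {
        Sum += S->Area();
    }
    return Sum;
}
```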
A former coworker of mine was given a very complex animation system to optimize, and after it was done he gave us all a presentation on how things went. Assuming the system must already be heavily optimized, he began with a bunch of micro-optimizations (making sure all the math was SIMD, the data was prefetched and in the local cache, and even writing ASM blocks for the heaviest algorithms). After a few weeks he had a 5-10% improvement in the system.

Towards the end of his task, he decided to take a look at the higher-level methods just to make sure there weren’t any other areas he could possibly optimize. He found a spot where, due to the data structure they were using, they were doing an O(n) or O(n²) iteration. With a simple change he got it down to O(log n) and got a huge performance increase (over 50% on its own), many times what all his micro-optimizations had bought together and for far less work. The moral of his presentation was “start high and work your way lower,” since nine times out of ten the problem behind a suspicious hotspot is something simple like a bad algorithm rather than low-level object overhead, etc.
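To illustrate the kind of change he made, here’s a hedged sketch with invented names (I don’t know what the actual animation code looked like); the point is just how small the source-level difference between an O(n) and an O(log n) lookup can be:

```cpp
#include <algorithm>
#include <string>
#include <vector>

// Hypothetical only: the bone-track structure and lookup functions are
// invented to show the shape of an O(n) -> O(log n) change, not real code.
struct BoneTrack
{
    std::string BoneName;
    float Weight = 0.0f;
};

// O(n) per query: scan the whole array every time a bone is looked up.
const BoneTrack* FindTrackLinear(const std::vector<BoneTrack>& Tracks,
                                 const std::string& Name)
{
    for (const BoneTrack& Track : Tracks)
    {
        if (Track.BoneName == Name)
        {
            return &Track;
        }
    }
    return nullptr;
}

// O(log n) per query: keep the array sorted by name once up front
// (e.g. with std::sort), then binary-search it.
const BoneTrack* FindTrackSorted(const std::vector<BoneTrack>& SortedTracks,
                                 const std::string& Name)
{
    auto It = std::lower_bound(
        SortedTracks.begin(), SortedTracks.end(), Name,
        [](const BoneTrack& Track, const std::string& N)
        {
            return Track.BoneName < N;
        });
    return (It != SortedTracks.end() && It->BoneName == Name) ? &*It : nullptr;
}
```

Per call the two versions look almost identical, which is exactly why this kind of thing hides from micro-level tuning; it only dominates once you’re doing thousands of lookups per frame, and that shows up when you profile from the top down.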