Unreal Engine defines the global types int64 and uint64, which are signed long long and unsigned long long respectively.
So aside from the perplexing comment in the API documentation (“32-bit signed”??), this causes problems when linking more intimately with OpenCV, since that library also defines int64 and uint64, but as int64_t and uint64_t. Depending on the platform, those are sometimes long long and sometimes long, but they are always 64 bits.
Is it possible for you to change your definitions to the stdint ones, or can you give me a strong enough argument that long long is better, so I can bring that to the OpenCV folks instead?
(My current solution is terrible and involves renaming the Unreal definition, as I didn’t want to risk producing hidden bugs due to things not being the same size.)