Also, Visual Studio doesn’t understand Unreal-specific stuff. The error you’re getting is because your macro doesn’t end with a `;` (don’t add one); once you actually compile, it’ll be fine. IntelliSense just thinks it’s an issue when it isn’t.
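For context, here’s a minimal header sketch (the class and property names are hypothetical, not your code) showing where that no-semicolon macro usually lives:

```cpp
// MyActor.h -- hypothetical example class
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "MyActor.generated.h"

UCLASS()
class AMyActor : public AActor
{
	GENERATED_BODY()  // no trailing ';' -- IntelliSense may underline this, the real compiler won't

public:
	UPROPERTY(EditAnywhere)
	int32 SomeValue;  // Unreal's fixed-width type, see the replies below
};
```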
But Mosel3y/Robert are right: don’t use the standard int. You should use Unreal’s versions instead: int32, int16, int8, uint32, uint16, uint8. Those are the common ones that are set up to work with Unreal.
Definitely. You should try out Visual Assist from Whole Tomato Software.
The reason you should use the Unreal Engine version of int is cross-platform compatibility. When compiling for different targets, the width of primitive types like int and char (usually determined by the hardware/ABI) may change, which can cause unexpected behavior. For instance, int may be 16-bit on some platform you compile to, in which case an unsigned int would wrap around at 65,535 and a signed int at 32,767. There are plenty of cases where a value exceeds a few tens of thousands.
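To make that concrete, here’s a small sketch (assuming a normal UE module where CoreMinimal.h is available): Unreal’s aliases have guaranteed widths on every supported target, while plain int is only guaranteed by the C++ standard to be at least 16 bits.

```cpp
#include "CoreMinimal.h"

// These hold on every platform Unreal targets; plain 'int' carries no such guarantee.
static_assert(sizeof(int8)   == 1, "int8 is always 1 byte");
static_assert(sizeof(uint16) == 2, "uint16 is always 2 bytes");
static_assert(sizeof(int32)  == 4, "int32 is always 4 bytes");

// On a hypothetical target where int is 16-bit, this value would not fit:
// int Health = 70000;        // could truncate/wrap depending on the platform
int32 Health = 70000;          // always fine: range is roughly +/- 2.1 billion
```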
IntelliSense and compiler errors are different things.
Ignore IntelliSense for now, and concentrate on getting things compiling.
If the compiler (not IntelliSense) complains that “int” is unexpected, changing it to “int32” won’t help; instead, figure out why any type is unexpected at that point: an unclosed declaration above it, a missing semicolon, forgotten parentheses, and so on.
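As an illustration of that last point (names are made up), a missing semicolon after a class body is a classic way to make the compiler report the error on the next line’s type rather than where the mistake actually is:

```cpp
// Hypothetical snippet: the real bug is the missing ';' after the class body.
class FScoreTracker
{
	int32 Score = 0;
}	// <-- should be "};" -- without it, the compiler trips over whatever comes next

int32 GTotalScore = 0;	// the "unexpected type" error is reported here, but the fix is above
```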