Unrecognized type 'int'

What’s going on here? Also, why is there an error on GENERATED_USTRUCT_BODY() macro?


entire file:


// © 2015 Dirt Productions. All rights reserved.

#pragma once

#include "GameFramework/Info.h"
#include "GameInfo.generated.h"

UCLASS()
class DISTANTHOME_API AGameInfo : public AInfo
{
	GENERATED_BODY()
};


UENUM(BlueprintType)
enum class EWeaponSlot : uint8
{
	NoWeapon UMETA(DisplayName = "No Weapon"),
	PrimaryWeapon UMETA(DisplayName = "Primary Weapon"),
	SecondaryWeapon UMETA(DisplayName = "Secondary Weapon"),
	MeleeWeapon UMETA(DisplayName = "Melee Weapon"),
	Ability UMETA(DisplayName = "Ability")
};

USTRUCT(BlueprintType)
struct FWeaponAmmoData
{
	GENERATED_USTRUCT_BODY()

	UPROPERTY()
		int MaxLoadAmmo UMETA(DisplayName = "Max Load Ammo");

	UPROPERTY()
		int MaxCarryAmmo UMETA(DisplayName = "Max Carry Ammo");

	UPROPERTY()
		int LoadAmmo UMETA(DisplayName = "Load Ammo");

	UPROPERTY()
		int CarryAmmo UMETA(DisplayName = "Carry Ammo");

	UPROPERTY()
		int AmmoFailLine UMETA(DisplayName = "Ammo Fail Line");

	UPROPERTY()
		int AmmoToSpend UMETA(DisplayName = "Ammo To Spend");

	FWeaponAmmoData()
	{
		MaxLoadAmmo = 30;
		MaxCarryAmmo = 150;
		LoadAmmo = 30;
		CarryAmmo = 150;
		AmmoFailLine = 0;
		AmmoToSpend = 1;
	}
};

USTRUCT(BlueprintType)
struct FPlayerAmmoData
{
	GENERATED_USTRUCT_BODY()

	UPROPERTY()
		struct FWeaponAmmoData PrimaryWeapon;

	UPROPERTY()
		struct FWeaponAmmoData SecondaryWeapon;

	FPlayerAmmoData()
	{
		FWeaponAmmoData DefaultAmmo;
		DefaultAmmo.MaxLoadAmmo = 30;
		DefaultAmmo.MaxCarryAmmo = 150;
		DefaultAmmo.LoadAmmo = 30;
		DefaultAmmo.CarryAmmo = 150;
		DefaultAmmo.AmmoFailLine = 0;
		DefaultAmmo.AmmoToSpend = 1;

		PrimaryWeapon = DefaultAmmo;
		SecondaryWeapon = DefaultAmmo; // note: the original post had the undefined name "SecondaryAmmo" here
	}
};

You shouldn’t use the plain C++ int type; Unreal has its own integer types: Properties | Unreal Engine Documentation

Use int32 instead.
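For example, the ammo properties in the struct above would become something like this (a sketch of the fix, reusing the names from your file; note that UMETA is for enum entries, so property display names go in the UPROPERTY meta specifier instead):

```cpp
USTRUCT(BlueprintType)
struct FWeaponAmmoData
{
	GENERATED_USTRUCT_BODY()

	// int32 is Unreal's fixed-width 32-bit integer, which the
	// Unreal Header Tool recognizes in reflected properties.
	UPROPERTY(meta = (DisplayName = "Max Load Ammo"))
	int32 MaxLoadAmmo;

	UPROPERTY(meta = (DisplayName = "Max Carry Ammo"))
	int32 MaxCarryAmmo;

	// ... change the remaining int properties the same way ...
};
```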

Also, Visual Studio doesn’t understand Unreal-specific stuff. The squiggle you’re getting on GENERATED_USTRUCT_BODY() is IntelliSense complaining that the macro isn’t followed by a ; (but don’t add one — it isn’t needed). Once you actually compile, it’ll be fine; IntelliSense just flags it as an issue when it isn’t one.

But Mosel3y/Robert are right, don’t use the standard int. You should be using Unreal’s versions instead: int32, int16, int8, uint32, uint16, uint8. Those are the common ones that are set up to work with Unreal.

Reason #12038712 to get Visual Assist X!

Definitely. You should try out Visual Assist from Whole Tomato Software.

The reason you should use the Unreal Engine integer types is cross-platform compatibility. When compiling for different targets, the sizes of primitive types like int and char (which are left up to the platform) may change, which could cause unexpected behavior. For instance, int may be 16-bit on some platform you compile for, which would cause any unsigned int to wrap around after 65535, and a signed int after 32767. I can think of many cases where a value will exceed a few tens of thousands.

Intellisense and compiler errors are different things.
Ignore the Intellisense for now, and concentrate on getting things compiling.

If the compiler (not IntelliSense) complains that “int” is unexpected, changing it to “int32” won’t help; instead, figure out why any type would be unexpected at that point — an unclosed declaration, a missing semicolon, forgotten parentheses, etc.
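A sketch of how that class of error arises (a deliberately broken fragment, not from your file — it will not compile, which is the point):

```cpp
struct FSomeStruct
{
	int32 Value;
} // <-- missing semicolon after the struct definition

// The compiler reports the error *here*, complaining that the following
// type is unexpected, even though this line by itself is fine:
int32 SomethingElse;
```

The fix is the semicolon on the line above, not a change to the type that got blamed.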