I’m dividing by 38 because Unreal’s units are equivalent to 1 cm, and 1 cm is roughly 38 pixels. So I’m dividing the viewport size by 38 to convert it to units so I can set the scale. When I do that, the actor and the mesh fill the screen, but even when I divide by 100 or a higher number, it keeps filling the whole screen.
What I want to do is scale the actor and leave some margin between the viewport edge and the actor. Imagine it’s a plane or a cube: I want it to fill the screen minus 10% of the viewport size on each side.
How could I do that? Is my approach sound? What better solution would you recommend?
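In code, my approach looks roughly like this (a simplified sketch, not my exact code; the 38 px-per-cm constant and FieldMesh come from my setup):

// runs inside the actor, e.g. in BeginPlay()
FVector2D ViewportSize;
GEngine->GameViewport->GetViewportSize(ViewportSize);

const float PixelsPerCm = 38.f; // assumption: ~38 px per cm, 1 UE unit = 1 cm
const FVector NewScale(ViewportSize.X / PixelsPerCm, ViewportSize.Y / PixelsPerCm, 1.f);
SetActorScale3D(NewScale);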
First of all, the call to SetWorldScale3D after SetActorScale3D should be redundant, assuming your FieldMesh is a subcomponent of your Actor’s RootComponent, or is itself the RootComponent.
Furthermore, the scale value should be a multiple of the standard value. Imagine this:
Your mesh is made by a designer and imported into UE4 at 50 cm × 70 cm × 30 cm (X × Y × Z). This will be its standard size when placed into the world → Scale = 1.
So if you call SetActorScale3D(1.5, 2, 0.5) on an Actor with this mesh component, your mesh will be 75 cm × 140 cm × 15 cm (50 × 1.5, 70 × 2, 30 × 0.5).
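Written out in code (note that SetActorScale3D actually takes an FVector; MyActor is just a placeholder name):

MyActor->SetActorScale3D(FVector(1.5f, 2.f, 0.5f));
// resulting size: 50 * 1.5 = 75 cm, 70 * 2 = 140 cm, 30 * 0.5 = 15 cm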
I would suggest taking any reasonable screen size as your standard value.
For example, let FVector2D(640.f, 480.f) be your standard value.
Now just divide the components (.X, .Y) of your actual screen size by the components of your standard value and you will get appropriate scale values.
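A minimal sketch of that, assuming it runs inside your actor (FieldMesh is from your code; the 640×480 standard value is just the example above):

const FVector2D StandardValue(640.f, 480.f); // the size the mesh looks right at

FVector2D ViewportSize;
GEngine->GameViewport->GetViewportSize(ViewportSize);

// scale = actual screen size relative to the standard size
const FVector NewScale(ViewportSize.X / StandardValue.X, ViewportSize.Y / StandardValue.Y, 1.f);
FieldMesh->SetRelativeScale3D(NewScale);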
Please let me know if this works, or if anything I just wrote is total nonsense.
After that, to resize the actor I’m trying to get the actor bounds (the real size in units) and take that as my standard value, so I have the following:
So far so good: the actor scales to match the viewport. But when I try to set the scale to the viewport’s size minus, for example, 20%, it still takes up the whole screen:
// intended: 80% of the full-viewport scale, i.e. a 10% margin on each side
this->FieldMesh->SetRelativeScale3D(NewScale * 0.8f);
The line above doesn’t work. For it to leave space between the viewport and the mesh I have to multiply by something like 0.4, and that doesn’t seem to make sense to me.
Where do you implement this code? I would guess in BeginPlay()? If you instead implement it in some tick function (to resize the actor at runtime), then GetActorBounds would be fatal, because its result changes every time SetRelativeScale3D is called.
Another thing I can see is that you divide the width and height directly by your FieldSize.X and .Y. That cannot work out, and here is why:
Every vector is only meaningful with respect to some basis (and origin) that spans its vector space. The basis for FieldSize is the world origin of your map together with the orthogonal unit vectors X, Y and Z (length: one UE unit = 1 cm). The basis for ViewportSize, on the other hand, is the upper-left (I think) corner of the viewport with orthogonal unit vectors of length one pixel (= ?? cm).
Because their bases are not the same, you can’t simply divide one by the other and get a meaningful result.
I would recommend adding two new UPROPERTYs to your actor, StandardWidth and StandardHeight (both measured in screen space), and using them as your standard values. With these properties you can play around in the editor until you get the size you want on your viewport.
Then, if you scale your viewport, everything should just scale correctly.
Note that these standard values will not represent the actual size on your viewport at all.
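A sketch of the two properties (header side; names and defaults are just suggestions):

// editable in the editor: tweak until the actor looks right at this screen size
UPROPERTY(EditAnywhere, Category = "Scaling")
float StandardWidth = 640.f;

UPROPERTY(EditAnywhere, Category = "Scaling")
float StandardHeight = 480.f;

Then divide the actual viewport size by these instead of a fixed standard value, exactly as in the sketch further up.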
You can also look at ProjectWorldLocationToScreen (a function on APlayerController). With this function and GetActorBounds you could get the real pixel size of your actor, which could then be rescaled so that it always occupies some proportion of your viewport size.
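A rough, untested sketch of that idea (FieldActor is a placeholder; projecting only two opposite corners of the bounding box approximates the on-screen footprint, which is only exact for an axis-aligned, camera-facing box):

// project two opposite corners of the actor's world-space bounding box
FVector Origin, BoxExtent;
FieldActor->GetActorBounds(/*bOnlyCollidingComponents=*/false, Origin, BoxExtent);

APlayerController* PC = GetWorld()->GetFirstPlayerController();
FVector2D MinPt, MaxPt;
if (PC && PC->ProjectWorldLocationToScreen(Origin - BoxExtent, MinPt)
       && PC->ProjectWorldLocationToScreen(Origin + BoxExtent, MaxPt))
{
    const float PixelWidth  = FMath::Abs(MaxPt.X - MinPt.X);
    const float PixelHeight = FMath::Abs(MaxPt.Y - MinPt.Y);
    // compare these to e.g. 0.8 * ViewportSize and multiply the actor's current
    // scale by the ratio, so it converges on the desired screen proportion
}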