This seems functionally complicated to me, and I'm trying to wrap my head around it (probably math-wise) to get this working. HELP!
Preface on UMG scale: I'm making a retro 1-bit game, and pixel-perfect scaling is crucial for the visual aesthetic. I use a render texture to capture the game world and relay it onto the HUD widget (this gives me more control over how it looks to the end user).
I have a function set up so the widget either scales to the viewport with pixel doubling or stays pixel perfect: there's 1x, 2x, and, at higher resolutions, 3x of the original 700x400 resolution.
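Roughly, the scale pick works like this (a simplified sketch, placeholder names, not my exact function):

```cpp
#include "CoreMinimal.h"

// Largest integer multiple of the 700x400 base resolution that still
// fits the current viewport, clamped to the 1x-3x range I use.
int32 ChoosePixelScale(int32 ViewportWidth, int32 ViewportHeight)
{
	const int32 Scale = FMath::Min(ViewportWidth / 700, ViewportHeight / 400);
	return FMath::Clamp(Scale, 1, 3);
}
```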
The problem is that you don't interact with the world through the render texture; clicks pass through to the game world behind the UMG widget. Therefore, I need the camera's FOV to match the render texture exactly, no matter how the UI and viewport are scaled.
I currently use vertical FOV so the widget can scale correctly for ultrawide resolutions.
So if I'm running the game at 1920x1080 with the UI at fullscreen, no FOV change is needed. But if I'm running the UI at the 1x multiplier, the FOV has to increase to compensate for the fact that the UI is much smaller.
So I guess the question is: is there a formula to calculate how much the game camera's FOV has to increase so the image behind the widget matches the UI render texture?
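My current thinking (assuming the widget is centered and everything is vertical FOV, as above) is that, since screen-space height scales with tan(FOV/2), the relationship should be something like cameraFOV = 2 * atan( tan(renderFOV / 2) / (widgetHeight / viewportHeight) ). For example, with a 90 render FOV and a 400px-tall widget on a 720px-tall viewport, that gives roughly 2 * atan(1 / 0.56) ≈ 122. But I haven't confirmed this is right.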
The following are examples of what I'm talking about and the problems I'm having at 1280x720.
The camera pawn takes a "FOV" variable, which lets me change the camera's FOV while maintaining the original desired FOV for the render texture overlay. Here it's 90 FOV, while others are 100 or even 110. So I need a function that takes those numbers and compensates when the UI scale changes.
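Here's a rough sketch of the kind of function I mean, based on the tan(FOV/2) relationship above; the names are placeholders, and I'm assuming vertical FOV throughout:

```cpp
#include "CoreMinimal.h"

// A minimal sketch, assuming a centered widget and vertical FOV everywhere.
// Converts the FOV used for the render texture capture into the FOV the
// player camera needs so the world behind the widget lines up with the overlay.
float ComputeCompensatedVerticalFOV(float RenderTextureFOVDeg,
                                    float WidgetHeightPx,
                                    float ViewportHeightPx)
{
	// Fraction of the viewport height the widget covers.
	// 1.0 = UI stretched to fullscreen, so no compensation.
	const float HeightFraction = WidgetHeightPx / ViewportHeightPx;

	// The widget band of the screen has to span exactly the render FOV:
	// tan(renderFOV/2) = HeightFraction * tan(cameraFOV/2)
	const float HalfRenderTan =
		FMath::Tan(FMath::DegreesToRadians(RenderTextureFOVDeg * 0.5f));

	const float HalfCameraRad = FMath::Atan(HalfRenderTan / HeightFraction);

	return FMath::RadiansToDegrees(HalfCameraRad) * 2.0f;
}
```

So ComputeCompensatedVerticalFOV(90.f, 400.f, 720.f) would give roughly 122, and passing the full viewport height returns the render FOV unchanged. One thing I'm not sure about: Unreal's camera FOV setting is horizontal by default, so I'm assuming the result would still need converting if the camera isn't constrained to maintain vertical FOV.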