How can we modify the internal rendering viewport, similar to glViewport and glScissor calls?

We are making a 2D pixel perfect game. Our base pixel perfect resolution is 640x360.

This allows us to have a pixel-perfect 1280x720 resolution (base res multiplied by 2)
and a pixel-perfect 1920x1080 resolution (base res multiplied by 3).

The problem is that we want to support virtually every resolution the user has, by using a rendering viewport that is a multiple of our base resolution (640x360), so it is always pixel perfect, with black bars as top, bottom, left, and right margins.
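Outside of any engine, the letterboxing we mean boils down to this (a minimal standalone C++ sketch; the names are illustrative, not Unreal API):

```cpp
#include <algorithm>

struct Letterbox { int Scale, X, Y, W, H; };

// Largest integer multiple of the 640x360 base resolution that fits
// the window, centered, with black bars filling the remainder.
Letterbox ComputeLetterbox(int WindowW, int WindowH) {
    const int BaseW = 640, BaseH = 360;
    const int Scale = std::max(1, std::min(WindowW / BaseW, WindowH / BaseH));
    const int W = BaseW * Scale, H = BaseH * Scale;
    return { Scale, (WindowW - W) / 2, (WindowH - H) / 2, W, H };
}
```

A 1600x900 window, for example, gets a 2x (1280x720) viewport centered with 160 px side bars and 90 px top/bottom bars. In our own engine, these numbers feed glViewport() and glScissor() directly.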

We have always done that in our own engine using a combination of glViewport() and glScissor() calls before drawing, but in Unreal we don't know where to start.

Can any experienced Unreal programmers enlighten us with an elegant solution?
What we have tried:

    • We tried the “keep aspect ratio” setting on the camera component, but it is not exactly what we need, because it allows rendering viewports that are not pixel perfect while maintaining the aspect ratio.

DOESN’T WORK.


// The native resolution of the pixel art
const FVector2D ReferenceResolution(640.0f, 360.0f);

// The pixels per unit (translation factor from pixels to world units, in our case 32px (a tile) = 100cm)
const float ReferencePixelsPerUnit = 0.32f;


void ASMPlayerCameraManager::UpdateViewTargetInternal(FTViewTarget& OutVT, float DeltaTime) {
    Super::UpdateViewTargetInternal(OutVT, DeltaTime);

    UpdateCameraWidth(OutVT.POV);
}

void ASMPlayerCameraManager::UpdateCameraWidth(FMinimalViewInfo& OutCameraView) {
    if (GEngine == nullptr) {
        return;
    }

    const UGameViewportClient* GameViewport = GEngine->GameViewportForWorld(GetWorld());
    if (GameViewport != nullptr && GameViewport->Viewport != nullptr) {
        // Get the viewport size
        const FVector2D ViewportSize(GameViewport->Viewport->GetSizeXY());

        // Calculate the new orthographic width based on pixel art scale and viewport size
        OutCameraView.OrthoWidth = (ViewportSize.X / ReferencePixelsPerUnit) / GetPixelArtScale(ViewportSize);
    }
}

float ASMPlayerCameraManager::GetPixelArtScale(const FVector2D& InViewportSize) {
    // Calculate the new art scale factor
    float BasePixelArtScale = (InViewportSize.X / ReferenceResolution.X);

    // Snap to an integer scale (round up only when within 0.1 of the next integer)
    BasePixelArtScale = (FMath::Frac(BasePixelArtScale) > 0.9f) ? FMath::CeilToFloat(BasePixelArtScale) : FMath::FloorToFloat(BasePixelArtScale);

    // In the extremely rare case where the display resolution is lower than the reference resolution we
    // also need to protect against divisions by zero, although in this case the game will be unplayable :)
    BasePixelArtScale = FMath::Max(1.0f, BasePixelArtScale);

    return BasePixelArtScale;
}
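For reference, the math above can be checked outside the engine with a standalone sketch (illustrative only; FMath replaced with &lt;cmath&gt;, same constants assumed):

```cpp
#include <algorithm>
#include <cmath>

const float ReferenceResolutionX = 640.0f;
const float ReferencePixelsPerUnit = 0.32f; // 32 px = 100 cm

// Mirror of GetPixelArtScale(): snap the horizontal multiplier to an
// integer, rounding up only when within 0.1 of the next integer.
float GetPixelArtScale(float ViewportX) {
    float Scale = ViewportX / ReferenceResolutionX;
    const float Frac = Scale - std::floor(Scale);
    Scale = (Frac > 0.9f) ? std::ceil(Scale) : std::floor(Scale);
    return std::max(1.0f, Scale);
}

// Mirror of UpdateCameraWidth(): world units covered horizontally.
float GetOrthoWidth(float ViewportX) {
    return (ViewportX / ReferencePixelsPerUnit) / GetPixelArtScale(ViewportX);
}
```

At any pixel-perfect width (1280, 1920, ...) the ortho width comes out at a constant 2000 world units, i.e. exactly one 640 px base-resolution width at 0.32 px per cm.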


This is pixel perfect 640x360:

PixelPerfect640x360.png

Now we resized the window:

And the ortho fix works fine for the world, except that it renders things that should not be rendered (we fixed this with glScissor in our own engine, but we don't know how in Unreal), and the UI is not affected by it.

Doesn’t work… :frowning:

So again… can any experienced Unreal programmer help us with this? Thank you :slight_smile:

I would render the entire game into a 640x360 render target instead, then use UMG to display it on screen, where it is possible to precisely control its size. As a bonus, the game would have the same performance no matter how large the resolution is. You're pretty much guaranteed 60 fps on Switch doing that, too.

How can we render the UI elements to that render target? And then, how do we prevent the world from being rendered by the camera after rendering it to the render target?

Unreal supports dynamic resolution on consoles:

See "Replacing Dynamic Resolution Heuristic in C++"

Thank you for the information, but I don't think we need dynamic resolution.
We need a way to modify the final viewport (where the world will be rendered) inside the window so that its size stays a multiple of our base resolution (in order to be pixel perfect). Of course, with some resolutions some of the borders will be black, but that is not a problem.

Put your entire UI and the image with the game “viewport” inside an Overlay widget (a widget that stacks its children on top of each other), then control the size of that widget.

To prevent the “true” camera from showing anything, just point it away from your world. It's 2D, after all: if the camera is looking the other way, it will display black. The SceneCapture2D becomes your “real” camera.

Thanks Manoel.Neto, but have you tried this solution in a game?
If no other option is given, we will try it. But we have never used a render target before >_<

Thank you.

I would like to thank Manoel.Neto. We finished implementing that and it works very well!
We had to write a custom rule for UI DPI scaling, based on our base resolution, that produces integer multipliers for the UI.
We also had to make a base widget that renders the render target and attach all the other UI elements to it :).
After solving some bugs, we have it working now. Thank you!

Here is the custom DPI scaling code if someone needs it:



// FIntPoint BaseResolution holds the base resolution the game is rendered at, so that the multiplier stays exact.

float UAPFitResolutionDPIScalingRule::GetDPIScaleBasedOnSize(FIntPoint ViewportSize) const {
    const FVector2D VPSize{ float(ViewportSize.X), float(ViewportSize.Y) };
    const FVector2D BaseRes{ float(BaseResolution.X), float(BaseResolution.Y) };
    const FVector2D Multiplier = VPSize / BaseRes;

    // Take the smaller axis multiplier so the UI always fits on screen
    float Scaling = FMath::Min(Multiplier.X, Multiplier.Y);

    // Floor to an integer multiplier so the UI stays pixel perfect.
    // Below 1.0 we keep the fractional scale; it will not be pixel
    // perfect, but that only happens when the screen is smaller than
    // the base resolution.
    if (Scaling >= 1.0f) {
        Scaling = FMath::FloorToFloat(Scaling);
    }

    return Scaling;
}
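As a sanity check, the same rule can be run standalone (illustrative sketch with plain types; BaseResolution assumed to be 640x360):

```cpp
#include <algorithm>
#include <cmath>

// Same logic as GetDPIScaleBasedOnSize() above, with the engine types
// replaced by plain ints and floats.
float FitResolutionDPIScale(int ViewportX, int ViewportY) {
    const float BaseX = 640.0f, BaseY = 360.0f;
    // Smaller axis multiplier, so the UI always fits
    float Scaling = std::min(ViewportX / BaseX, ViewportY / BaseY);
    // Integer multiplier -> pixel-perfect UI; below 1.0 keep the fraction
    if (Scaling >= 1.0f) {
        Scaling = std::floor(Scaling);
    }
    return Scaling;
}
```

A 1920x1080 viewport gives 3.0, a 1600x900 viewport floors 2.5x down to 2.0, and a 320x240 viewport falls back to the fractional (non-pixel-perfect) 0.5.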