Pause rendering and continually display the last-rendered frame

Is there a way to pause rendering and continually display the last rendered frame? I’m starting to prototype an enterprise application that runs via Pixel Streaming, and I’d like the application to consume as little GPU as possible if the user is not actively interacting with the application.

I don’t want to show a black viewport or anything, and I do want anything that is not the rendering to continue on. If the viewport isn’t changing frame-to-frame, I just want to stop consuming GPU resources to render the same frame over and over again.

Any clues? Since this is an enterprise application, there aren’t going to be any animated things like shaders that change frame to frame even when the geometry is motionless, so I will be able to set a flag when things aren’t moving and unset it when they do, so that rendering pauses are invisible to the users.

C++ and Blueprint solutions are both fine.

Thank you.

What you could do is render to a render target and show that via a widget. I’m not sure there is a PauseRenderer command, but you can at least set t.MaxFPS 1, which will render only one image per second in the background.
You can then recover interactive rendering by “listening” for various events like key input or camera movement in general.
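The “render only when something changed” gating described in the question and here can be sketched engine-agnostically. This is a minimal illustration of the logic only; none of these names are Unreal API:

```cpp
// Minimal, engine-agnostic sketch of the idea: render only while a
// "dirty" flag is set, and let any input event raise the flag again.
struct RenderGate
{
    bool bDirty = true;     // something changed since the last rendered frame
    int  FramesRendered = 0;

    void OnInput() { bDirty = true; }   // key press, camera movement, ...

    void Tick()
    {
        if (bDirty)
        {
            ++FramesRendered;           // stand-in for the real draw call
            bDirty = false;             // freeze until the next change
        }
    }
};
```

In UE terms, OnInput would be bound to your input/camera events, and the branch in Tick would toggle something like t.MaxFPS between 1 and your normal cap.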

OK, the max FPS idea is a good one. I hadn’t considered that. Thank you.

AFAIK, the command to freeze rendering is simply FreezeRendering.


FreezeRendering only works in the editor, and it doesn’t really freeze rendering; it just freezes the culling, while rendering continues…

Is there an expert who can add to this?

Is there no better answer to this question? Setting max FPS to 1 makes the pause UI run at 1 fps as well.
Usually when you pause a game to open a menu, the FPS jumps really high because the background rendering is frozen, but that’s not what happens in UE4 with the pause command.

I would use a combo of things.

First, take the screencapture to use on the UI.

Second, transition to an empty level or an empty part of the same level (so things don’t necessarily need to unload and re-load).
The inside of a closed box is one draw call. I wouldn’t mess with the light settings or anything, since that can potentially cause more drag on the CPU.
Actually, you can spawn the box around the camera and destroy it afterwards; you just need to make sure that it culls the outside meshes.

Third, set the game time slomo to 0 (Set Global Time Dilation is the node, I think).
Menus can be set to ignore the dilation.

Effectively pausing the movement of everything means that no updates need to run on anything, even CPU-wise. I believe the regular tick group is affected, so you should be freeing up the CPU with that.
Obviously you need to test whether this is true (stat unit can tell you, but running the game and looking at the CPU % in Task Manager may be better).

That’s about it.
Going back to normal is as simple as destroying the box and returning dilation to 1.
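The time-dilation step can be sketched in C++ like this. The function names are my own, only the UGameplayStatics call is real API, and note that UE clamps global time dilation to a small positive minimum rather than a true 0:

```cpp
#include "Kismet/GameplayStatics.h"

// Sketch of step three: slow world time to (near) zero, then restore it.
// Widgets/menus need to be set to ignore dilation separately.
void EnterRenderPause(UObject* WorldContext)
{
    // UE clamps this to AWorldSettings::MinGlobalTimeDilation, so pass a
    // tiny value rather than relying on an exact 0.
    UGameplayStatics::SetGlobalTimeDilation(WorldContext, 0.0001f);
}

void LeaveRenderPause(UObject* WorldContext)
{
    UGameplayStatics::SetGlobalTimeDilation(WorldContext, 1.0f);
}
```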

One last note.
I don’t think we have this in BP, so I’d use C++ specifically to run this sequence of events:

https://docs.unrealengine.com/en-US/API/Runtime/Engine/Engine/UGameViewportClient/LostFocus/index.html

That’s a lot of “dirty tricks”, while most engines can simply freeze world rendering.

Functions are not set to “Tick Even when Paused” by default, so the pause command stops every tick by default.
The problem is only that the GPU keeps rendering everything even when the game is paused.

Time dilation does not affect rendering, only animation/tick speed; a frame will take the same time to render.

I am not too sure about the rendering. The thing is, you want a pause where only part of the scene is rendered, like the menu, but the rest is not. And the decision of what to render and what not cannot be made by the engine, because it simply doesn’t know your requirements.

But to come back to the initial Pixel Streaming question: there is a new “freeze stream” feature in the engine which I would use, and then set the max FPS to 1 and pause the game. That should save you enough resources, and you can do this very aggressively.
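The non-streaming half of that combo can be driven straight from the in-game console (the freeze-stream call itself lives in the Pixel Streaming plugin, so it isn’t shown here):

```
t.MaxFPS 1
pause
```

Running `t.MaxFPS 0` restores the uncapped frame rate, and `pause` toggles the game pause off again.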

Best
realVis

It’s an old post, but if anyone is looking for some kind of answer/hint: there is a way to stop rendering, using the SetEnableWorldRendering node.

It does help save on performance: while the player cannot see the actual world anyway, there’s no need to waste resources rendering it.

But the problem is that it will not automatically display the last rendered frame; you will see a black screen instead. So before stopping rendering, you can spawn a Scene Capture 2D actor, create a render target texture at viewport size, set that texture as the scene capture’s target, then capture the scene and disable world rendering. The only thing left to do then is to display that texture as a UI background.
A very simplified blueprint for this will look something like this:

But there’s actually a lot more to consider. For example, before capturing the scene you need to configure the scene capture component to match the player camera (e.g. post-processing, FOV), otherwise the resulting image will be notably different. It’s a complicated thing, but it can be done.
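As a hedged C++ sketch of the same capture-then-freeze sequence (the function name is my own, error handling and post-process copying are omitted, and the FOV match is only approximate):

```cpp
#include "Engine/SceneCapture2D.h"
#include "Components/SceneCaptureComponent2D.h"
#include "Engine/TextureRenderTarget2D.h"
#include "Engine/GameViewportClient.h"
#include "Kismet/KismetRenderingLibrary.h"
#include "Kismet/GameplayStatics.h"

// Capture one frame that roughly matches the player camera, then stop
// world rendering. Display the returned texture as the UI background.
UTextureRenderTarget2D* CaptureAndFreeze(UWorld* World)
{
    APlayerCameraManager* Cam = UGameplayStatics::GetPlayerCameraManager(World, 0);

    FVector2D ViewportSize;
    World->GetGameViewport()->GetViewportSize(ViewportSize);

    UTextureRenderTarget2D* Target = UKismetRenderingLibrary::CreateRenderTarget2D(
        World, (int32)ViewportSize.X, (int32)ViewportSize.Y);

    ASceneCapture2D* Capture = World->SpawnActor<ASceneCapture2D>(
        Cam->GetCameraLocation(), Cam->GetCameraRotation());
    USceneCaptureComponent2D* Comp = Capture->GetCaptureComponent2D();
    Comp->TextureTarget = Target;
    Comp->FOVAngle      = Cam->GetFOVAngle();
    Comp->CaptureSource = SCS_FinalColorLDR;  // closest match to the screen image
    Comp->CaptureScene();                     // one-off capture

    // This is the flag the SetEnableWorldRendering node flips.
    World->GetGameViewport()->bDisableWorldRendering = true;

    Capture->Destroy();
    return Target;
}
```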

I’m still interested in whether it’s possible to simply get the last rendered frame; that would be much easier to work with.

I have also found this while searching on the problem; maybe that would be a better approach:


Not from Blueprint, but if you break into the engine code you can pull the last rendered frame.

There is an FTexture2DRHIRef which is the next frame being rendered (or the current one, I forget).
You could use that directly as the render target or texture and skip the overhead of capturing with a scene capture component, but it isn’t something easily done by everyone, unlike a full BP-only solution.

Certainly there are many ways, the frame is there afterall.

The problem you may potentially find is the access level of the function(s) being protected instead of public, or similarly unavailable outside said pipeline.
This would require editing the engine’s source and building from it to get your final EXE published with the correct system to access that data…

On the plus side, if the functions aren’t protected, you could just make a C++ class that exposes a BP node pulling that same frame information, and use it in Blueprint however you like…

Out of pure interest, I was trying to make this work by actually getting the last rendered frame, and I found a function GetViewportScreenShot on the viewport; it does just that.

However, I then wasn’t able to work out how to use that data correctly. The only thing I managed to achieve was copying the data onto the provided TextureRenderTarget2D. That works just fine (the resulting quality isn’t great, but good enough). However, the drawing part turned out to cause a notable freeze, so I ended up not using it.

Maybe someone else can figure it out better? I’m pretty sure this entire approach isn’t right.
Please note that I had absolutely ZERO idea what I was doing here.

#include "Kismet/KismetRenderingLibrary.h"
#include "Engine/GameViewportClient.h"
#include "Engine/Canvas.h"
#include "CanvasItem.h"
#include "Slate/SceneViewport.h"

#include "ViewportFrameCapture.generated.h"

UCLASS()
class UViewportFrameCapture : public UObject
{
	GENERATED_BODY()

public:
	UFUNCTION(BlueprintCallable, meta = (WorldContext = "WorldContext"))
	static bool GetLastRenderedFrame(UObject* WorldContext, UTextureRenderTarget2D*& OutTexture)
	{
		if (!OutTexture)
		{
			return false;
		}

		TArray<FColor> BitMap;
		FVector2D ViewportSize;
		bool bReceivedColorBuffer = false;
		if (GEngine && GEngine->GameViewport)
		{
			if (FSceneViewport* Viewport = GEngine->GameViewport->GetGameViewport())
			{
				// read last rendered frame
				bReceivedColorBuffer = GetViewportScreenShot(Viewport, BitMap, FIntRect(0, 0, 0, 0));
				// find viewport size
				GEngine->GameViewport->GetViewportSize(ViewportSize);
			}
		}

		if (!bReceivedColorBuffer)
		{
			return false;
		}

		// set up pixel format for render target
		const EPixelFormat RequestedFormat = FSlateApplication::Get().GetRenderer()->GetSlateRecommendedColorFormat();
		OutTexture->InitCustomFormat(ViewportSize.X, ViewportSize.Y, RequestedFormat, true);
		OutTexture->UpdateResourceImmediate(true);

		// create canvas object to draw on
		UCanvas* Canvas;
		FVector2D CanvasSize;
		FDrawToRenderTargetContext Context;
		UKismetRenderingLibrary::BeginDrawCanvasToRenderTarget(WorldContext, OutTexture, Canvas, CanvasSize, Context);

		// this partially repeats BeginDrawCanvasToRenderTarget, except we create canvas in DeferDrawing mode,
		// otherwise rendering thread will freeze for several seconds in CreateRHIBuffer
		UWorld* World = GEngine->GetWorldFromContextObject(WorldContext, EGetWorldErrorMode::LogAndReturnNull);
		FTextureRenderTargetResource* RenderTargetResource = OutTexture->GameThread_GetRenderTargetResource();
		FCanvas* ParamsCanvas = new FCanvas(
			RenderTargetResource,
			nullptr,
			World,
			World->FeatureLevel,
			FCanvas::CDM_DeferDrawing);
		Canvas->Init(OutTexture->SizeX, OutTexture->SizeY, nullptr, ParamsCanvas);

		// TODO: this is unoptimized and causes about a 0.5 s freeze. There must be a better way to copy the bitmap onto the canvas, but I wasn't able to find it
		for (int i = 0; i < BitMap.Num(); i++)
		{
			const FVector2D Pos(i % (int)ViewportSize.X, i / (int)ViewportSize.X);
			FCanvasLineItem Item(Pos, Pos);
			Item.LineThickness = 1.0f;
			Item.SetColor(BitMap[i]);
			Canvas->DrawItem(Item);
		}

		UKismetRenderingLibrary::EndDrawCanvasToRenderTarget(WorldContext, Context);
		return true;
	}
};

Thanks a lot!