Terrible performance with low-res SceneCapture2D on a high-end GPU in 4.17 [Seems fixed in 4.18]

Has anyone else had terrible performance when using SceneCapture2Ds in VR? And I mean really terrible. The headset says it's getting 90 fps, and I have a GTX 1080 and an Oculus Rift, but it stutters and lags even with a 256x256 render target. I tried building a mirror for testing and that was terrible. Now I'm trying to build a spectator screen and it's nauseating. I tried limiting the tick rate of the scene capture (not even calling Capture Scene manually; just enabling/disabling capture every frame), but somehow that makes it even worse than capturing every frame.
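For reference, the capture-throttling idea would look roughly like this in C++ (just a sketch of the approach; the actor name and the 30 Hz rate are placeholders, not engine names):

// Sketch of throttled capture: disable per-frame capture and trigger it manually.
// AThrottledCaptureActor and the 30 Hz rate are placeholders, not from the engine.
#include "GameFramework/Actor.h"
#include "Components/SceneCaptureComponent2D.h"
#include "ThrottledCaptureActor.generated.h"

UCLASS()
class AThrottledCaptureActor : public AActor
{
    GENERATED_BODY()

public:
    UPROPERTY(VisibleAnywhere)
    USceneCaptureComponent2D* Capture;

    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        Capture->bCaptureEveryFrame = false;  // stop re-rendering the scene every frame
        GetWorldTimerManager().SetTimer(      // capture at a fixed 30 Hz instead
            CaptureTimer, this, &AThrottledCaptureActor::DoCapture,
            1.0f / 30.0f, /*bLoop=*/true);
    }

private:
    FTimerHandle CaptureTimer;

    void DoCapture() { Capture->CaptureScene(); } // renders once into the TextureTarget
};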

Am I doing something wrong? Should I be changing properties of the SceneCapture2D for better performance?

Here’s a thread that seems related: https://forums.unrealengine.com/deve…teamvr-in-4-17

EDIT: Tested in 4.18 and it seems to be fixed.

@Cobrys An idea to investigate what is happening on your side: this example from Mitch's VR Cookbook uses a SceneCapture2D actor in VR to demonstrate IK. Could you try it out and see whether it also performs poorly for you?

https://github.com/mitchemmc/UE4VRCo…H7_CharacterIK

By the way, I have also recently experimented with SceneCapture2D actors for a VR drone simulator, and I noticed that if you set the target texture to a very low resolution you get a lot of artifacts. Increasing the resolution of the target texture makes them go away.

Actually, this came up because I followed his instructions for setting up a SceneCapture2D, but I figured the book was out of date. So I looked into the Spectator Screen stuff and ran into the same issue. At low resolution, yeah, the capture looks terrible, but I'm talking about the whole game performing horribly while it's running. I had it at 256 (the default) and 512 (as in his book) and had these issues at both. I will look at it again, though.

I have the exact same problem as you, and it's really frustrating, as I'm using a Blueprint asset from the store in my project that is only compatible with 4.17.

If I remove the render texture / scene capture, framerate jumps back to buttery smooth. As soon as it’s added back in, it’s judder central. I’ve tried playing about with all sorts of VR settings, but nothing will correct this issue for me. If you find any workaround, please let me know. I will report back here if I find a fix as well.

BTW, the same project files work perfectly in the latest 4.16.

I get it in the template for this project as well (4.17): VR Expansion Plugin - VR and AR Development - Unreal Engine Forums

Using the provided sample camcorder actor? Because while there is a significant perf hit from enabling a secondary capture, I don't get stuttering in my project with it enabled, and I'm on a 1070 and a low-end CPU.

@mordentral I get it when I turn on the mirror. No mirror and it's normal. If you have a camcorder actor in the level, I'll have to find it and see. Or is it always on?

It's not in the level; it's a spawnable actor. But yeah, you will get some frame drops with the mirror; you shouldn't be getting massive stuttering on a high-end system, though, unless you really upped the screen percentage. Scene capture components re-render the scene, so they come at a hefty cost.

You aren't on Oculus, are you? I heard there were perf issues with the Oculus SDK in 4.17.

Edit: Just saw that you mentioned Oculus and 4.17 already… nvm my last sentence.

I have not upped the screen percentage.

This is probably the thread you're thinking of: Performance hit and odd stuttering when compared to SteamVR in 4.17? - VR and AR Development - Unreal Engine Forums. I'm going to test in 4.18 in a second to see if it's fixed there.

Okay, just tested the mirror in 4.18 Preview 4 and there are no performance issues.

This issue is due to assigning a RenderTarget2D to a SceneCaptureComponent’s “Texture Target”.

This is solved by executing the console command "r.SceneRenderTargetResizeMethod 2" in the same BeginPlay event that assigns the RenderTarget2D to the SceneCaptureComponent.
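A minimal sketch of that BeginPlay, assuming a C++ actor (in Blueprint the equivalent is an Execute Console Command node; AMirrorActor, Capture, and MirrorTarget are placeholder names, not from the engine):

// Sketch: assign the render target and apply the workaround in one place.
// AMirrorActor, Capture, and MirrorTarget are placeholder names.
#include "Kismet/KismetSystemLibrary.h"

void AMirrorActor::BeginPlay()
{
    Super::BeginPlay();

    // Assigning a TextureTarget is what triggers the 4.17 stutter...
    Capture->TextureTarget = MirrorTarget;

    // ...so switch the scene render target resize method at the same time.
    UKismetSystemLibrary::ExecuteConsoleCommand(
        this, TEXT("r.SceneRenderTargetResizeMethod 2"));
}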

Yes, this was my issue, nice one XGibbousX. Stick r.SceneRenderTargetResizeMethod=2 into your DefaultEngine.ini (see the snippet after the source entry below). This is the source code entry for the var:


static TAutoConsoleVariable<int32> CVarSceneTargetsResizeMethod(
    TEXT("r.SceneRenderTargetResizeMethod"),
    0,
    TEXT("Control the scene render target resize method:\n")
    TEXT("(This value is only used in game mode and on windowing platforms unless 'r.SceneRenderTargetsResizingMethodForceOverride' is enabled.)\n")
    TEXT("0: Resize to match requested render size (Default) (Least memory use, can cause stalls when size changes e.g. ScreenPercentage)\n")
    TEXT("1: Fixed to screen resolution.\n")
    TEXT("2: Expands to encompass the largest requested render dimension. (Most memory use, least prone to allocation stalls.)"),
    ECVF_RenderThreadSafe
    );
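
For the DefaultEngine.ini route, the entry goes under the [SystemSettings] section (the standard place for setting r.* console variables at startup):

; in Config/DefaultEngine.ini
[SystemSettings]
r.SceneRenderTargetResizeMethod=2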