This returns the correct texture pointer when run in the editor, and everything works fine.
However, when I run my project as a standalone game or after packaging, it returns a null pointer. I'm guessing the viewports and render targets are set up differently in standalone mode than when running in the editor. How can I achieve what I want so that it works in all cases?
Okay, so I managed to get a non-null texture pointer by moving the code to the Draw() method of a custom UGameViewportClient, but the texture I get is not being rendered to when playing standalone (I get a completely black texture). The exact same code works fine in PIE.
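For reference, a minimal sketch of that setup, assuming a hypothetical UMyGameViewportClient in a module called MyGame (both names are placeholders):

```cpp
// MyGameViewportClient.h -- hypothetical custom viewport client; the Draw()
// override is where the texture pointer was fetched.
#pragma once

#include "CoreMinimal.h"
#include "Engine/GameViewportClient.h"
#include "MyGameViewportClient.generated.h"

UCLASS()
class UMyGameViewportClient : public UGameViewportClient
{
	GENERATED_BODY()

public:
	virtual void Draw(FViewport* Viewport, FCanvas* SceneCanvas) override
	{
		Super::Draw(Viewport, SceneCanvas);
		// Viewport->GetRenderTargetTexture() is non-null here, but in a
		// standalone game nothing has been rendered into it (black texture).
	}
};
```

and pointing the engine at it in DefaultEngine.ini:

```ini
[/Script/Engine.Engine]
GameViewportClientClassName=/Script/MyGame.MyGameViewportClient
```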
I did in fact solve this! To convert the pixel format, you can sample the backbuffer with a simple pixel shader and draw it onto a new render target with the desired size and format. It seems to have close to zero impact on performance.
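A minimal sketch of that kind of copy pass could look roughly like the following (this is an outline, not the exact code). It assumes you already have the backbuffer RHI texture, for example via the Slate renderer's OnBackBufferReadyToPresent delegate, and a destination texture with the size/format you want; FScreenVS/FScreenPS are the engine's pass-through screen shaders from ScreenRendering.h, resource transitions are omitted, and exact call signatures differ slightly between engine versions:

```cpp
// Rough sketch only (assumed helper name, UE4/UE5-era RHI): render the backbuffer
// into a target of a different size/format using the engine's screen shaders.
#include "ScreenRendering.h"
#include "CommonRenderResources.h"
#include "RendererInterface.h"
#include "Modules/ModuleManager.h"

static void DrawBackBufferToTarget_RenderThread(
	FRHICommandListImmediate& RHICmdList,
	FRHITexture2D* BackBuffer,   // obtained elsewhere, e.g. in a backbuffer-ready callback
	FRHITexture2D* DestTexture)  // new render target with the desired size/format
{
	IRendererModule& RendererModule = FModuleManager::GetModuleChecked<IRendererModule>("Renderer");

	FRHIRenderPassInfo RPInfo(DestTexture, ERenderTargetActions::DontLoad_Store);
	RHICmdList.BeginRenderPass(RPInfo, TEXT("CopyBackBuffer"));

	const FIntPoint DestSize(DestTexture->GetSizeX(), DestTexture->GetSizeY());
	RHICmdList.SetViewport(0.f, 0.f, 0.f, (float)DestSize.X, (float)DestSize.Y, 1.f);

	FGraphicsPipelineStateInitializer GraphicsPSOInit;
	RHICmdList.ApplyCachedRenderTargets(GraphicsPSOInit);
	GraphicsPSOInit.BlendState = TStaticBlendState<>::GetRHI();
	GraphicsPSOInit.RasterizerState = TStaticRasterizerState<>::GetRHI();
	GraphicsPSOInit.DepthStencilState = TStaticDepthStencilState<false, CF_Always>::GetRHI();
	GraphicsPSOInit.PrimitiveType = PT_TriangleList;

	// FScreenVS/FScreenPS just sample the input texture; writing into the new
	// target is what performs the size/format conversion.
	auto* ShaderMap = GetGlobalShaderMap(GMaxRHIFeatureLevel);
	TShaderMapRef<FScreenVS> VertexShader(ShaderMap);
	TShaderMapRef<FScreenPS> PixelShader(ShaderMap);
	GraphicsPSOInit.BoundShaderState.VertexDeclarationRHI = GFilterVertexDeclaration.VertexDeclarationRHI;
	GraphicsPSOInit.BoundShaderState.VertexShaderRHI = VertexShader.GetVertexShader();
	GraphicsPSOInit.BoundShaderState.PixelShaderRHI = PixelShader.GetPixelShader();
	SetGraphicsPipelineState(RHICmdList, GraphicsPSOInit, 0);

	PixelShader->SetParameters(RHICmdList, TStaticSamplerState<SF_Bilinear>::GetRHI(), BackBuffer);

	// Full-screen quad: sample the whole backbuffer (UV 0..1) into the whole target.
	RendererModule.DrawRectangle(
		RHICmdList,
		0, 0, DestSize.X, DestSize.Y,
		0, 0, 1, 1,
		DestSize,
		FIntPoint(1, 1),
		VertexShader,
		EDRF_Default);

	RHICmdList.EndRenderPass();
}
```

You will likely also need "RHI", "RenderCore" and "Renderer" in your module's dependency list for this to link.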
VinnyBlood's method can work, but if you only want to grab the texture manually sometimes rather than every frame, consider the following.
Let's go through the code.
GetGameViewport()->GetRenderTargetTexture() returns the client's RenderTargetTextureRHI member. Whether RenderTargetTextureRHI is valid is decided by SceneViewport.UseSeparateRenderTarget() (see FSceneViewport::InitDynamicRHI), so if you want the texture you need SceneViewport.bUseSeparateRenderTarget to be true. If that flag is false, the rendered result is sent directly to the backbuffer. The flag is set in the constructor from a parameter of the ViewportWidget (SViewport) member.
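To make that concrete, a minimal sketch (UE4-style RHI ref types; UseSeparateRenderTarget() is the public accessor for that flag) of checking which case you are in before trusting the pointer:

```cpp
// Hedged sketch: only trust GetRenderTargetTexture() when the scene viewport
// actually owns a separate render target.
#include "Engine/GameViewportClient.h"
#include "Slate/SceneViewport.h"

FTexture2DRHIRef TryGetViewportTexture()
{
	FSceneViewport* SceneViewport =
		(GEngine && GEngine->GameViewport) ? GEngine->GameViewport->GetGameViewport() : nullptr;

	if (SceneViewport && SceneViewport->UseSeparateRenderTarget())
	{
		// bUseSeparateRenderTarget is true: RenderTargetTextureRHI was created
		// in FSceneViewport::InitDynamicRHI and is valid.
		return SceneViewport->GetRenderTargetTexture();
	}

	// Rendering goes straight to the window backbuffer; there is no texture to return.
	return FTexture2DRHIRef();
}
```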
As a brute-force workaround, you can simply set the RenderDirectlyToWindow construction parameter of GameViewportWidgetRef to false in UGameEngine::CreateGameViewportWidget(), and everything related to this will behave the same as in the editor.
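Something like this, paraphrased rather than copied from the engine source (the surrounding SNew arguments vary by engine version):

```cpp
// Paraphrased, not verbatim engine source: in UGameEngine::CreateGameViewportWidget(),
// change the RenderDirectlyToWindow argument of the SViewport from the engine's
// computed value (true for a plain standalone game) to false, which makes
// FSceneViewport allocate its own RenderTargetTextureRHI, as in the editor.
TSharedRef<SViewport> GameViewportWidgetRef =
	SNew(SViewport)
		.RenderDirectlyToWindow(false)
		/* ...remaining arguments unchanged... */;
```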
However, you cannot access the viewport's dynamic RHI (just as in the editor), and on Android you will get an upside-down viewport.
There may be a runtime way to change this setting so that you would not need to modify the engine source, but I don't have time to track it down right now.