Hi all,
We’re experiencing issues across our game trying to get Virtual Textures to pre-stream ahead of camera cuts in Level Sequences, in order to avoid clearly visible pop-in of texture quality. Sometimes we can successfully get these textures to stream in ahead of the cut, but sometimes, no matter what we try, we still see the same blurry texture for the first few frames of the new shot. It most commonly shows up on character skin virtual textures (in the ‘Character’ texture group), but we also see it on some ordinary objects using the ‘World’ texture group.
Our testing is currently focused on PS5, so that is where we are seeing this, but I expect it is likely happening on other platforms as well. We are on UE 5.6.1 with no ability to upgrade to 5.7 or beyond, but we can cherry-pick individual changes if necessary.
Here is everything we have attempted so far to get these objects streaming in at a decent quality:
- IStreamingManager::Get().AddViewLocation ahead of the material/texture streaming, as suggested by Alex Peterson in [Content removed]
- GetRendererModule().PrefetchNaniteResource with the mesh render data’s Nanite resources - this seems to work reliably to avoid seeing the Nanite minimum residency, at least
- GetRendererModule().RequestVirtualTextureTiles on the material’s render proxy, using ScreenSpaceSize as the current viewport size and a feature level of GMaxRHIFeatureLevel - this appears to be how MakeHLODRenderResourcesResident, referenced by Jeremy Moore in this UDN post, operates [Content removed]
- On each texture in the material we have tried all of the following:
  - Calling SetForceMipLevelsToBeResident
  - Setting bForceMipLevelsToBeResident to true - we turn this back off again later
  - Calling StreamIn on the texture with the max number of mips and bHighPrio set to true (this appears to call GetRendererModule().LockVirtualTextureTiles on the VirtualTexture2DResource) - we call StreamOut with 0 mips later
  - Acquiring the allocated VT from the VirtualTexture2DResource and calling the renderer module’s RequestVirtualTextureTiles on it, using ScreenSpaceSize as the current viewport size, a ViewportPosition in the middle of the viewport, a UV0 of (0,0) and a UV1 of (1,1). For the mip level we have tried passing both the max number of mips and 0 (which seems to be the desired value).
  - We then call the renderer module’s LoadPendingVirtualTextureTiles with a feature level of GMaxRHIFeatureLevel
  - Setting ‘Virtual Texture Prefetch Mips’ on the texture to the highest mip level
  - Turning up ‘Virtual Texture Streaming Priority’ on the texture
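Pulled together, our per-cut prefetch pass currently looks roughly like the sketch below (game thread; the helper name is ours and the signatures are paraphrased from our reading of RendererInterface.h in 5.6, so treat it as illustrative rather than exact):

```cpp
// Simplified sketch of our per-cut prefetch pass. Helper name and exact
// signatures are paraphrased and may not match 5.6 exactly.
#include "ContentStreaming.h"   // IStreamingManager
#include "RendererInterface.h"  // GetRendererModule()
#include "Engine/Texture2D.h"

void PrefetchForUpcomingShot(const FVector& NewCameraLocation,
                             const FMaterialRenderProxy* MaterialProxy,
                             const TArray<UTexture2D*>& ShotTextures,
                             const FVector2D& ViewportSize)
{
    // Bias the texture streamer toward the post-cut camera position.
    IStreamingManager::Get().AddViewLocation(NewCameraLocation);

    // Ask the VT system to fetch tiles for everything the material samples.
    GetRendererModule().RequestVirtualTextureTiles(
        MaterialProxy, ViewportSize, GMaxRHIFeatureLevel);

    // Pin mips on each texture so the pages are (hopefully) not evicted again.
    for (UTexture2D* Texture : ShotTextures)
    {
        Texture->bForceMipLevelsToBeResident = true;                  // cleared after the cut
        Texture->StreamIn(Texture->GetNumMips(), /*bHighPrio*/ true); // StreamOut(0) later
    }
}
```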
We have also tried doubling our values for all of the following console variables, just to see whether it made any difference (no change was apparent):
- r.VT.MaxTilesProducedPerFrame
- r.VT.MaxUploadsPerFrame
- r.VT.MaxUploadsPerFrame.Streaming
- r.VT.MaxUploadMemory
- r.VT.MaxUploadRequests
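For reference, we double these from code rather than from config so we can toggle the experiment at runtime - roughly (test-only helper, name is ours):

```cpp
// Doubles a VT upload-budget cvar in place. Test-only code; assumes the
// cvars are integer-valued.
#include "HAL/IConsoleManager.h"

static void DoubleCVarForTest(const TCHAR* Name)
{
    if (IConsoleVariable* Var = IConsoleManager::Get().FindConsoleVariable(Name))
    {
        Var->Set(Var->GetInt() * 2, ECVF_SetByCode);
    }
}

void DoubleVirtualTextureBudgets()
{
    DoubleCVarForTest(TEXT("r.VT.MaxTilesProducedPerFrame"));
    DoubleCVarForTest(TEXT("r.VT.MaxUploadsPerFrame"));
    DoubleCVarForTest(TEXT("r.VT.MaxUploadsPerFrame.Streaming"));
    DoubleCVarForTest(TEXT("r.VT.MaxUploadMemory"));
    DoubleCVarForTest(TEXT("r.VT.MaxUploadRequests"));
}
```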
To lighten the load on the system (in case that was the issue), we have also tried requesting two mip levels below max, both via the ‘Virtual Texture Prefetch Mips’ setting and from LoadPendingVirtualTextureTiles.
When we make these pre-streaming requests we do see a jump in the residency graph’s ‘Page Residency’ and ‘LockedPage Residency’ lines, but they drop back to their previous levels after about 10 frames. We tried ensuring that we request the textures only 10 frames ahead of them being displayed, but with no success - the textures still appear blurry on screen even while the residency graph is at its peak. So although we were worried the locked pages might not be kept around, displaying the textures while the graph suggests they are still locked does not work either.
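In case it helps, the allocated-VT request described above is issued on the render thread like the sketch below, and we re-run it every frame through the 10-frame window in case single-shot requests are evicted (argument order is paraphrased from our reading of RendererInterface.h, so the exact signature may differ):

```cpp
// Re-issued every frame from ~10 frames before the cut until a few frames
// after. AllocatedVT acquisition from the VirtualTexture2DResource is elided.
#include "RendererInterface.h"
#include "RenderingThread.h"    // ENQUEUE_RENDER_COMMAND

void RequestAllocatedVTTilesThisFrame(IAllocatedVirtualTexture* AllocatedVT,
                                      const FVector2D& ViewportSize)
{
    ENQUEUE_RENDER_COMMAND(PrefetchVTTilesCmd)(
        [AllocatedVT, ViewportSize](FRHICommandListImmediate& RHICmdList)
        {
            GetRendererModule().RequestVirtualTextureTiles(
                AllocatedVT,
                ViewportSize,            // ScreenSpaceSize: current viewport size
                ViewportSize * 0.5f,     // ViewportPosition: middle of the viewport
                FVector2D(0.f, 0.f),     // UV0
                FVector2D(1.f, 1.f),     // UV1
                /*MipLevel*/ 0);         // 0 seems to be the desired value

            GetRendererModule().LoadPendingVirtualTextureTiles(RHICmdList, GMaxRHIFeatureLevel);
        });
}
```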
We are out of ideas, so any advice on how we can get these virtual textures loaded in before we cut to a close-up of them would be greatly appreciated!
Thanks,
Tom
[Attachment Removed]