Standard (non-virtual) texture streaming does not work properly with ISM/HISMs, since the streaming manager checks the component's bounds rather than the individual instances' bounds. I didn't find any reference to this in the docs.
This is a problem whenever the camera is inside the bounds of the ISM but far from every instance, for example at the origin when the scene has many distant trees near the landscape's outer edges.
Steps to Reproduce
Put an ISM instance at (VERY_FAR,0,0) and one at (-VERY_FAR,0,0). Bind a material that samples a high-res texture to the component. Put the camera at (0,0,0). Look at the "stat rendering" output. Since you are inside the bounding box of the component, the highest mip is requested.
In the attached zip you can find 4 assets to place in the Content folder. Open the ism_streaming_test scene.
Enable Stat Streaming
Hide everything but "Cube" and move the camera to the cube (use the unlit view mode): VisibleMips is 344 MB.
Hide Cube and make Cube2 and Cube3 visible (two cubes at ±10000): VisibleMips is 2 MB.
Hide Cube2 and Cube3 and show Actor2 (an instanced version of Cube2 and Cube3): VisibleMips is 344 MB.
We were unable to reproduce the issue with your test assets. The supplied map file would not load in the latest version of UE 5.7, and re-creating the setup based on your description resulted in the VisibleMips stat remaining at 22.29 MB in all scenarios.
Can you please send us a test project that demonstrates the mip streaming issue, and include screenshots? Please include the .uproject file, the entire Content and Config folders, and, if applicable, the Source folder.
Thanks, I will look into it. However, from what I see, the solution is not a proper one, since it doesn't cover Nanite rendering, nor has it entered your codebase.
Apologies for the delay; I'm still catching up on issues from after the break, but I've begun to investigate this and hope to have more information, or an official issue to track, early next week.
1) Some elements do not support virtual textures (translucents, RVT, 2D canvas), and probably more (scene captures?). A complete list in the docs would help us.
2) We have to modify existing materials to support VT, since the same texture-sampling material node does not work on both VT and non-VT textures. A low-level solution would help us (the HLSL translator could generate VT and non-VT sampling code based on actual asset usage in the content library, such as the material usage in the EMaterialUsage enum).
3) We have to carefully rethink our material library to split it into VT and non-VT materials. After that (due to issue 1), we have to modify texture parameters in material instances to move the textures bound to non-VT params over to VT params (VT and non-VT material-graph sampling nodes can't share the same parameter name).
4) We have to modify the textures sampled by VT materials, identifying textures shared by VT and non-VT materials. A low-level solution would help: for a texture used by both a VT and a non-VT material, the engine could generate VT and non-VT versions of the texture, showing a red warning in the texture editor.
5) Recheck performance and budgets and fix new bugs (so, production time).
Thank you for providing the details, I’ve added them to our internal tracker.
Some elements do not support virtual textures (translucents, RVT, 2D canvas), and probably more (scene captures?). A complete list in the docs would help us.
5.6 added VT support for translucency, UI, and volume textures. Canvas and scene capture also work, but sometimes require warm-up phases rather than one-off captures. VT can now be used to feed an RVT, which used to cause RHI validation errors, but it's still a bad idea due to the double level of caching required.
Good call-out on the docs not reflecting this information; we should update that.
We have to modify existing materials to support VT, since the same texture-sampling material node does not work on both VT and non-VT textures.
When creating or editing a material, the correct sampling logic should be used depending on whether the texture is virtual or not. When changing a texture to or from virtual, the associated materials need to be updated. This is done for you when using the virtual texture conversion tool from the Content Browser, and since 5.6 it also happens when converting in the texture editor.
We have to carefully rethink our material library to split it into VT and non-VT materials. After that (due to issue 1), we have to modify texture parameters in material instances to move the textures bound to non-VT params over to VT params (VT and non-VT material-graph sampling nodes can't share the same parameter name).
This is one of the things that makes it difficult to combine virtual and non-virtual textures without some quite explicit content management. Virtual texture parameters must use virtual textures, and regular texture parameters must use regular textures. We considered not having this restriction, but then material instances could generate many more shader permutations without the content author being aware. Note that when using the virtual texture conversion tools, the underlying conversion does try to fix things by flagging textures that are set to the same material parameter slots to all be converted in the same way. But even with that, it helps to have a content plan where certain groups of assets (environment, character, etc.) are treated in a uniform manner.
Converting to virtual textures is a big decision, but for projects that fully convert we do see large memory savings.
2) How do I warm up virtual textures? An explicit call (i.e. ForceResident()) would be great. Multiple updates (for static canvases) or multiple scene captures (for "static" scene captures that don't update every frame), not so much.
I recently gave some advice on resolving issues with blurry virtual textures when loading in, teleporting, or starting a cinematic, which is a common reason to warm up virtual textures: [Blurry RVT Tiles After Load or [Content removed]
Usually the textures for the new location need to be available and streamed in by telling the streaming manager about the location with IStreamingManager::Get().AddViewLocation() or by calling PrestreamTextures(). We also added URuntimeVirtualTextureComponent::RequestPreload to request preloading of an RVT within a specific world bounding box.
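A sketch of how those calls might be combined when teleporting the camera (UE 5.x engine-side code, so it only compiles inside an engine project; the exact `RequestPreload` signature and the 5000-unit preload box are assumptions, not confirmed API details):

```cpp
// Sketch, not a drop-in implementation.
#include "ContentStreaming.h"                              // IStreamingManager
#include "Components/RuntimeVirtualTextureComponent.h"

void WarmUpTexturesAt(const FVector& NewLocation,
                      URuntimeVirtualTextureComponent* RvtComponent)
{
    // Tell the texture streamer about a location the camera has not
    // reached yet (teleport, level load, cinematic start).
    IStreamingManager::Get().AddViewLocation(NewLocation);

    // Ask the RVT to preload pages covering a world-space box around
    // the new location. Signature and box size assumed for illustration.
    if (RvtComponent)
    {
        const FBox WorldBounds(NewLocation - FVector(5000.0f),
                               NewLocation + FVector(5000.0f));
        RvtComponent->RequestPreload(WorldBounds);
    }

    // Alternatively, AActor::PrestreamTextures() can force-stream the
    // textures used by one specific actor for a number of seconds.
}
```

Either way, the request has to go out a few frames before the capture or cut so the streamer has time to bring the data in.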
I’m passing these questions to my colleague who is more familiar with the system.
For 1), it turns out Thin Translucent specifically has a bug that I haven't seen reported before. Thanks for letting us know; I'll fix it for our next engine release. The issue was a missing inout specifier on the MaterialParams parameter in AccumulateThinTranslucentModel().
For 3), your summary is correct. We don't recommend using VTs to feed an RVT for the reasons stated, even though it is now possible to set up materials that way.
1) VT on translucents doesn't seem to stream properly: I just see the 64x64 mip if I attach the VT sample to TransmittanceColor (it works if I attach it to BaseColor).
2) How do I warm up virtual textures? An explicit call (i.e. ForceResident()) would be great. Multiple updates (for static canvases) or multiple scene captures (for "static" scene captures that don't update every frame), not so much.
3) I still get an error when using VT during RVT output (HLSLMaterialTranslator.cpp: "Virtual Texture samples are not allowed during Runtime Virtual Texture Output.").
3) Looking at the actual code, I saw a bRelaxRuntimeVirtualTextureRestrictions flag that can be set to 0. However, the comment says "no guarantee that Virtual Textures will be ready." So I can attach a VT, but then I don't know if it will be streamed.