Performance cost of streaming virtual textures

Hey, I've recently done some research on streaming virtual textures and how they let you optimize memory usage at the cost of performance. And that's the part I'm actually wondering about.

Everywhere I read that SVTs are more expensive in terms of performance, but I couldn't find specific examples of how much more expensive they actually are.

So my question is: are there any concrete numbers/stats/tests showing how much more expensive streaming virtual textures actually are? Maybe some of you already have working experience with them and can help me out here?

There aren’t. It’s entirely dependent on setup and system.
You have to run your own benchmarks, like with everything else in Unreal.
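If it helps getting started, the engine has built-in stats and cvars for profiling the VT system itself. These are the commands I know of from UE5; verify the exact names against the `r.VT.*` auto-complete in your engine version:

```
; overall VT cost: page table updates, transcode time, pool usage
stat virtualtexturing

; on-screen residency/pool-pressure HUD
r.VT.Residency.Show 1

; scale all physical pool sizes, handy for A/B memory-vs-pop-in tests
r.VT.PoolSizeScale 1.0

; throttle page uploads per frame to trade pop-in against hitches
r.VT.MaxUploadsPerFrame 8
```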

Afaik you get zero benefit from VTs if you don’t use the system to create or consume gargantuan base textures.

And when you do create oversized packed textures, the real question becomes: why?
Is it simply so that your assets won’t be compatible with 90% of the engines and tools out there?
Yeah, probably… :stuck_out_tongue:

They don’t support all platforms (mobile, Switch, etc.).

They cost the CPU almost nothing (from my experience and lots of profiling). And on every project I’ve worked on in the last 20 years as a developer (tech art, UE), memory was a problem. VTs will save you a lot of trouble, especially on small and mid-sized projects.

They can cost more in shaders, but Nanite meshes with the new rasterizer are faster.

Of course, if you’re doing a large 200 GB AAA open-world project, it will cost you a lot to learn how to optimize it as well as possible - workflow, settings (e.g. there are separate pools for BC1, BC7, etc.).
From an aesthetic point of view, they are not the best for masked materials - if you fly quickly through the environment, the mask will gradually load in blocks. You don’t notice it so much with a normal texture, but a mask driving a blended detail texture will pop details into your view, and that’s much more noticeable. On the other hand, it’s not as terrible as regular streamed textures going blurry while the streamer enters panic mode and starts to cause stuttering.
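On the separate-pools point: the per-format pool sizes can be set in config. A sketch of what that looks like in `DefaultEngine.ini` - the section and property names are as I remember them from UE5, and the sizes are made-up examples, so double-check against the virtual texturing source/docs for your engine version:

```ini
[/Script/Engine.VirtualTexturePoolConfig]
; fallback size for any format without its own pool entry
DefaultSizeInMegabyte=64
; give the BC1 (DXT1) pool more room than the default
+Pools=(Formats=(PF_DXT1),SizeInMegabyte=128)
; BC7 pages are bigger, so that pool often needs more
+Pools=(Formats=(PF_BC7),SizeInMegabyte=256)
```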

They can tempt you into using 8K textures for props, which isn’t a problem for performance, but mainly for package size, which will grow a lot (filling tens of GB is not a problem).

I can’t imagine working without Nanite and SVTs. They save a lot of time and a lot of problems. But you may run into a few new ones, and you need to change your workflow a little.

And if you happen to need to combine models with multiple unique textures into one material (e.g. for fewer draw calls), no problem - VTs only stream in the visible pages. With classic textures, however, you always read the entire texture, even if you only see a few pixels of it.

And finally, you can scale your textures down via Max Texture Size (not LOD Bias) in their settings, even in batch.
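A minimal sketch of doing that batch change with UE’s editor Python. This only runs inside the editor’s Python environment (the `unreal` module doesn’t exist outside it); the folder path and the 2048 cap are assumptions, and `max_texture_size` is the Texture2D editor property name as I recall it - verify it in your engine version:

```python
def round_down_pow2(size, cap):
    """Largest power of two <= min(size, cap); MaxTextureSize wants a power of two."""
    s = min(size, cap)
    p = 1
    while p * 2 <= s:
        p *= 2
    return p

def batch_set_max_texture_size(folder="/Game/Props", cap=2048):
    """Clamp MaxTextureSize on every Texture2D under `folder` (editor-only)."""
    import unreal  # only available inside the UE editor
    cap = round_down_pow2(cap, cap)  # sanitize the requested cap
    for path in unreal.EditorAssetLibrary.list_assets(folder, recursive=True):
        asset = unreal.EditorAssetLibrary.load_asset(path)
        if isinstance(asset, unreal.Texture2D):
            # 0 means "no limit"; anything above the cap gets clamped on cook
            asset.set_editor_property("max_texture_size", cap)
            unreal.EditorAssetLibrary.save_asset(path)
```

Run it from the editor’s Output Log (Python) or a startup script; it saves each modified asset as it goes.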

Btw: if you’re asking from the marketplace point of view, it’s better to use normal textures. People will go through those assets anyway and adapt them to their needs. It’s easier to convert normal textures to SVTs than the other way around - that can be problematic.
