I am working on a movie shot and some textures load in low resolution. I have a few questions.
Can I turn off some objects so that I can work on a detail? Disabling them in the Outliner does nothing, and even switching to a new empty scene does not free up GPU memory.
I wanted to render the finished scene using the High Resolution option. I assumed this would solve the memory problem, because the render should only load the visible textures. But it does not work either: some textures in the scene still render at too low a resolution. Why does High Resolution not work?
Is there any way to work on a movie shot that has large textures?
For the first question, you can add an Actor Hidden In Game track to your Sequencer and key the visibility of whichever objects you want hidden.
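A minimal sketch of doing this through the editor's Python API, assuming the Sequencer Scripting plugin is enabled; the frame range and key value are placeholders, and whether the bool key means hidden or visible depends on how the track inverts bHidden, so verify in the Sequencer UI:

```python
import unreal

# Add an "Actor Hidden In Game" visibility track to every actor binding
# in the currently open level sequence and key it from frame 0.
sequence = unreal.LevelSequenceEditorBlueprintLibrary.get_current_level_sequence()

for binding in sequence.get_bindings():
    # The Actor Hidden In Game track is a visibility track driving bHidden.
    track = binding.add_track(unreal.MovieSceneVisibilityTrack)
    track.set_property_name_and_path("bHidden", "bHidden")

    section = track.add_section()
    section.set_range(0, 200)  # placeholder frame range

    # Key the bool channel; True here is intended to mean "hidden".
    channel = section.get_all_channels()[0]
    channel.add_key(unreal.FrameNumber(0), True)
```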
For the second one, you should be able to configure the Movie Render Queue settings to get a high resolution render, with additional anti-aliasing and sampling settings on top.
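Here is a sketch of that setup scripted through Python, in case it is easier to iterate on; the sequence and map paths are hypothetical, and the tile count, overlap, and sample counts are starting values to tune, not recommendations:

```python
import unreal

# Configure a Movie Render Queue job with the High Resolution (tiled)
# setting plus extra anti-aliasing samples.
subsystem = unreal.get_editor_subsystem(unreal.MoviePipelineQueueSubsystem)
queue = subsystem.get_queue()

job = queue.allocate_new_job(unreal.MoviePipelineExecutorJob)
job.sequence = unreal.SoftObjectPath("/Game/Cinematics/MyShot.MyShot")   # hypothetical path
job.map = unreal.SoftObjectPath("/Game/Maps/MedievalCity.MedievalCity")  # hypothetical path

config = job.get_configuration()

# Tiled high resolution rendering: the frame is split into tiles, and
# each tile is rendered (and streamed) separately.
high_res = config.find_or_add_setting_by_class(unreal.MoviePipelineHighResSetting)
high_res.tile_count = 2        # 2x2 tiles for a 4K output
high_res.overlap_ratio = 0.1   # overlap between tiles to hide seams

# Extra samples for anti-aliasing.
aa = config.find_or_add_setting_by_class(unreal.MoviePipelineAntiAliasingSetting)
aa.spatial_sample_count = 1
aa.temporal_sample_count = 8
```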
Do I understand correctly that if I disable objects in Sequencer, they will not be loaded into the card's memory?
I render using Movie Render Queue with the High Resolution option, but low resolution textures still appear. My understanding is that this option divides the render into smaller tiles, which should let the card load all the textures belonging to the tile currently being rendered.
The scene is a medieval city: final 4K renders from different cameras. I use UDIM and virtual textures. Foreground objects have 2K textures, background objects 1K; complex objects like the cathedral have as many as 20 UDIM tiles at 1K each. I assumed this approach was sound, because with the Movie Render Queue High Resolution option only the textures visible to the camera should be loaded into GPU memory.
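If the tiling itself works, the usual suspect with virtual textures is that texture streaming and the VT feedback loop need several frames to converge, which Movie Render Queue does not give them by default. A hedged sketch of the common workaround, continuing the `job` from the earlier sketch; all values are assumptions to tune, and on newer engine versions the console variable setting exposes add_or_update_console_variable() instead of a writable map:

```python
import unreal

config = job.get_configuration()  # `job` from the earlier sketch

# 1) Warm-up frames before capture, so texture streaming and virtual
#    texture feedback can settle on the right mip levels.
aa = config.find_or_add_setting_by_class(unreal.MoviePipelineAntiAliasingSetting)
aa.engine_warm_up_count = 32
aa.render_warm_up_count = 32

# 2) Raise streaming limits for the duration of the render.
cvars = config.find_or_add_setting_by_class(unreal.MoviePipelineConsoleVariableSetting)
cvars.console_variables = {
    "r.Streaming.PoolSize": 8000.0,          # MB; size this to the render machine's VRAM
    "r.Streaming.FramesForFullUpdate": 1.0,  # re-evaluate streaming every frame
    "r.VT.MaxUploadsPerFrame": 512.0,        # let virtual texture tiles fill in faster
}
```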
Attached are the scene and fragments of the 4K renders with the low resolution textures marked.
My card is a 3070 with 8 GB; for rendering I will rent a computer with a 3090 with 24 GB, and I hope that will solve the problem.
I want to use Unreal to produce film shots, and I would like to know the best approach for my case: 4K renders with lots of textures.