LogVulkanRHI: Warning: Failed to allocate Device Memory, Requested=262144.00Kb MemTypeIndex=7

I’m having issues with packaging my game. Everything works great in Standalone, but when I try to load a level in the packaged game, it crashes with:

Assertion failed: GD3D11RHI->GetQueryData

The same thing happens on a different PC.

So I tried switching to Vulkan (hoping for some performance improvements as well as stability) and got the error in the title:

LogVulkanRHI: Warning: Failed to allocate Device Memory, Requested=262144.00Kb MemTypeIndex=7

This is on a decent gaming PC (NVIDIA GTX 1060), Windows 10, with the latest graphics drivers, after a restart, and running with administrator privileges.

Googling these errors didn’t help; I haven’t found anything actionable. I’m getting a bit desperate and not sure how I’m supposed to debug this to continue, months into my personal project. Thanks for any advice.

One more thing, I guess: the error message in CrashContext.runtime-xml is

Unhandled Exception: EXCEPTION_ACCESS_VIOLATION reading address 0x00000000

Managed to run it under the Visual Studio debugger and got this call stack:

> Alpha-Win64-DebugGame.exe!VulkanRHI::FOldResourceHeap::AllocateResource(enum VulkanRHI::FOldResourceHeap::EType,unsigned int,unsigned int,bool,bool,char const *,unsigned int) C++
Alpha-Win64-DebugGame.exe!VulkanRHI::FResourceHeapManager::AllocateImageMemory(struct VkMemoryRequirements const &,unsigned int,char const *,unsigned int) C++
Alpha-Win64-DebugGame.exe!FVulkanSurface::FVulkanSurface(class FVulkanDevice &,enum VkImageViewType,enum EPixelFormat,unsigned int,unsigned int,unsigned int,unsigned int,unsigned int,unsigned int,unsigned int,struct FRHIResourceCreateInfo const &) C++
Alpha-Win64-DebugGame.exe!FVulkanTextureBase::FVulkanTextureBase(class FVulkanDevice &,enum VkImageViewType,enum EPixelFormat,unsigned int,unsigned int,unsigned int,unsigned int,unsigned int,unsigned int,unsigned int,struct FRHIResourceCreateInfo const &) C++
Alpha-Win64-DebugGame.exe!FVulkanDynamicRHI::RHICreateTexture3D_RenderThread(class FRHICommandListImmediate &,unsigned int,unsigned int,unsigned int,unsigned char,unsigned int,unsigned int,struct FRHIResourceCreateInfo &) C++
Alpha-Win64-DebugGame.exe!FRenderTargetPool::FindFreeElement(class FRHICommandList &,struct FPooledRenderTargetDesc const &,class TRefCountPtr<struct IPooledRenderTarget> &,wchar_t const *,bool,enum ERenderTargetTransience,bool) C++
Alpha-Win64-DebugGame.exe!TSparseArray<class TSetElement<struct TTuple<class FHeightfieldComponentTextures,class TArray<class FHeightfieldComponentDescription,class TSizedDefaultAllocator<32> > > >,class TSparseArrayAllocator<class TSizedDefaultAllocator<32>,class FDefaultBitArrayAllocator> >::AddUninitialized(void) C++
Alpha-Win64-DebugGame.exe!FHeightfieldLightingViewInfo::CompositeHeightfieldsIntoGlobalDistanceField(class FRHICommandList &,class FScene const *,class FViewInfo const &,float,class FGlobalDistanceFieldInfo const &,class FGlobalDistanceFieldClipmap const &,int,class FVolumeUpdateRegion const &) C++
Alpha-Win64-DebugGame.exe!UpdateGlobalDistanceFieldVolume(class FRHICommandListImmediate &,class FViewInfo &,class FScene const *,float,class FGlobalDistanceFieldInfo &) C++
Alpha-Win64-DebugGame.exe!FDeferredShadingSceneRenderer::PrepareDistanceFieldScene(class FRHICommandListImmediate &,bool) C++
Alpha-Win64-DebugGame.exe!FDeferredShadingSceneRenderer::Render(class FRHICommandListImmediate &) C++
Alpha-Win64-DebugGame.exe!UpdateReflectionSceneData(class FScene *) C++
Alpha-Win64-DebugGame.exe!UpdateReflectionSceneData(class FScene *) C++
Alpha-Win64-DebugGame.exe!FViewUniformShaderParameters::operator=(class FViewUniformShaderParameters const &) C++
Alpha-Win64-DebugGame.exe!TBaseStaticDelegateInstance<void ,unsigned int &>::ExecuteIfSafe(wchar_t const *,class IConsoleObject *) C++
Alpha-Win64-DebugGame.exe!FNamedTaskThread::ProcessTasksNamedThread(int,bool) C++
Alpha-Win64-DebugGame.exe!FNamedTaskThread::ProcessTasksUntilQuit(int) C++
Alpha-Win64-DebugGame.exe!RenderingThreadMain(class FEvent *) C++
Alpha-Win64-DebugGame.exe!FRenderingThread::Run(void) C++
Alpha-Win64-DebugGame.exe!FRunnableThreadWin::Run(void) C++
Alpha-Win64-DebugGame.exe!FRunnableThreadWin::GuardedRun(void) C++
kernel32.dll!00007ffbb38d7034() Unknown
ntdll.dll!00007ffbb557cec1() Unknown


I have a feeling the map loading (the spike at the end) might have consumed all the video card memory (which is 3 GB). Or is it RAM? Are there no protections against that?

Seems like my polycount was too high. Closing.

Hey, can you say how you did that, please? I’m facing a similar issue. Thanks.

@NinaCarla Basically I spent some time learning how to debug performance. Found out my water plugin was generating millions of polygons. :slight_smile: No wonder it was running out of memory.

Sorry, I’m new to Unreal. Could you tell me how exactly you debugged it? And did you just disable the plugin?

@NinaCarla That would be a really long post; try googling tutorials for performance debugging.
For starters, these are the most useful console commands. Give them a try one by one:

stat game
stat rhi
stat GPU
stat SceneRendering
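A couple more that helped with memory specifically (these are stock UE console commands, but double-check they exist in your engine version):

```
stat unit         -- frame time split across game / draw / GPU
stat memory       -- engine memory counters
memreport -full   -- dumps a detailed memory report to the log
```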

Maybe you’ll see something out of the ordinary. Overall memory usage can be seen in Task Manager (Windows) / Activity Monitor (Mac). Or you can try the Visual Studio tools, but those took me a long time to figure out, and now I’m not even sure I could replicate it myself. :slight_smile:
I’ve removed the plugin and decided to try the new official water feature of 4.26.

Thanks, I’ll try that. You at least seem to have pointed me in the right direction.