Memory stomp allocator

I just wrote a blog post about implementing an allocator for Unreal Engine 4 that finds memory stomps. This was also sent as a pull request to Epic, just like all the other changes I've written about on my blog.

Hope you will find it useful.

This has been extremely useful in other engines I’ve worked with. I strongly encourage Epic to integrate this feature into the engine. Thanks PZurita!

Epic mentioned that they will integrate it :slight_smile:

https://github.com/EpicGames/UnrealEngine/pull/1331

Great! This would be very useful.

Quick update: this hasn’t been integrated yet, so it won’t make it in until at least 4.11.

Now that 4.12 is available you can use the stomp allocator with the USE_MALLOC_STOMP define.


That’s great! Could you elaborate a little further on how to properly use it?
Using a


Definitions.Add("USE_MALLOC_STOMP=1");

in the Build.cs file of a specific game’s project is not enough, as far as I understand.
I am currently attempting a full engine recompile with a


#define USE_MALLOC_STOMP 1

added at line 82 of the file


\UnrealEngine-4.12.1-release\Engine\Source\Runtime\Core\Public\Windows\WindowsPlatform.h

Is there any better way of doing this?
Maybe adding the define only in a specific file, such as UE4Game.Target.cs?

Update: I tried the recompile with the added #define. The editor compiles fine but crashes during the splash screen: when loading reaches 94%, it asserts when it check()s a pointer.

Looks like it activates just by adding #define USE_MALLOC_STOMP 1 to WindowsPlatform.h

Now I just need someone to tell me that a stall monitor thread exists :slight_smile: [By this I mean a thread that gives the main update X ms to complete and dumps the main thread’s callstack if it doesn’t. It’s remarkable how often that callstack points straight at the perpetrator of the stall.]

I can’t seem to get this to activate. It seems like it should be straightforward, but…

I’ve tried #define USE_MALLOC_STOMP 1 in WindowsPlatform.h and also in MallocStomp.h (not both at the same time; I tested with it in one place, then with it in the other).

To verify, I went into WindowsPlatformMemory.cpp and edited FWindowsPlatformMemory::Init() to include:


#if USE_MALLOC_STOMP
    UE_LOG(LogMemory, Log, TEXT("USE_MALLOC_STOMP is enabled"));
#endif

This log message never appears. The other log messages in that function do appear.

I’m at a bit of a loss for why this would be the case. What might I be missing?

Huh… this does seem to work fine if I run my project against a SOURCE build of the engine, but if I make an INSTALLED build of the engine (from the exact same engine source), USE_MALLOC_STOMP seems to get undefined or otherwise disappear…

How does this actually work - where are the logs saved?

Since each allocation is locked to its own page(s), if you stomp memory the game will assert and you can see where the stomp occurs. It doesn’t actually write out a report or logs. So you’ll potentially see two things happen while it’s enabled: 1) memory use will skyrocket, since every allocation now takes at minimum 4 KB, and 2) if you do in fact have a stomp, you’ll get an assert at that location.

Just a word of advice: I was chasing what I thought was a memory stomp, but it turned out to be a shared header with #ifdefs compiled differently in different places (which changed the size of the class, since some members were inside those #ifdef blocks). Still, this tool was invaluable for crossing “memory stomp” off the list.

I found in the engine source code that it is not supported on mobile platforms. Is there any way to make it work on mobile, or can it be used on a phone at all?