Need help understanding UE4's Autosave and Backup system


Recently, after noticing a rapid daily decrease of free space on my project hard drive, I’ve learned that UE4 by default employs an Autosave and Backup system. These manifest as the Saved/Autosave and Saved/Backup folders inside the project folder. I am a bit confused about how exactly these work: there appears to be only limited control over them, and yet, if you change a lot of content in a short amount of time, they can literally generate gigabytes of data a day.

I’ve found out I can disable the Autosave feature, which I should not do if I want to be able to recover from crashes without losing much work. However, it’s the Backup folder I am more concerned about. Here are my questions:

1. I am noticing that both the Autosave and Backup folders save multiple states of the same asset. I’d expect Autosave to store only the latest state, while Backup stores all the iterations. Why is there overlap between the two, with Autosave storing multiple versions rather than just the latest one?

2. Is there any way to disable Backup but not Autosave?

3. What exactly triggers Backup to store a version of an asset? Is it any change, or is there some timer on top of that?

4. Is Backup ever automatically cleaned, or do you have to clean it up manually, otherwise it will keep growing until the drive is full?

5. There appears to be some parse-friendly timestamping of the filenames in the Backup folders, which implies a built-in system for recovering older versions. Does UE4 have a built-in versioning tool where you can revert to an older version, or do you have to manually restore the backed-up asset from the Backup folder by hand?

If you say so… But I get more crashes from UE4 running out of disk space than anything else. There’s no exception handler to cover this case and forewarn users… Why is there a run on disk space? For many of the reasons you outlined. Projects and the Vault take up heaps of space, and UE4 likes to devour even more when working in the Editor, the DDC for example. Every so often shaders go into full compile mode and eat even more… So overall I’d argue that it’s better to disable Autosave and instead ‘save early and often’. Autosave timing is way off too… It’s like Microsoft and updates: when you have a deadline and time is a factor, that’s when Autosave will kick in and save the level or map that had accidental changes that didn’t need saving. Overall, it just gets in the way…
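For anyone who does decide to disable or tame Autosave, the switches live in the editor preferences (Edit > Editor Preferences > Loading & Saving), which persist to an ini file. A sketch of the relevant entries; key names are taken from UE4’s EditorLoadingSavingSettings, so verify them against your engine version:

```ini
; Saved/Config/Windows/EditorPerProjectUserSettings.ini
[/Script/UnrealEd.EditorLoadingSavingSettings]
; Master switch for the Autosave system
bAutoSaveEnable=False
; Or keep it enabled but make it less intrusive:
AutoSaveTimeMinutes=30
bAutoSaveMaps=True
bAutoSaveContent=False
```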

What’s worse, Autosave files can’t be used directly for recovery, unlike the files in Saved/Backup. I’m guessing Epic must use a different format… That’s why looking to disable Backup but not Autosave is misguided… There’s no timer on Backup, and it doesn’t do version checks: a backup gets added any time the user clicks Save… You can purge Backup by migrating the project manually, or just duplicate the entire project folder but exclude / delete that specific folder… Backup is NOT Autosave; you have to restore the files manually…
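The duplicate-the-project-but-exclude-Backup step can be scripted rather than done by hand. A minimal sketch in Python, assuming the default Saved/Backup layout (the function name is my own, not part of UE4):

```python
import shutil
from pathlib import Path

def duplicate_project(src: str, dst: str) -> None:
    """Copy a project folder to dst, skipping the Saved/Backup directory."""
    src_path = Path(src)
    saved_dir = src_path / "Saved"

    def skip_backup(directory, names):
        # copytree calls this for every directory it visits; returning
        # ["Backup"] while inside Saved/ excludes that subtree from the copy.
        if Path(directory) == saved_dir:
            return [n for n in names if n == "Backup"]
        return []

    shutil.copytree(src, dst, ignore=skip_backup)
```

Pointing `src` at the project root gives a clean duplicate you can keep, while the bulky Backup folder stays behind.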

So the conclusion is to… turn off Autosave, keep using Backup, and clean Backup manually when needed? It feels a bit archaic to manage data by having to navigate to a specific folder, sort the files by date and manually pick ones to delete :expressionless:

Yes, Backup is key - Autosave is optional… Sorting / deleting?.. You’re crossing over into source-control territory now… If you’re stuck in a daily grind of boring manual processes, you may need a structured solution to help. In the meantime, Rar/Zip could help recover disk space… Keep an eye on the Vault also, because it stores duplicates of everything, whereas it may just be better to download stuff as you need it. Same goes for the DDC: check the local and shared copies and purge them from time to time too, but leave time for the shaders to recompile afterwards…
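That sort-by-date-and-delete grind is also easy to automate without full source control. A sketch in Python that deletes Backup files older than a chosen age; the function name and the age threshold are assumptions, adjust to taste:

```python
import time
from pathlib import Path

def purge_old_backups(backup_dir: str, max_age_days: float) -> int:
    """Delete files under backup_dir older than max_age_days; return the count removed."""
    cutoff = time.time() - max_age_days * 86400  # 86400 seconds per day
    removed = 0
    for path in Path(backup_dir).rglob("*"):
        # Compare each file's last-modified time against the cutoff.
        if path.is_file() and path.stat().st_mtime < cutoff:
            path.unlink()
            removed += 1
    return removed

# Example: keep only the last week of backups
# purge_old_backups("MyProject/Saved/Backup", max_age_days=7)
```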

I am getting confused by half of the terms you are saying to be honest :slight_smile:

First of all, what I meant is: what’s the recommended action to take when the Backup folder size has gotten out of hand? In my case, the Content directory is about 200 megabytes, while the Backup directory grew to about 5 gigabytes. So it’s not unreasonable to assume that if I worked on roughly 2 GB worth of content, I could end up with 50 GB of backups.

Second of all, I know there is some sort of Vault cache, and I know it takes up space, but I have no idea what its purpose is, given that Autosave already takes care of autosaving, or whether UE4 clears it automatically on its own, or you have to keep cleaning it up manually.


If you’re running short on disk space there are four core areas to look at (see below). Lots of devs buying a 128 GB SSD (or a 256 GB SSD and filling it with games) don’t realize that after the drive is formatted and Windows is installed, the amount of free space is minimal. So you need a regular hard drive that’s much larger. Often you can get away with just an external drive or a flash memory stick though (if you don’t have a second drive built in).

Overall, in your shoes… I’d buy a large, cheap external drive and a USB memory stick, and copy out the entire project folder to both regularly (once a day minimum). Then purge the Backup folder off your main hard drive. The reason I wouldn’t just purge / delete the Backup folder without copying it out first is the risk of corruption: a part of your game you haven’t tested for a while, or don’t test every day, could have a corrupt blueprint or other asset. Use the memory stick to keep another backup copy of the entire project, which you can bring with you on the go just in case. But note, none of this is considered good practice, as most devs use source control…
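The daily copy-out-then-purge routine above could be sketched like this; the function name, destination layout, and date-stamp format are my own placeholders:

```python
import shutil
from datetime import datetime
from pathlib import Path

def daily_backup(project_dir: str, external_root: str) -> Path:
    """Copy the whole project to a date-stamped folder on another drive,
    then empty the project's Saved/Backup directory."""
    stamp = datetime.now().strftime("%Y-%m-%d")
    dest = Path(external_root) / f"{Path(project_dir).name}_{stamp}"
    # Full copy, Backup included (fails if today's copy already exists).
    shutil.copytree(project_dir, dest)

    backup_dir = Path(project_dir) / "Saved" / "Backup"
    if backup_dir.exists():
        shutil.rmtree(backup_dir)   # purge only after the copy succeeded
        backup_dir.mkdir()          # leave an empty folder for the editor
    return dest
```

Running it once a day against the external drive (and again against the memory stick) matches the regimen described above while keeping the purged backups recoverable.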

4 core areas to look at:

  1. The project itself. Use the ‘Migrate’ option regularly to shrink the project out into a new one. It will also help alert you to any corruption.

  2. The Vault. This has nothing to do with your current project. It’s a content-management system for the free and paid-for assets off the Marketplace. It eats disk space and isn’t very good at managing assets: it keeps a permanent copy of assets you can add to projects later, but it isn’t smart about it. For starters, it doesn’t help you manage plugins and different engine versions locally, which is a huge limitation. So what most devs do is copy assets into their own master projects and then manage the structure themselves outside the Vault system, taking back control by renaming assets and folders and organizing things in a way that’s more efficient for them. There’s also no sub-folder or sub-grouping in the Vault, which is a letdown. The Vault has loads of other issues too: until recently there was a limit to how many assets you could have before you couldn’t access them properly or buy more (1000 assets iirc). It’s been fixed, or there’s a workaround for now, but it highlights some of the problems with the system. The real challenge though is that it’s not easy to move the Vault out to an external drive, although you can read up on symbolic links / junctions if you want.

  3. DDC… There are multiple DerivedDataCache folders for compiled shaders etc. (local to the project and shared). These can be safely deleted and will be re-created as needed. They eat space, so it’s worth tracking them down (Windows file search etc.)…
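Tracking down the DerivedDataCache folders can also be scripted instead of using Windows file search. A sketch that walks a drive and reports each one with its size; the function name and search root are assumptions:

```python
import os

def find_ddc_folders(root: str):
    """Walk root and return (path, size_in_mb) for each DerivedDataCache folder."""
    results = []
    for dirpath, dirnames, _ in os.walk(root):
        if os.path.basename(dirpath) == "DerivedDataCache":
            # Sum every file under this cache folder, recursively.
            total = 0
            for sub, _, files in os.walk(dirpath):
                total += sum(os.path.getsize(os.path.join(sub, f)) for f in files)
            results.append((dirpath, total / (1024 * 1024)))
            dirnames.clear()  # already sized; don't descend into it again
    return results

# Example: for path, mb in find_ddc_folders("D:/UnrealProjects"): print(f"{mb:8.1f} MB  {path}")
```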

  4. Redundant Engine and Launcher versions. If you’ve been using the engine for a while, it’s likely you have gigabytes lost to both of these. Deleting the old folders will recover lots of ‘dead’ disk space…

Hopefully you know the answer to this now, which is NO… Epic does not recover disk space for you, and neither does Windows. But it’s probably better kept this way, as even top corporations who consider themselves world leaders in tech ‘f’ things up regularly. Basically, if you let software decide what to keep and what to delete, mistakes will happen, potentially costing you your entire project or months of work… That’s why regular backups to the cloud and to local / remote drives are critical. Remember, devs lose work on here weekly by not doing this…

Hey, thank you very much for taking the time for such an elaborate answer!

I’ve already read about most of the parts you mention, so I pretty much know what to manage / clean and when to do so. I guess it was just a bit unusual to get used to the concept of software that generates a lot of backup data with no control over it, gradually increasing in size without any safeguard against filling up the hard drive.

I actually have 3 SSDs + 1 HDD in my PC, with one of the SSDs reserved for Unreal projects. It’s just 250 GB, but it’s only for the projects, nothing else. I was just thinking about it in terms of future scenarios, rather than something urgent.

I have also set up Git source control. Instead of backing it up to GitHub or Bitbucket, though, I’ve opted to push the repository to a bare clone on my Google Drive, as 2 TB for $9 a month is something Git hosting services cannot match. Furthermore, I am already using GDrive to back up all of my other data too, so it’s nice to have it all in one place. I know it’s a sub-optimal solution, but since I am working alone, not collaborating with anyone else, I think it should work.

So my final plan is to do Git commits relatively often, at least once every 2-3 days or so, keep the Git repository backed up on GDrive, and then clean backups or migrate the project as soon as disk space becomes a real issue.

Thanks again for the help! :slight_smile: