Spent the whole day today trying to back up my project using GitHub, only to realize it was too big for that. What are some alternatives? So far, the easiest way seems to be to just copy and paste the project folder, but that seems very messy and archaic. Surely there must be a better way?
Buy an external HDD?
I save all the project folders on my second (slave) HDD.
I set up an SVN repository on a second drive. That way, if the working drive goes down I still have the SVN repo, and if the repo drive goes down I have my working copy (though I lose the old versions of the software).
At work we have a dedicated server with our Perforce depots stored on it, and we have local copies of the software too. Similar situation to what I do at home.
I make full copies of my project folder at regular intervals, about every hour.
With my current project, I have all 388 previous versions still available.
The whole thing uses around 210 GB at the moment. Should I eventually need the space for something else, I could probably delete some and only keep the last 50 or so…
In all, it has saved my butt 4 times already, when versions 198, 207, 301 and 307 had unreadable map files after a crash.
Copying the files is fast, and yes, there is a lot of duplication, but at least I don't have to worry about repositories, forks, which asset is from which version, etc…
Restoring just means deleting the corrupted project folder and copying the last backup back 1:1.
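For anyone who wants to automate this, here's a minimal sketch of the timestamped-copy-and-prune idea in Python. The function name, paths, and the keep-last-50 retention count are just illustrative choices, not anything from a specific tool:

```python
import shutil
from datetime import datetime
from pathlib import Path


def backup(project_dir, backup_root, keep=50):
    """Copy the whole project folder into a timestamped backup
    directory, then prune the oldest backups beyond `keep`."""
    project_dir = Path(project_dir)
    backup_root = Path(backup_root)
    backup_root.mkdir(parents=True, exist_ok=True)

    # Timestamped name, e.g. myproject-20240115-140302
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    dest = backup_root / f"{project_dir.name}-{stamp}"
    shutil.copytree(project_dir, dest)

    # Names sort chronologically because of the timestamp format,
    # so dropping everything before the last `keep` entries prunes
    # the oldest backups.
    backups = sorted(backup_root.iterdir())
    for old in backups[:-keep]:
        shutil.rmtree(old)

    return dest
```

Restoring is just the same copy in reverse: delete the corrupted project folder and `shutil.copytree()` the newest backup back into place. Hook the function up to a scheduler (cron, Task Scheduler) and you get the hourly snapshots described above.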
I use my own personal GitLab server. Has saved my bacon a couple of times. 8-}
I mostly back up my stuff to an SVN server, or I just copy it onto an HDD (the fastest way).