Can anyone tell me the best way to use version control in UE4?

It would be helpful if someone could suggest a solution for version control and describe the way you guys use it.

My answer is I don’t.

Completely ignore the in-engine stuff.

If I use anything at all, I do everything via Visual Studio.
It supports both Team Foundation (if you pay for those services) and Git (when you don't).

And GitHub has gone free to use again, with up to 7 private repositories or something like that. So it's probably good enough (if you trust their security).

If you don't (trust outside security), then just buy a Linux box and install one of the many SVN servers… or even just rsync (works well locally on Windows via Cygwin, if you remember to run it).

1 Like

I have to massively disagree with the above. The first thing I do once I decide to work on a project for longer than a weekend is set up version control. I use git, mostly because that's what I grew up with. Most of the time, especially if it's just you, you'll use about five commands: add, commit, fetch, pull, push. For anything else, you can find good, concise tutorials in a heartbeat on any of the Stack* sites.
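
For reference, the day-to-day loop with those five commands is tiny. A sketch (the remote URL and commit message are placeholders):

```
# one-time setup inside the project folder
git init
git remote add origin https://github.com/you/your-project.git

# the everyday loop
git add .                           # stage your changes
git commit -m "Describe the change"
git fetch                           # see what changed upstream
git pull                            # bring remote changes in
git push                            # publish your commits
```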

The only thing git sucks at is diffing binary files. I also think GitHub’s security is sophisticated enough, especially since you’d have to roll your own if you host git on your own servers. It’s just one less thing to worry about.

In short: Use version control. Start with git until you hit roadblocks that you cannot work around anymore. Then find (and probably pay for) something else that better fits your requirements.

EDIT: You can even get by without LFS for a very long time, depending on the type of game you’re making.
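
If you do try git with a UE4 project, the usual first step is to keep the generated folders out of the repo entirely. A minimal .gitignore sketch (these are the standard UE4 output folders; adjust for your project):

```
# .gitignore (sketch for a UE4 project)
Binaries/
DerivedDataCache/
Intermediate/
Saved/
.vs/
*.sln
```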

5 Likes

We use self-hosted Perforce version control. It's very fast, works well with binary files, works well with different streams, properly handles locking for exclusive checkout of binary files, etc.

The editor plugin is very useful for automatically checking out files as you edit them and submitting directly from the engine. Don't use the in-engine sync; it crashes 99% of the time.

If you can deal with 5 or fewer unique logins (you can create up to 20 workspaces), then Perforce is free except for the server you run it on. Installation is mostly trivial when following their guide, given some prior experience managing a server.
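
For reference, the exclusive-checkout behavior comes from Perforce's typemap, which maps file patterns to file types; `+l` requests an exclusive lock. A sketch of the UE4-relevant entries (edit the spec with `p4 typemap`; the depot path is a placeholder):

```
# p4 typemap (excerpt)
TypeMap:
        binary+l //depot/....uasset
        binary+l //depot/....umap
        binary+l //depot/....ubulk
        text //depot/....ini
```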

2 Likes

I haven't used source control to work with other team members; I only use it to keep a local repository of my project, so that if I screw something up I can quickly go back to the latest version. I use git for this purpose, and the great thing about source control in Unreal is that you don't need to use the terminal to make changes to the repo. Unreal has some built-in tools to commit or revert the files you were working on, which is great and more visual than using the command line.

1 Like

I am an art-focused solo developer. So when it comes to technical things, I want simplicity and reliability.

To that end, I've found GitHub Desktop with Large File Storage (LFS) is the best option for me.
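
For anyone wiring this up by hand instead of through GitHub Desktop, the LFS part is only a few commands. A sketch (the tracked patterns here are the usual UE4 binary types, not a requirement):

```
git lfs install                       # one-time, per machine
git lfs track "*.uasset" "*.umap"     # records the patterns in .gitattributes
git add .gitattributes
git commit -m "Track UE4 binaries with LFS"
```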

2 Likes

Those of you using GitHub LFS: are you storing lots of source art with it? I've been looking for a way to version control all of my art assets for a while, and when I tried to price out GitHub's LFS it looked like it was going to cost me a small fortune just to manage my ZBrush sculpts. They can be up to 1 GB per save and I have a ton of them… (nearly 4 TB and growing…)

1 Like

I use it to store art, but my entire game project is only a few GB, including source art.

You could probably store source art with just manual backups. I don't think you can store 4 TB without spending some money somewhere, though, whether on a cloud service or on more physical storage. GitHub LFS is as economical as I've seen - most services look to be in the same price range.

1 Like

That’s what I’ve been doing for years and I hate it, which is why I was hoping to find some alternative. :frowning:

Most of the size is dedicated to just storing different versions of the same files. I was hoping to find something that could diff/compress them so that it wouldn't actually need to be 4 TB of (mostly) redundant data.

1 Like

At least as far as git is concerned, you will not save any disk space when using it compared to plain old cloud storage. The reason is that git has to store each revision of a file in its history in order to be able to check out the associated commit. That means your git history, and with it the size of the .git folder and the space used on the server, will grow every time you commit a change to a file.
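
You can watch this growth yourself; git will report the size of its object database (real commands, output varies per repo):

```
git count-objects -vH    # "size-pack" is roughly the size of your whole history
du -sh .git              # total disk usage of the repository database
```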

LFS might work a little differently, but file-size savings should not be the primary motivation for using a version control system, especially for binary assets. Cloud storage with some sort of file versioning capped by number of versions or by date would be better suited for that.

1 Like

Guess I should consider setting up a git server on a NAS; then I can just shove as many drives in it as I need :confused:
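
If you go that route, the server side is pleasantly small; a sketch assuming SSH access to the box (hostname and paths are placeholders):

```
# on the NAS / server
mkdir -p /srv/git/myproject.git
git init --bare /srv/git/myproject.git

# on the workstation: point an existing project at it
git remote add origin ssh://user@nas.local/srv/git/myproject.git
git push -u origin master
```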

1 Like

Beyond Compare 2 might be worth looking into.

1 Like

I wouldn't use a WD "NAS"; I'd just install Ubuntu on an old PC and set up a git server on it. Regardless, my concern is not backing up data; it's doing version control on large amounts of data at a reasonable cost.

1 Like

First off:
Instead of running a full PC for the NAS, wouldn't something like a Raspberry Pi be much lower profile, lower power, and far more reliable?
Those things are really hard to damage over time, even with pet hair in the fans.

Second:
It has to do with how you work.

When I model/sculpt, I just throw everything I make into a directory for 3D assets.
I also don't usually produce textures, but if there are any, they go in there as well.

Updating the model usually needs no second or third file unless I need to make a variant.

ZBrush is the exception, because it damages the mesh randomly, so you do need 25 backups of the same file… the cleanup for that can only be manual, as not every file is even going to work after being saved. (Really, why is ZBrush so bad with file corruption?)

If I were to version control the folder, I'd do it manually via Cygwin / rsync with the delete-old-files flag.
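
That manual approach is roughly one command. A sketch with placeholder Cygwin paths (`--delete` prunes files from the mirror that no longer exist in the source, so point it at the right folder):

```
# mirror the asset folder to the backup drive, pruning deleted files
rsync -av --delete /cygdrive/d/work/3d-assets/ /cygdrive/e/backup/3d-assets/
```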

Also, I should note my case…
STORAGE is a hard disk.
Working partitions are always SSD.
Final storage does get pushed to a different drive.

I either do it manually or use rsync commands. Depends. Doesn’t really need versioning in my case.

Also, final FBX exports are way lighter, if it's something you never need to go back to.
If it's not, then ZBrush files are the largest… Blender barely reaches 0.1 GB even when you push it (unless you include 8K images saved within the file?)

1 Like

I have used Git and Perforce. Git will only git you so far (ba dum dum). Perforce's UE4 support is much better, and an Epic developer told me that is what they use to manage Fortnite. There is a free version for small teams. However, Perforce is not simple to manage. If you move or rename directories in the project, it will corrupt the server branch. The merge is not as user-friendly as git's. Binary merges are not possible, for obvious reasons. When integrating with Visual Studio, you need to set several files to ignore; there are technical articles on how to do this. If you don't, your project will be corrupted on the next sync. You will find that you are frequently re-creating your base repo because of sync issues with other members.
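
On the ignore-files point: Perforce reads an ignore file named by the `P4IGNORE` variable, with syntax much like .gitignore. A sketch for a UE4 + Visual Studio project (the mechanism is real Perforce; the exact file list depends on your setup):

```
# once per machine, tell Perforce where the ignore file lives
p4 set P4IGNORE=.p4ignore

# .p4ignore (sketch)
Binaries/
DerivedDataCache/
Intermediate/
Saved/
.vs/
*.sln
*.VC.db
```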

So, for just yourself, I recommend making a zip file periodically.

If you have fewer than 5 people and not a huge project, use Git.

Five or more people and/or lots of large assets: use Perforce, but be prepared to spend a lot of time managing the repo.

Lack of proper enterprise grade source control in Unreal is, in my opinion, one of the largest weaknesses of the toolchain.

4 Likes