Git vs Perforce

Hi guys,

I need to share my project with collaborators. After gathering as much information as I could, I saw that the two main tools for that are Git and Perforce. Which one do you think is best for working with a small team on a UE4 project?

The main problem I see with Git is binary files - which are… necessary when making a videogame. I understand that you can use LFS, but it doesn’t seem like a great way of dealing with the problem… am I wrong?

From what I saw, Perforce is free for up to 5 users… but how much does the server that hosts your project cost? Or is it included for free in the Helix package?

I have a 17GB project (by getting rid of saves and some currently unused assets I could get it to 10GB or so… but we don’t want to limit the growth for our game).

I’d be very glad to hear your thoughts.

And thank you everyone for the help.


Git LFS is just great.

Can you elaborate? I tried Git LFS and it turned into complete hell. For example, I created a few large binary assets (imported a bunch of high-resolution assets to try in the project), made a commit, then removed them and made another commit. It was about 4GB of data. Of course, once they were removed they were no longer needed. The way Git works is not an issue for source code, since file sizes are negligible. But in terms of binary assets, just a couple of days of several artists working can easily generate a couple dozen gigabytes of data.

That’s where the big issue came in. Git LFS has a prune command, which removes outdated LFS files, but it is governed by a very convoluted set of confusing and borderline insane rules. So there was just no way to prune the old content. In less than a week we had reached the limit of our paid Git repository storage with absolutely no way to prune old LFS assets to make space for new ones.

Literally the only way to solve that issue was to remove Git completely and reinitialize a new repo from scratch. Of course that led to the loss of all previous version control data, which completely defeats the point of having a version control system in the first place.

I just don’t understand LFS. To me, it seems like if you add any large binary file, at any point in time, you are stuck with it forever, even if it’s way outdated and you will never need it again. So if you are working on a long-term game project - an actual serious AAA production with serious high-end assets, not some gimpy low-poly indie stuff - it’s easy to reach a repository of over a terabyte in just a couple of months, assuming there will be lots of iteration on the art assets. And hosting for repos of that size is nowhere near affordable, nor does it make any sense.

Thank you for your answer. What am I missing, what makes LFS great?

Thank you for sharing your experience. I haven’t tried LFS yet, but I suspected that such a cumbersome integration could cause problems like the ones you described.

Use Bitbucket or GitLab if you want to prune LFS files. Eternal LFS is just a GitHub issue, and it’s definitely not a problem for AAA projects :wink: Because: 1. devops can prune files if necessary. 2. space is not the issue; you can use a hosted Bitbucket solution in case of terabytes of data.

I actually hosted the repo on my own server, yet I still had no control over it. There was some ridiculous rule where files would not be pruned unless they were older than X days, and there was some override that was supposed to change that, but it never worked. So I’d like some more precise details than just “Use Bitbucket” - like some sort of guide on how to deal with Unreal + Git LFS and a substantial amount of large assets. You are really the first UE4 user I have ever seen who called Git LFS great for use with UE4, so if you have it figured out, I’d really love to know. SVN is a bit outdated and Perforce has obscure licensing policies. Git therefore seems the best option on paper, just not in reality :frowning:

The “ridiculous rule” for a VCS is the ability to “prune assets”. A version control system is all about integrity. Just buy more disk space, and “problem” solved :slight_smile:

Yes, and this is exactly why the majority of people hate Git for games, and almost no one recommends it. Whenever you ask specific questions, such as how to get more control over binary asset history and repo size, you never actually get a clear, detailed answer from someone who knows what they’re talking about. You rarely get any answer at all, and if you do, it’s the “just buy more disk space” kind of answer.

I am not even going to bother debunking that, because that answer ridicules itself. But it also serves as a great answer to the OP’s Git vs Perforce question. Definitely Perforce :slight_smile:

Yeap, that’s the way - just pick the thing that is really not the issue and build a critique around it :wink: You want to clean up space? Just do it. GitLab and Bitbucket allow it, even at the git-hook level.
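To make it concrete, here’s a rough sketch of the client-side knobs involved. These are stock git-lfs config keys and real flags, but the values are just an aggressive example - and note this only frees local space; reclaiming server-side storage is up to the host (e.g. GitLab housekeeping or Bitbucket’s LFS management):

```
# Sketch: tighten git-lfs's local retention windows, then prune.
git config lfs.fetchrecentrefsdays 0     # default 7: keep objects referenced by recent branches
git config lfs.fetchrecentcommitsdays 0  # default 0: keep objects from recent commits on those refs
git config lfs.pruneoffsetdays 0         # default 3: extra safety margin before pruning

git lfs prune --dry-run          # list what would be removed, without touching anything
git lfs prune --verify-remote    # only delete local copies the server confirms it still has
```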

For a small dev team I’d take TortoiseSVN over Git or Perforce. :cool:

A bit late, but let me add to the responses. I had the exact same issue as you did, wanting to delete a file from all history/commits in Git.
You can actually do it and get your space back. It’s just a bit of a painful process when you don’t know how to go about it, but it is doable.
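Roughly, the process looks like this - a sketch using git-filter-repo, with a made-up asset path and remote. It rewrites history, so run it in a fresh clone and have everyone re-clone afterwards:

```
# Sketch: purge one oversized asset from every commit with git-filter-repo.
pip install git-filter-repo
git filter-repo --invert-paths --path Content/OldHighResRocks.uasset

# filter-repo deliberately drops the 'origin' remote to prevent accidental pushes;
# re-add it before force-pushing the rewritten history.
git remote add origin https://example.com/team/project.git
git push --force origin main
```

After the force push, the orphaned LFS objects still sit on the server until the host garbage-collects them; how and when that happens depends on the host.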

I think for a small team where the project is indie and you optimise your assets, Git is probably great, especially if you are used to it. Perforce isn’t free, unfortunately, but you can run it on your own AWS EC2 free-tier instance for less than $10 a month.


If I were starting something right now, I think what I’d consider exploring would be to use Perforce as a depot for all your big binary stuff - audio, models, source textures, etc. - and then have a Git repo with submodules for the actual game working tree. I don’t necessarily know that it would work well, but it’s something I’d explore.
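In rough commands the split might look something like this - the depot paths and repo URLs are purely illustrative:

```
# Sketch: heavy binaries live in a Perforce depot, code lives in Git.
p4 sync //depot/MyGame/ArtSource/...            # pull audio, models, source textures
git clone https://example.com/mygame-code.git   # the actual game working tree
cd mygame-code
git submodule add https://example.com/shared-plugins.git Plugins/Shared
git submodule update --init --recursive
```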

Pretty much anything beats using p4 or plastic for source code. :expressionless:

Hey, I have two points to add about Git.

  1. Many don’t know about it, but you can also host a Git repository on Azure DevOps. There you have no limitations on Git LFS space, and it’s free for teams of up to 5 people.

  2. Regarding the local storage issue: we use Anchorpoint as our Git client. It supports sparse checkout, which allows you to check out only certain folders in your repository. For example, you can leave your Blender files in the repo but not download them if you don’t need them (a plain-Git sketch follows below).
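Under the hood this maps onto Git’s own sparse checkout (Git 2.25+); the repo URL and folder names here are made up:

```
# Sketch: partial clone + sparse checkout, so only chosen folders are materialized.
git clone --filter=blob:none --no-checkout https://example.com/team/project.git
cd project
git sparse-checkout init --cone
git sparse-checkout set Content/Maps Content/Blueprints   # e.g. skip the Blender source folder
git checkout main
```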

If you want to delete old versions, you can do that by deleting a folder and checking it out again. Then it will download only the last version from the server and not the complete history. This is a little bit cumbersome but it works.

Here in the picture I joined the repository but I didn’t download any files. When I click on the cloud icon, it does a folder checkout and also downloads only the latest version.


Isn’t the whole point of version control to keep your history? If you are running the Git LFS server, you have to expect huge amounts of space to be used. It’s only a problem you have created for yourself by deciding to run your own Git LFS server and then deciding you don’t want history. Yes, it will require some manual work if you want to remove file history; that is just part of running your own server.

Developers using your Git LFS server won’t have this problem. They only have the versions they worked on and can easily clear out the old ones. Disk space is not an issue for them.

So, just wondering: does this mean Perforce is not retaining a history of binary files? Won’t you just run into the same problem if you have to connect it to AWS for storage?

When it comes to binary files, the history is often maintained outside of the content folder. Artists, for example, keep many different versions of the source 3D or 2D files backed up, and can at will open any older version, make arbitrary changes to it, and re-export it.

Yes, with Git LFS you expect your repo to be big, but within reason. I mean, even the Git LFS developers realized it’s an issue, and that’s why the “prune” command exists.

So Git LFS by design assumes that sooner or later you will have to clean up outdated versions of your binary files to keep the repo size manageable.

Rawalanche (Marketplace Creator) wrote:

> Yes, with Git LFS you expect your repo to be big, but within reason. I mean, even the Git LFS developers realized it’s an issue, and that’s why the “prune” command exists.

Wouldn’t that same situation happen with Perforce? (I’ve never used it, forgive my ignorance.) Does it not keep a local history of binary files so you can compare or switch versions?

Perforce is centralized, which means you can’t do much without a connection to the server, including comparing with older revisions. You only have one revision locally.
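When you do need an older version, you ask the server for it explicitly. A sketch with illustrative depot paths:

```
# Sketch: in Perforce, older revisions are fetched from the server on demand.
p4 sync //depot/MyGame/Content/Hero.uasset#3   # revision 3 of a single file
p4 sync //depot/MyGame/...@1234                # the whole tree as of changelist 1234
```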

Does that mean you have to download each time you want to switch back and forth between branches that use different versions of assets?

For a small team, SVN is just good. Although some will say it is archaic, it gets the job done well. A lot of people are content with just add, remove, update… that is all. I have been using SVN for more than 15 years, and it has served all types of projects well.

If the team is geographically distributed, just subscribe to a private SVN cloud. Or you can rent a VPS for $5 per month and host the SVN server yourself.
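The day-to-day loop really is just this (the server URL is made up):

```
# Sketch: the basic SVN workflow.
svn checkout https://svn.example.com/mygame/trunk mygame
cd mygame
svn update                              # pull the latest changes
svn add Content/NewAsset.uasset         # register a new file
svn commit -m "Add new asset"           # send it to the server
```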


I couldn’t agree more. Once you’ve learned how to set it up, it’s a breeze. I’ve been using it in production for over a year now, hosting the engine source code, dependencies, and my project all in one repository on Microsoft Azure DevOps’ free, unlimited LFS storage. My project is over 500GB and the builds are 100% reproducible after cloning the repository.
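For anyone setting this up, the LFS side boils down to tracking rules in .gitattributes. A typical set for UE binary formats might look like this - the patterns are an assumption, not necessarily my exact setup:

```
# Sketch: register common UE binary formats with git-lfs
# (each 'track' call writes a filter=lfs rule into .gitattributes).
git lfs install
git lfs track "*.uasset" "*.umap"
git lfs track "*.fbx" "*.wav" "*.png" "*.psd"
git add .gitattributes
```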

Here is an article and the script I’ve developed specifically for this.
