
We're a team thinking of making a game using the Unreal Development Kit and we're looking around for version control solutions.

I have always preferred decentralized VCSs like Git and Mercurial, and have used them for all my personal projects. However, I have heard of problems with using these systems for game development, since they are reportedly not well suited to big binary files.

Subversion seems like a good solution; however, I have not used it at all in the past, so I don't really know what it offers.

  • should this be moved to game-dev to get a more representative response? – prusswan Nov 27 '11 at 13:27
  • I'd recommend separating large binaries from the source. – Michael Krelin - hacker Nov 27 '11 at 13:36
  • @Michael At first it sounded like a good solution. Then I realised that there are a lot of connections between the binaries and the source. That would potentially make them harder to keep in sync with each other. –  Nov 27 '11 at 13:38
  • I think *most* of the time (like percentage of commits) it's not an issue. I also think that git submodules work pretty well once all the tricks particular to your devs and dev process are properly employed. The usual thing of learning curve, in short. – Michael Krelin - hacker Nov 27 '11 at 13:42

5 Answers


You can use Mercurial or Git without a problem. I don't know why @TomTom states otherwise so emphatically without giving any reasons.

Sure, you will have to manage the various binary files that come with every game development project (movies, textures, maps, etc.). But both tools have ways to handle them.

Mercurial

Mercurial has various extensions to deal with large files, especially binary ones.

Since version 2.0, the largefiles extension has been bundled with Mercurial, so you can use it without downloading anything else. Large files are not stored directly in the repository; instead, Mercurial tracks which version of each file belongs to each changeset, so you can check out a precise version of your game with the right files when you need them.
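As a sketch, enabling the bundled extension is a small hgrc change. The size threshold and file patterns below are assumptions for a UDK-style project; adjust them to your actual assets:

```ini
; hgrc -- enable the largefiles extension bundled with Mercurial >= 2.0
[extensions]
largefiles =

[largefiles]
; automatically track any newly added file over 10 MB (illustrative threshold)
minsize = 10
; always track these asset types as largefiles (illustrative patterns)
patterns = **.udk
    **.upk
    **.wav
```

Files matching these rules go into the largefiles store instead of normal history; `hg add --large FILE` forces the same behaviour for a single file.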

Before 2.0, two extensions were mainly used:

  1. Bfiles extension, which served as the base for the new bundled extension in 2.0
  2. Bigfile extension, which is a little more complicated than the others, in my opinion

Git

I'm not really familiar with Git's options for handling large files, but I know various solutions exist. This post should give some good pointers: Managing large binary files with git

I've heard a lot of good things about git-annex.
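As a rough sketch of the git-annex workflow (the commands are real git-annex commands, but the paths are hypothetical and I have not tried this on a UDK project):

```
$ git init game && cd game
$ git annex init "workstation"
$ git annex add Content/Movies/intro.bik   # commits a pointer; content goes into the annex
$ git commit -m "Add intro movie via git-annex"
$ git annex get Content/Movies/intro.bik   # on another clone, fetch the actual content
```

The point is the same as with largefiles: history stays small, and each clone downloads only the binary content it actually needs.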

SVN

I've used SVN with large files, and I don't have the impression that, out of the box, it handles them any better than Mercurial does without extensions.

If you decide to use an extension to manage large files with Mercurial or Git, Subversion is way behind in terms of functionality.

Conclusion

About your concern with keeping the binary file versions in sync with the source: both the Mercurial and Git extensions handle this very well.

Each version of the binary files is stored in a separate centralized directory (the largefiles store, in Mercurial terms), and when a dev clones the repository or updates to a particular changeset, only the needed files are downloaded from this store.

You can even use the same store for multiple repositories!

The process of adding a file to the store or updating it is nearly transparent to the developer, so there is no problem with a long learning curve, for example.
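For example, with largefiles enabled the day-to-day commands stay exactly the same (the server URL and revision name here are hypothetical):

```
$ hg clone http://hg.example.com/game      # pulls history; largefiles come from the store on demand
$ hg update -r milestone-2                 # downloads only the binaries that changeset needs
$ hg add --large Content/Maps/level01.udk  # explicitly track one file as a largefile
$ hg commit -m "New level map"
```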

To conclude, I really don't see why Git or Mercurial would not be up to the task, and no other answer gives any real reason.

In my opinion, big game companies stick with older tools because change takes time, and when you've paid for something, you use it! How many companies out there are still using VSS or CVS just because management won't hear of anything else? We just migrated from VSS to TFS last month where I work...

krtek
  • With Mercurial or Git I would create a separate repository for your art assets and include it as a subrepository. Or make them entirely separate and let a build process bundle them into binaries that are dynamically downloaded when compiling and packaging your game. Then probably you won’t even need to use the large files extension. – Laurens Holst Nov 28 '11 at 10:52
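A minimal sketch of the subrepository idea from the comment above, using Git submodules. The repository names are invented, and a real setup would point at a hosted assets remote rather than a local path:

```shell
set -e
# Stand-in for the art-assets repository (normally on a shared server).
git init -q assets-repo
git -C assets-repo -c user.email=dev@example.com -c user.name=dev \
    commit --allow-empty -q -m "initial assets"

# The code repository pins an exact assets commit via a submodule,
# so every code changeset records which asset revision it needs.
git init -q game
git -C game -c protocol.file.allow=always \
    submodule add -q "$PWD/assets-repo" assets
git -C game -c user.email=dev@example.com -c user.name=dev \
    commit -q -m "Pin assets as a submodule"
git -C game submodule status   # shows the pinned assets commit
```

Programmers who never touch the art can skip updating the submodule, which addresses the "huge repository on every machine" objection.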

I'd stick with Git. What you may lose in storing large binary files, you'll more than gain in being able to work together as a team. You'll be updating, changing, and merging your code a lot more than just storing binary files, and DVCSs handle this a lot better.

Abizern
  • I have heard about memory problems with tracking such large files though, won't that be a huge problem for us? –  Nov 27 '11 at 13:21

Hg (Mercurial) 2.0's support for big files is great.

Previously it was the separate Largefiles extension, but since version 2.0 it has been distributed with the core.

Dima Pasko

I have always preferred decentralized VCSs like Git and Mercurial,

TOTALLY unusable. Period.

The trick here is that you cannot have one distributed repository the moment your repository becomes significant in size, and most of that data is likely irrelevant to most people.

If you do game development, you will have tons of version-controlled assets that programmers have ZERO interest in. Even most graphics people won't care. DVCSs break apart the moment you can expect your repository to be larger than a typical hard disc. And depending on how good you are graphics-wise, you can easily hit a thousand-plus GB of repository. The idea of having to merge all of that into my local repository, as a programmer not dealing with the graphics except small test graphics, makes me cringe - as does it drive the network admin mad.

This gets even worse when you properly deal with animations / movies on top. Every change and render is another version. Boom. This is bad enough to handle in a central repository (where I can add discs to a central server); it is totally unusable the moment you distribute that to every team member IN EVERY VERSION.

Subversion seems like a good solution,

Possibly, if you can deal with its stupidities.

I would strongly recommend a proven solution here - Perforce has proven it can deal with exactly this.

TomTom
  • Nothing prevents you from having multiple separate Git repositories for each separated part… – poke Nov 27 '11 at 13:29
  • Sure, Perforce looks good. The only problem with it is its cost. I wouldn't really say this game is something super-serious, so we're not spending loads of money on it. Then our only real option would be SVN, am I right? –  Nov 27 '11 at 13:33
  • Hm, no. There should be alternatives out there. A lot depends on your platform, etc. Personally, I would go and sign up for MS BizSpark and get everything you need for 3 years for 100 USD ;) But then... yeah, SVN looks quite good. Even then you will possibly run into size issues on the server. – TomTom Nov 27 '11 at 14:03
  • @TomTom why are Git and/or Mercurial "TOTALLY unusable. Period."? There are a lot of ways for both of them to handle large files without a glitch, and aside from this particular point I really don't see the problem. Both of them provide extensions to manage large binary files and let the dev download them or not. – krtek Nov 27 '11 at 23:58
  • @Gustav I wouldn't say Perforce's *only* problem is price. I'd also say it's overly complex and tends to get in the way a lot more. That said, it is pretty quick with large repos. Personally I'd be in the Mercurial + LargeFiles camp, but I'm aware it's a recent addition. – Paul S Nov 28 '11 at 01:37

Big game studios generally use http://www.perforce.com/ (to be honest, I have no idea why.)

SVN is a pretty good option as well, but it's a bit outdated compared to Git and Mercurial.

Anyway, why would you put large binary files in the repository? You'll definitely want to add ignore lines for all compiled binary files, etc. Always make sure to only include source files :-)

Personally I'd stick with Mercurial or Git.
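To sketch the ignore-lines advice, here is a hypothetical .gitignore for a UDK project. The exact paths are assumptions about your layout; the same glob patterns work in Mercurial's .hgignore:

```shell
# Write ignore rules so compiled/packaged output never enters the repo;
# source files (code, textures, maps) stay tracked.
cat > .gitignore <<'EOF'
# compiled script packages - regenerated by the UDK build
UDKGame/Script/*.u
# packaged installers and logs
*.exe
UDKGame/Logs/
EOF
cat .gitignore
```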

Tom van der Woerdt
  • The thing is, most content in a UDK game is binary, for example all map data. –  Nov 27 '11 at 13:20
  • If you leave the binary files out of the repository you lose the ability to be able to check out a complete project. If you change some graphics between versions, and then check out a previous version you've then lost all connection between the graphics and the code unless they are in the same repository. – Abizern Nov 27 '11 at 13:21
  • Ah, yes, true, but those files aren't that big. You will definitely notice that a Git/Mercurial repo grows big (an SVN one wouldn't), but that should be the only problem you'll face. – Tom van der Woerdt Nov 27 '11 at 13:21
  • @Abizern: I only meant the compiled binary files. Source files should, of course, always be in the repository - that includes the textures. – Tom van der Woerdt Nov 27 '11 at 13:22
  • Still. I know teams that even put the compilers into the repository. Film renderings etc. must go in. All source data (textures etc.). Don't check them in and you lose significant data. – TomTom Nov 27 '11 at 13:24
  • ... again, source data must always be in the repo or you lose data. You just don't want to include the compiled versions such as the massive installer UDK creates. – Tom van der Woerdt Nov 27 '11 at 13:26
  • Both Mercurial and Git have extensions to manage large binary files and keep them in sync with the source code. These files are however not stored directly in the repository, to avoid the growth problem. See my answer for more details. – krtek Nov 28 '11 at 00:01