
My code is compiled by multiple people (multiple machines) multiple times per day. I would like to have each compile automatically uploaded to our Github account each time someone completes a compile. Basically, these compiled zips get sent to actual hardware via either flash drive or email or dropbox (any number of ways based on many conditions). Since the zip is always named the same, sometimes the old version is deleted on the device, sometimes stored in an /old directory. I would like to stop losing old versions and retain a central repository of each version stored chronologically. Github seems the perfect place for that.

I could of course ask each user to upload the finished zip that they created to a central location, but I would like for it to be an automatic process if possible. So - does Github offer a feature like that?

Davek804

3 Answers


Github seems the perfect place for that.

Not really, since putting large binaries in a distributed repo (i.e., a repo which is cloned around in its entirety) is not a good idea.

To get a specific version of your binary, you would have to clone your "artifacts" repo from GitHub before being able to select the right one to deploy.
And with multiple deliveries per day, that repo would get bigger and bigger, making each clone take longer.
However, if you have only one place to deploy to, it is true that a git fetch would only retrieve the new artifacts (incremental update).

But:

  • GitHub doesn't offer unlimited space (and again, that repo would grow rapidly with all the deliveries)
  • cleaning a repo (i.e., deleting old versions of binaries you no longer need) is hard.

So again, using GitHub or any other DVCS (Distributed VCS) for delivery purposes doesn't seem adequate.

If you can set up a public artifact repository like Nexus, then you will be able to deliver as many binaries as you want, and you will be able to clean them up (delete them) easily.
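To make the idea concrete, here is a minimal sketch of what that delivery step could look like at the end of a build, assuming a Nexus 3 "raw" hosted repository; the server URL, repository path, and credentials are placeholders, not anything from the question:

```python
# Hypothetical sketch: push the freshly built zip to a Nexus "raw" hosted
# repository over HTTP.  Server URL, repository name and credentials are
# placeholders -- adjust to your own Nexus instance.
import datetime
import requests  # third-party: pip install requests

NEXUS_URL = "https://nexus.example.com/repository/build-artifacts"  # placeholder
USER, PASSWORD = "ci-user", "secret"                                # placeholder

def upload_build(zip_path: str) -> None:
    # Timestamp the artifact so old versions are never overwritten.
    stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
    target = f"{NEXUS_URL}/firmware/firmware-{stamp}.zip"
    with open(zip_path, "rb") as fh:
        resp = requests.put(target, data=fh, auth=(USER, PASSWORD))
    resp.raise_for_status()
    print(f"Uploaded to {target}")

if __name__ == "__main__":
    upload_build("firmware.zip")
```

Because each upload gets a unique, timestamped name, old versions accumulate in one central place and can be deleted from the Nexus UI whenever they are no longer needed.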

VonC

GitHub has the concept of files attached to a repository (not actually stored in the repo itself - they're kept on S3), and there's an API for uploading files to it.

You could call that API as part of your build process.

If you have a continuous integration server building your code after every commit, you should be able to get it to store the build products somewhere, but you might have to handle the integration yourself if you want them stored on GitHub (as opposed to on the CI server's hard disk).
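The exact upload mechanism has changed since this was written (the old attached-files/Downloads API was later retired in favour of Releases), but the idea is the same: a small script run at the end of the build pushes the zip to GitHub. A rough sketch using the current Releases API, assuming a personal access token in an environment variable; the owner, repo, and tag scheme are placeholders:

```python
# Hypothetical sketch: attach a build zip to a GitHub release at the end of
# a build.  OWNER, REPO and the token are placeholders.  Uses the Releases
# API, the successor of the repository "downloads" mechanism described above.
import datetime
import os
import requests  # third-party: pip install requests

OWNER, REPO = "your-org", "your-repo"        # placeholders
TOKEN = os.environ["GITHUB_TOKEN"]           # personal access token
API = f"https://api.github.com/repos/{OWNER}/{REPO}"
HEADERS = {"Authorization": f"token {TOKEN}"}

def publish_build(zip_path: str) -> None:
    tag = datetime.datetime.now().strftime("build-%Y%m%d-%H%M%S")
    # 1. Create a release (this also creates the tag on the default branch).
    release = requests.post(f"{API}/releases",
                            headers=HEADERS,
                            json={"tag_name": tag, "name": tag}).json()
    # 2. Upload the zip as a release asset.
    upload_url = (f"https://uploads.github.com/repos/{OWNER}/{REPO}"
                  f"/releases/{release['id']}/assets?name=build.zip")
    with open(zip_path, "rb") as fh:
        resp = requests.post(upload_url,
                             headers={**HEADERS,
                                      "Content-Type": "application/zip"},
                             data=fh)
    resp.raise_for_status()

if __name__ == "__main__":
    publish_build("build.zip")
```

Run from a CI job (or even a local post-build hook), this keeps every build available chronologically under the repo's releases without the binaries ever entering the git history.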

Frederick Cheung

While GitHub is perfect for collaborating on the sources, it is not meant for managing the build and its artifacts. You may eventually want to look at companies like Cloudbees, which provide hosted build and integration environments that target exactly the workflow parts beyond source management. But those are mostly targeted towards Java development, which may or may not fit your needs.

Besides that, if you really only want to have a lot of time-stamped zip files from your builds accessible by a lot of people, wouldn't a good old-fashioned FTP server be enough for your needs, maybe?
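If the worry is that developers would have to name and upload the zip by hand, a small post-build script could do both automatically, for example with Python's standard ftplib; the host, credentials, and directory below are placeholders, just a sketch of the idea:

```python
# Hypothetical sketch: push the build zip to an FTP server under a
# timestamped name, so nobody has to rename or upload anything by hand.
# Host, credentials and directory are placeholders.
import datetime
from ftplib import FTP

def upload_to_ftp(zip_path: str) -> None:
    stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
    remote_name = f"firmware-{stamp}.zip"        # unique name per build
    with FTP("ftp.example.com") as ftp:          # placeholder host
        ftp.login("builduser", "secret")         # placeholder credentials
        ftp.cwd("/builds")                       # central drop directory
        with open(zip_path, "rb") as fh:
            ftp.storbinary(f"STOR {remote_name}", fh)

if __name__ == "__main__":
    upload_to_ftp("firmware.zip")
```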

Bananeweizen
  • I think an FTP server is probably a reasonable bet. I just wish I didn't have to ask my developers to format a filename in a certain way and always remember to upload their compiles into an FTP. Life's never perfect though, so maybe they'll have to do that and I'll have to settle for the non-centralized source (non-github, that is). – Davek804 Jul 04 '12 at 03:46