
In a scenario where Git is used as a backup tool, say with a daily cron job that commits and pushes to a remote, is it possible to force Git to keep only the latest 30 commits (both locally and on the remote)? That is, to permanently remove (both locally and on the remote) all commits older than the latest 30, or older than a certain date?

elixenide
Antonello

2 Answers


What you are after sounds a lot like shallow clones. For example, in your case you would want to use git clone --depth 30 ... to clone the remote repository.
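
A minimal sketch of that workflow (the remote URL and directory name are placeholders):

# clone only the 30 most recent commits of the default branch
git clone --depth 30 https://example.com/backup.git backup-clone

# keep the clone limited to recent history on later updates
cd backup-clone
git fetch --depth 30 origin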

You cannot clone from a shallow repository though, so the remote repository would still need to maintain the complete repository history.

If you don't want to maintain the complete history in at least one location, then Git isn't the right tool for the job.

cdhowie
  • I think you are right. I found rsync to be much better suited for this task. It handles incremental backups using hardlinks, which is a very clean solution, as the incremental backups appear as full backups. Here is a link to a Python script that I wrote to perform the "rolling latest n backups" task using rsync: http://lobianco.org/antonello/personal:portfolio:rsync-python-script – Antonello Jul 26 '14 at 18:27
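
The rolling hardlink scheme described in the comment above is typically built on rsync's --link-dest option; a minimal sketch, with placeholder source and backup paths:

# files unchanged since the previous backup are hardlinked rather than copied
rsync -a --delete --link-dest=/backups/daily.1 /data/ /backups/daily.0/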

You could try a solution similar to the one described in "How do I remove the old history from a git repository?":

#!/bin/bash
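# $1 is the commit that will become the new root commit
# (e.g. pass master~30 to keep roughly the latest 30 commits)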
git checkout --orphan temp $1
git commit -m "Truncated history"
git rebase --onto temp $1 master
git branch -D temp

# The following 2 commands are optional - they keep your git repo in good shape.
git prune --progress # delete all the objects w/o references
git gc --aggressive # aggressively collect garbage; may take a lot of time on large repos

And then force-push that branch.
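
A sketch of that last step, assuming the remote is named origin and the branch is master:

# overwrite the remote branch with the truncated history
git push --force origin master

The old commits only become unreachable on the remote at this point; they are actually removed once the server garbage-collects them.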

VonC