
I have a 6 GB Git repo that's taken me a while to piece together from a combination of local and production files. I've added/committed/pushed thousands of images into the repo from production, but I already have most of them on my local machine (from a previous backup), in the same directory locations as the production copies; they're just untracked.

I get the feeling that if I do a git pull in my dev environment it's not going to work because of the untracked files, but I'd like to avoid deleting them all and then downloading them all again from the GitHub repo. Is there a way to do this?

I haven't actually tried a pull, for fear of a massive merge problem. If it's going to be hard work, I'd rather just delete them all and live with downloading them again.
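For what it's worth, the safest first step I could think of was to split the pull into its two halves, since (as far as I know) a plain fetch never touches the working tree. A rough sketch, assuming my remote is named `origin` and my branch is `master`:

```
# Download the remote objects; the working tree is untouched
git fetch origin

# Compare my current commit against the remote branch
git diff --stat HEAD origin/master

# Merge only if that looks sane; Git aborts anyway if untracked
# files would be overwritten by the merge
git merge origin/master
```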

Rob
  • Make a backup of the dev directory you want to pull into, just in case something explodes. Git is pretty good about telling you if there is going to be a conflict, and it will abort a pull (fetch -> merge) if you have changes that could be overwritten. I would add/commit all those untracked files, then pull down: `git add -A`, `git commit -m "major merge"`, `git pull <remote> <branch>` (see the first sketch below). I would also avoid a rebase, because you will want a merge that big to be easily spotted in the timeline. – Matthew Blancarte Jun 11 '14 at 06:17
  • You could also stash those files: `git add -A`, `git stash`, `git pull <remote> <branch>`. Then, if you want to put them back onto the timeline, just pop them: `git stash pop`, `git commit -m "added stashed files"` (see the second sketch below). – Matthew Blancarte Jun 11 '14 at 06:23
  • I'm trying to avoid downloading the thousands of files I already have, and it seems I can't. The git pull is enormous, so I'm just going to delete the unmatched files and save myself the trouble, because it looks like they'll be downloaded whether they exist locally or not. – Rob Jun 11 '14 at 06:24
  • Ah, gotcha. Best of luck! – Matthew Blancarte Jun 11 '14 at 06:27
  • That seems too big for a single Git repo to manage efficiently (see the Git limits at http://stackoverflow.com/a/19494211/6309). Did you consider splitting it into several Git repos and using submodules (see the third sketch below)? – VonC Jun 11 '14 at 06:52
  • @VonC The post you've referenced appears to relate to large files, not a large repository; most of the files in the repo I'm referring to are no more than 500 KB. – Rob Jun 11 '14 at 10:25
  • @Rob It relates to both large files and large repos, as in a large number of files. – VonC Jun 11 '14 at 10:44
  • @VonC OK, sorry, I missed the second part, but it's not an issue: the Git repo is running perfectly with about 90k files. I was just trying to save some bandwidth. The code base has been deprecated and is a mess (hence being deprecated), so splitting it into submodules would be pain with no gain at this late stage in its existence. – Rob Jun 11 '14 at 14:29
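To make the first suggestion above concrete, here is a minimal sketch of the commit-then-pull approach. The remote name `origin`, the branch `master`, and the `my-project` path are all assumptions; substitute your own:

```
# Safety net first: copy the whole working directory somewhere
cp -r my-project my-project-backup   # "my-project" is a placeholder path

cd my-project

# Stage everything, including the thousands of untracked images
git add -A

# Commit them so the merge has a local side to reconcile
git commit -m "major merge"

# Fetch and merge the remote branch; Git aborts rather than
# overwrite local changes it can't merge cleanly
git pull origin master
```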
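And a sketch of the stash variant, under the same `origin`/`master` assumptions. One caveat the comment glosses over: `git stash pop` restores the files to the working tree unstaged, so they need another `git add` before the follow-up commit:

```
# Stage and stash the untracked files
git add -A
git stash
# (on newer Git, `git stash --include-untracked` does both in one step)

# Pull with a clean working tree
git pull origin master

# Restore the stashed files and put them back on the timeline
git stash pop
git add -A
git commit -m "added stashed files"
```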
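Finally, for completeness, a minimal sketch of the submodule split VonC suggests. The repo URLs and the `images/` path are hypothetical; this only shows the shape of the approach, not a recommendation for this particular code base:

```
# Inside the main repo: mount a (hypothetical) images-only repo
# as a submodule at images/
git submodule add https://github.com/example/images.git images
git commit -m "track images as a submodule"

# Fresh clones then fetch the images only when asked to
git clone https://github.com/example/main.git
cd main
git submodule update --init
```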

0 Answers