
I'm having issues getting Git LFS to track my large files properly (similar issue reported here: Git LFS refused to track my large files properly, until I did the following).

In my specific case, I am trying to push a directory composed of multiple subdirectories, each of which has files of a specific type that I would like to track. The extensions for these file types are .bed, .Bed, and .sorted. Here was the recipe that I followed:

I did git add ., then git commit -m "initial commit", then issued the respective tracking commands (e.g., git lfs track "*.bed"), and then did git push origin master.

However, I still received multiple error messages ending in "this exceeds GitHub's file size limit of 100.00 MB". I've already gone through practically every available Stack Overflow post on this topic (e.g., git lfs not working properly for files larger than 100MB), so any advice would be greatly appreciated.
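For reference, the full sequence I ran looked like this (the repository path is a placeholder):

```shell
cd my-project                        # placeholder for the actual repo directory

# 1. Stage and commit everything first
git add .
git commit -m "initial commit"

# 2. Only afterwards set up LFS tracking for each extension
git lfs track "*.bed"
git lfs track "*.Bed"
git lfs track "*.sorted"

# 3. Push
git push origin master
```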

– warship

3 Answers


Based on these instructions, you first need to track, and only then can you add and commit. I followed a similar set of instructions for Bitbucket and it worked.

When you add and commit before tracking, you are probably committing the files to your regular Git repository instead of storing them in LFS.
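The working order, assuming the remote already exists and the extensions from the question, is roughly:

```shell
# Install the LFS hooks in this clone (once)
git lfs install

# 1. Tell LFS which patterns to manage BEFORE staging the large files
git lfs track "*.bed" "*.Bed" "*.sorted"

# 2. Commit the .gitattributes file that `git lfs track` wrote
git add .gitattributes
git commit -m "Track large file types with Git LFS"

# 3. Now stage the data; the LFS clean filter stores pointers in Git
git add .
git commit -m "Add data files as LFS objects"

# 4. Verify what LFS manages, then push
git lfs ls-files
git push origin master
```

If the large files were already committed before tracking, they are plain Git blobs and re-tracking alone won't fix them; see the other answers for removing them from the index or the history.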

– coelhudo

  • Did you remove the file from the tree before adding it again? You can do this using `git rm --cached <filename>`. Then proceed with `git add file.bed` and `git commit`. – coelhudo Oct 22 '16 at 04:02
  • Well I followed the GitHub instructions perfectly, and then when it was time for `git push origin master`, I received: `Git LFS: (0 of 671 files, 100 skipped) 0 B / 8.88 GB, 2.84 GB skipped`, then a `U` (whatever that means!), and then the same `Git LFS: (0 of 671 files, 100 skipped) 0 B / 8.88 GB, 2.84 GB skipped` line over and over. Then the program hangs: nothing happens whatsoever, so I'm forced to hit Ctrl-C to break out. Note that I do successfully get output from `git lfs ls-files`, so I've narrowed my problem down to the last step. What could be wrong? – warship Oct 22 '16 at 08:58
  • I did some tests and it worked here, but only with files smaller than 100MB. I was not able to reproduce the file being skipped. However, the error that you described is related to the 100MB limit. [Here](https://help.github.com/articles/working-with-large-files/) it says that 100MB is a hard limit and no push is allowed in such conditions. Have you tried to push smaller files? – coelhudo Oct 23 '16 at 05:19
  • But isn't that the point of Git LFS, i.e., to push files larger than 100 MB (via pointers)? – warship Oct 23 '16 at 18:01

I had exactly the same issue. I later realized that the large files were indeed uploaded via LFS, just as the "....skipped..." output showed, but the large files still existed in the Git commit history. I solved it with git filter-branch --index-filter 'git rm -r --cached --ignore-unmatch <file/dir>' HEAD (note that you need to replace <file/dir> with the path of your large file). This filters the large file out of your history. After this, I could git push origin master with no more errors.
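As a sketch (the path is a placeholder, and since `git filter-branch` rewrites history, try it on a backup clone first):

```shell
# Remove every historical revision of the oversized path from all commits
# reachable from HEAD; --ignore-unmatch lets commits without it pass through
git filter-branch --index-filter \
  'git rm -r --cached --ignore-unmatch path/to/large.bed' HEAD

# If the old history was never pushed, a plain push now works;
# if it was already on the remote, a force push is required instead
git push origin master
```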

– Rafael

If you have to remove all cached files because you have quite a few large files then you can use:

git rm -r --cached .

Make sure you include that period to signal you want all files uncached.
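Put together with re-tracking, the recovery looks something like this (assuming the LFS patterns from the question):

```shell
# Untrack everything from the index; the files stay on disk
git rm -r --cached .

# Make sure the patterns are tracked, then re-stage so the files
# go through the LFS clean filter this time
git lfs track "*.bed" "*.Bed" "*.sorted"
git add .gitattributes
git add .
git commit -m "Re-add files through Git LFS"
```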

– cmcnphp