I have large files in my repository and have been attempting to use the new Git LFS system.
I posted this question: Git lfs - "this exceeds GitHub's file size limit of 100.00 MB".
Edward Thomson correctly identified my issue: you cannot apply LFS retroactively to files that are already in your history. He suggested I use the BFG's LFS support.
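For reference, the conversion command I ran was roughly the following (the '*.csv' pattern and the repo name are just what applies in my case):

    java -jar bfg.jar --convert-to-git-lfs '*.csv' my-repo.git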
This worked to a degree: the vast majority of my files were converted. However, there were protected commits that were not altered.
Some of the files in these protected commits were over 100.00 MB, so pushing still triggered a remote: error from GitHub:
Protected commits
-----------------
These are your protected commits, and so their contents will NOT be altered:
* commit c7cd871b (protected by 'HEAD') - contains 165 dirty files :
- Directions_api/Applications/LTDS/Cycling/Leisure/l__cyc.csv (147.3 KB)
- Directions_api/Applications/LTDS/Cycling/Work/w_cyc.csv (434.0 KB)
- ...
WARNING: The dirty content above may be removed from other commits, but as
the *protected* commits still use it, it will STILL exist in your repository.
If you *really* want this content gone, make a manual commit that removes it,
and then run the BFG on a fresh copy of your repo.
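If I am reading that warning correctly, the intended workflow would be something like the following. This is only my guess at what the BFG wants, and the file path and clone URL are placeholders for my actual repository:

    # remove the oversized files from HEAD, so the latest commit no longer references them
    git rm path/to/oversized-file.csv    # placeholder path
    git commit -m "Remove oversized files before re-running the BFG"
    git push

    # then run the BFG against a fresh mirror clone
    cd ..
    git clone --mirror git@github.com:<user>/<repo>.git my-repo.git
    java -jar bfg.jar --convert-to-git-lfs '*.csv' my-repo.git

Is that right?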
First of all, can someone explain why these commits are protected and how they differ from the ones the BFG successfully changed?
Secondly, how can I unprotect these commits and allow the BFG to rewrite them, so that I can use LFS correctly and finally push successfully to GitHub?
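For what it's worth, the BFG usage text mentions a --no-blob-protection option. Is something like this the intended way to let it rewrite the files referenced by HEAD as well, or is it dangerous here?

    java -jar bfg.jar --convert-to-git-lfs '*.csv' --no-blob-protection my-repo.git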