
I'd like to know if it's possible to minimize the download times of large files when using Git LFS.

Specifically, the following scenarios:

  • keep files when switching branches
  • fetch files from a different repository on the same network (which would be faster than accessing the remote "master" server)

[I know git annex has better support for these features, but its Windows support is problematic.]


1 Answer


To the best of my knowledge, Git LFS does keep files when switching branches: it is checksum-based and caches every blob locally under .git/lfs/objects once it has been downloaded, so switching branches reuses the local copy instead of fetching it again.
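You can see the local cache in action with a quick sketch (Unix shell assumed; other-branch is just a placeholder branch name):

# Download the LFS objects referenced by the current checkout
git lfs fetch

# The blobs are cached content-addressed (by SHA-256) under .git/lfs/objects
find .git/lfs/objects -type f

# Switching branches only swaps the pointer files; blobs already in the
# local cache are re-linked rather than downloaded again
git checkout other-branch
git lfs checkout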

As for pointing LFS to a different endpoint, that's already supported: in your .git/config you can modify the LFS URL the remote points to:

[remote "origin"]
url = https://...<repo_url>
fetch = +refs/heads/*:refs/remotes/origin/*
lfsurl = "https://<another repo that's closer to you>"
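If you prefer, the same setting can be applied from the command line, or shared with your team by committing a .lfsconfig file at the repository root (the mirror URL below is a placeholder):

# Per-clone setting, equivalent to editing .git/config by hand
git config remote.origin.lfsurl "https://lfs-mirror.example.com/org/repo"

# Or, in a committed .lfsconfig file at the repo root:
[lfs]
    url = "https://lfs-mirror.example.com/org/repo"

Committing the .lfsconfig variant means everyone who clones the repository picks up the nearby endpoint without manual setup (a setting in the local .git/config still takes precedence).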

There are also several services that provide LFS support, such as Artifactory, GitHub Enterprise, and Bitbucket, so depending on your use case you can keep the storage on your local corporate network.

You might find this issue's conversation helpful as well.
