I've read on various internet resources that Git doesn't handle large files very well, and that Git also seems to have problems with large overall repository sizes. This seems to have motivated projects like git-annex, git-media, git-fat, git-bigfiles, and probably even more...
However, after reading Git Internals, it looks to me like Git's pack file concept (with its delta compression) should solve all the problems with large files.
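For reference, this is how I understand one could check whether pack files actually delta-compress binary blobs (a rough sketch; the pack file name below is just whatever `git gc` happens to produce):

```
# Repack everything, then list the pack's contents with delta information.
git gc
git verify-pack -v .git/objects/pack/pack-*.idx | head -n 20
# Output columns: SHA-1, type, size, size-in-pack, offset,
# and, for deltified objects only: delta depth and base SHA-1.
# Blobs that never appear with a depth/base were stored whole,
# i.e. each revision costs its full size in the pack.
```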
Q1: What's the fuss about large files in Git?
Q2: What's the fuss about Git and large repositories?
Q3: If we have a project with two binary dependencies (e.g. around 25 DLL files, each around 500 KB to 1 MB) which are updated on a monthly basis, is this really going to be a problem for Git? Is only the initial clone going to be a long process, or is working with the repository (e.g. switching branches, committing, pulling, pushing, etc.) going to be an everyday problem?
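Back-of-the-envelope, if the binaries don't delta-compress at all, that scenario adds roughly 25 × 1 MB × 12 ≈ 300 MB of new blob data per year. Here is a rough sketch of how I'd simulate it to measure the actual pack growth (the file names and the use of random data are just stand-ins for the real DLLs):

```
# Simulate 12 monthly updates of 25 ~1 MB DLLs and measure the pack size.
mkdir dll-test && cd dll-test && git init -q

for month in $(seq 1 12); do
    for i in $(seq 1 25); do
        # Fresh random data approximates the worst case: incompressible
        # binaries that share nothing with the previous version.
        head -c 1048576 /dev/urandom > "dep$i.dll"
    done
    git add . && git commit -qm "monthly update $month"
done

git gc                   # repack everything into a single pack file
git count-objects -vH    # report the resulting pack size
```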