I have a Git repository in a shared folder on a Windows PC, which I am accessing via a UNC path, e.g.:
git clone //server/share/MyRepo.git
When I fetch changes from this repository over a VPN from home, it takes a very long time for git-upload-pack.exe to run. I realise that there is no server (as such) involved: my local PC is running all of the executables.
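For reference, the fetch itself is nothing special; I believe running it from Git Bash with tracing turned on would show the git-upload-pack.exe invocation, if that helps anyone diagnose it (origin here points at the UNC path above):

GIT_TRACE=1 GIT_TRACE_PACKET=1 git fetch origin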
The name of git-upload-pack.exe suggests to me that my local PC is reading files from the remote file share in order to upload them somewhere, but the only place to upload them to would be itself, which makes no sense. This in turn leads me to think that the fetch is nowhere near as performant as it could be: it's as if the local machine is doing all the work to cut the data down for transfer, but to do that it has to transfer all the data in the first place.
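My rough mental model, which is an assumption on my part rather than anything I've found documented, is that for a file-system remote the fetch boils down to my PC running both halves of the pack protocol itself, something like this from Git Bash inside the local clone:

# the client half of the protocol, pointed at the UNC path...
git fetch-pack --all //server/share/MyRepo.git
# ...which in turn spawns the "server" half on this same machine:
# git upload-pack //server/share/MyRepo.git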
Can anyone shed some light on how this works? Is performance as good as possible without running a true Git server via SSH or whatever at the remote end, or are files being transferred back and forth unnecessarily?
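For concreteness, by a "true Git server" I mean something like running git daemon on the office PC. I haven't tried this, and C:/share is just my placeholder for wherever the share lives on that machine's disk:

# on the office PC: serve every repository under C:/share via git://
git daemon --base-path=C:/share --export-all --reuseaddr

# from home (git:// is unauthenticated, so this is illustration only):
git clone git://server/MyRepo.git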