
I have 6-8 GB binary files that need to be transferred from one server to another. Once the transfer completes and the file is fully downloaded at the destination, an event should be triggered.

Wondering if Git is a good option to accomplish this

  • 1
    Why not scp, rsync, ftp,...? There is a not insignificant overhead when using git to transfer files. – EncryptedWatermelon Oct 18 '19 at 12:15
  • My thought is to use Git and have a post-transfer hook call a local service to perform some activities on the uploaded binary. I would like the transfer to automatically trigger an action once the binary has uploaded successfully. If I use ftp or scp, I need to detect completion and take action by writing some monitoring tool, and I don't really know if that is possible. – Madhavarao Kulkarni Oct 18 '19 at 16:41
  • From the destination computer, couldn't you put it in a script? Copy the files, then execute some command. Or are the files sent based on an event? Look at inotify (see the sketch after these comments). https://stackoverflow.com/questions/4062806/what-is-the-proper-way-to-use-inotify – EncryptedWatermelon Oct 18 '19 at 17:22
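
A minimal sketch of that comment's idea, assuming Linux with the inotify-tools package installed; the watch directory and the follow-up command are hypothetical placeholders:

```python
import subprocess

WATCH_DIR = "/data/incoming"                      # hypothetical destination directory
TRIGGER_CMD = ["/usr/local/bin/process-binary"]   # hypothetical post-transfer action

# Block until some file in WATCH_DIR is closed after being written
# (inotify's close_write event), i.e. the writer has finished with it.
result = subprocess.run(
    ["inotifywait", "-e", "close_write", "--format", "%w%f", WATCH_DIR],
    capture_output=True, text=True, check=True,
)
finished_file = result.stdout.strip()

# Fire the follow-up action, passing the completed file as an argument.
subprocess.run(TRIGGER_CMD + [finished_file], check=True)
```

One caveat: close_write also fires if the sending side aborts and the receiver closes a partial file, so a real script would want some integrity check (size or checksum) before triggering.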

1 Answer


You have two different questions, one in your subject line and one in your text body:

Can Git be used to transfer large (6+ GB) files across a network link?

Yes, it can.

Wondering if Git is a good option to accomplish this

No, this is definitely not a good option. Its badness is somewhere between "somewhat bad" and "terrible", depending on how reliable your link is. If your link is extremely reliable, this is merely a somewhat-bad option. If your link is pretty unreliable, this is a terrible option, because Git transfers do not restart partway through. A commit either gets all the way across, or Git starts over from scratch.

Using rsync will enable you to transfer large files with restart support. See https://unix.stackexchange.com/questions/48298/can-rsync-resume-after-being-interrupted.
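
A sketch of that approach, assuming rsync and ssh are available on both ends; the paths, host, and post-transfer command are hypothetical placeholders:

```python
import subprocess
import time

SRC = "/data/out/big.bin"                     # hypothetical source file
DEST = "user@dest-server:/data/in/big.bin"    # hypothetical destination
# Hypothetical action to run on the destination once the file is complete.
TRIGGER_CMD = ["/usr/local/bin/process-binary", "/data/in/big.bin"]

# --partial keeps an interrupted file on the destination so the next
# attempt resumes instead of starting over; retry a few times.
for attempt in range(5):
    result = subprocess.run(["rsync", "--partial", SRC, DEST])
    if result.returncode == 0:
        # rsync returned success, so the whole file is in place;
        # trigger the follow-up action on the destination over ssh.
        subprocess.run(["ssh", "user@dest-server"] + TRIGGER_CMD, check=True)
        break
    time.sleep(10)  # brief pause before resuming the transfer
else:
    raise SystemExit("transfer failed after 5 attempts")
```

This covers both halves of the question in one place: a restartable transfer, and an event that fires only after rsync has confirmed the file arrived intact.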

torek