
I uploaded a video a few months ago to a git repo on OpenShift. Luckily, at the time the upload went fine.

Now the issue is that when I try to get that repo onto my local computer, it times out. I am running git clone xxxx.rhcloud.com, but it keeps timing out, so I never get all my files locally.

The video file is 400 MB. I am not sure of a workaround. I don't have branches, just master.

РАВИ

2 Answers


Is there any reason you didn't use $OPENSHIFT_DATA_DIR for storing video files? $OPENSHIFT_DATA_DIR is the recommended place to store large files on OpenShift. If you just need to get that file onto your local machine, you can use the rhc scp command to download it:

rhc scp --app <app_name> download ./ app-root/repo/<your_filename>.<extension>
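
If you would rather keep the video out of git entirely going forward, a minimal sketch (assuming the file currently sits under app-root/repo/ on the gear and that you can SSH in) would be:

rhc ssh <app_name>
# on the gear: move the video out of the repo checkout into persistent storage
mv ~/app-root/repo/<your_filename>.<extension> $OPENSHIFT_DATA_DIR/

Note that moving the working-tree copy alone won't shrink the repository; the 400 MB blob stays in the git history until it is rewritten out (for example with git filter-branch).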

Another option is to run rhc snapshot save <app_name>. This will export the current state of your OpenShift application into an archive on your local system.
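If I remember the rhc behaviour correctly, the snapshot lands as <app_name>.tar.gz in the current directory (treat the exact filename as an assumption), so retrieving the video would look like:

rhc snapshot save <app_name>
# the snapshot is a gzipped tarball of the whole gear; extract what you need
tar -xvzf <app_name>.tar.gz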

Shekhar

Most Git repo hosting services report similar errors when cloning a large repo (see BitBucket for instance, or GitLab).

But the error can also come from the client.
Eclipse EGit needs special options to accommodate cloning large files.
The git client itself can raise the values of some of its config parameters (provided the machine on which the git clone is running has enough memory), as sketched below.
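
For instance, raising these settings is a common workaround for large clones (the values below are illustrative, not prescriptive):

# allow a bigger buffer for smart-HTTP transfers (~500 MB here)
git config --global http.postBuffer 524288000
# let git map and use more memory while handling large packs
git config --global core.packedGitLimit 512m
git config --global core.packedGitWindowSize 512m
git config --global pack.windowMemory 512m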

More generally, git isn't the right referential for large binary files (see "git with large files").
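
Going forward, an extension such as Git LFS can keep large binaries out of the regular object store, assuming both your client and the hosting side support it (which an OpenShift gear of that era may not):

git lfs install
# store *.mp4 files as lightweight pointers; the content goes to the LFS store
git lfs track "*.mp4"
git add .gitattributes
git commit -m "Track videos with Git LFS"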

VonC