
I have a very large file (500GB) to upload, and to make the task a bit easier I want to cut it into pieces.

Unfortunately I don't have 500GB of free space on my hard drive, so I would like a way to split the file into multiple parts without needing to free up 500GB first.

I thought about a script using dd to extract each chunk of data and export it, but I don't know of a way to delete the already-read data from the original file.

I'm okay with C code to compile, even if it takes 10 days to run; a rough sketch of what I have in mind is below.
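For illustration, here is a rough, untested sketch of the kind of program I'm imagining (the 1 GiB chunk size and the part-%04ld file naming are arbitrary choices): it copies the last chunk of the file into its own part file, truncates the original to drop that tail, and repeats, so the extra free space needed never exceeds roughly one chunk.

    /* In-place split sketch: peel chunks off the END of the file.
     * After each chunk is copied out, ftruncate() shrinks the original,
     * so only about one chunk of extra disk space is ever needed. */
    #define _FILE_OFFSET_BITS 64
    #include <stdio.h>
    #include <stdlib.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/stat.h>

    #define CHUNK (1024LL * 1024 * 1024)  /* 1 GiB per part (arbitrary) */
    #define BUF   (8 * 1024 * 1024)       /* 8 MiB copy buffer */

    int main(int argc, char **argv)
    {
        if (argc != 2) { fprintf(stderr, "usage: %s bigfile\n", argv[0]); return 1; }

        int in = open(argv[1], O_RDWR);
        if (in < 0) { perror("open"); return 1; }

        struct stat st;
        if (fstat(in, &st) < 0) { perror("fstat"); return 1; }
        off_t size = st.st_size;

        char *buf = malloc(BUF);
        if (!buf) { perror("malloc"); return 1; }

        while (size > 0) {
            off_t start = (size - 1) / CHUNK * CHUNK; /* offset of last chunk */
            char name[32];
            snprintf(name, sizeof name, "part-%04ld", (long)(start / CHUNK));

            int out = open(name, O_WRONLY | O_CREAT | O_TRUNC, 0644);
            if (out < 0) { perror("open part"); return 1; }

            /* copy bytes [start, size) into the part file */
            for (off_t pos = start; pos < size; ) {
                size_t want = (size_t)((size - pos > (off_t)BUF) ? BUF : size - pos);
                ssize_t n = pread(in, buf, want, pos);
                if (n <= 0) { perror("pread"); return 1; }
                if (write(out, buf, (size_t)n) != n) { perror("write"); return 1; }
                pos += n;
            }
            close(out);

            /* drop the copied tail from the original, freeing its space */
            if (ftruncate(in, start) < 0) { perror("ftruncate"); return 1; }
            size = start;
        }

        free(buf);
        close(in);
        return 0;
    }

The zero-padded names sort correctly, so on the receiving side the parts could be reassembled with cat part-* > bigfile. One nice property, if I'm not mistaken: since the original is only truncated after a part is fully written, rerunning the program after a crash would just redo the unfinished part.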

Thanks!

dvkch
  • see http://stackoverflow.com/questions/7341481/split-file-in-place-is-it-possible – Elazar May 27 '13 at 19:45
  • Nice, thanks! But I'm not very familiar with bash code; do you think it'll work with a 500GB file? – dvkch May 27 '13 at 19:50
  • Bash is just glue between Unix executables. If it really works in place, and it works on small files, it will work on any size (assuming there are no memory leaks). – Elazar May 27 '13 at 19:55
  • 2
    Why do you want to split it? Just use a tool that supports resuming uploads, e.g. like this: http://dimitar.me/how-to-resume-partial-file-transfers/ – thejh May 27 '13 at 20:00
  • Because the upload will be done from multiple computers: some parts from a friend's laptop, some from mine overseas. Unfortunately none of them is on the same network as the NAS, so I can't cut one part, copy it to a laptop, and so on. I'd prefer to cut the whole big file and deal with it. Plus: I originally wanted to store it on Bitcasa, and my transfer stopped with 140GB remaining... no way to resume. – dvkch May 27 '13 at 20:56

0 Answers