
I use a script to copy files to a shared drive. One of the files is usually around 3 GB. The copy is quite slow compared to a manual copy (all on Windows).

How can I change the buffer size in my code? Something similar to the answer in this question: Python copy larger file too slow

import os
import shutil

def copyAllFiles():
    # src and newPath are assumed to be defined elsewhere
    src_files = os.listdir(src)
    for file_name in src_files:
        full_file_name = os.path.join(src, file_name)
        if os.path.isfile(full_file_name):
            shutil.copy(full_file_name, newPath)
            print("copy done")

Or is there a recommendation to use another copy method?
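For context, the linked answer's trick boils down to copying with a much larger buffer than shutil's default. A minimal sketch of doing that directly with `shutil.copyfileobj`, assuming Python 3; the 16 MiB size and the `copy_with_buffer` name are illustrative, not taken from the linked answer:

import shutil

def copy_with_buffer(src_path, dst_path, length=16 * 1024 * 1024):
    # copyfileobj takes an explicit buffer size; 16 MiB is an assumption,
    # tune it for your share. Note this copies file contents only, not
    # permissions or metadata the way shutil.copy does.
    with open(src_path, 'rb') as fsrc, open(dst_path, 'wb') as fdst:
        shutil.copyfileobj(fsrc, fdst, length)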

  • Did applying the answer of your linked question solve it for you? If not, why not? You just copy & paste [this code](https://stackoverflow.com/a/28584857/7505395) in front of your code and that is it. – Patrick Artner May 23 '18 at 07:26
  • Sorry, no. I am a beginner. Is he overriding the method? I think he uses copyfileobj while I am using copy. Is that correct? – Mario S May 23 '18 at 07:30
  • 1
    He is _patching_ something - meaning he provides some other implementation for a function that `shutil` uses internally to copy stuff around. He then assigns his version to be used by shutil: `shutil.copyfileobj = _copyfileobj_patched` - after that every time shutil internally calls `copyfileobj` it will instead call his patched version `_copyfileobj` wich has a bigger default buffer - making things go smoother on windows. you can look into the sources but my guess would be that shutil.copy() internally uses the other one he patches. – Patrick Artner May 23 '18 at 07:57
  • 1
    see [monkey-patching-in-python-when-we-need-it](https://stackoverflow.com/questions/11977270/monkey-patching-in-python-when-we-need-it) – Patrick Artner May 23 '18 at 07:59
  • 1
    [(cpython) sources of shutil.py:](https://github.com/python/cpython/blob/master/Lib/shutil.py) – Patrick Artner May 23 '18 at 08:06
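Putting Patrick Artner's comments together, a minimal sketch of the patching approach from the linked answer, assuming Python 3 (the 16 MiB buffer size is an assumption; adjust for your share):

import shutil

def _copyfileobj_patched(fsrc, fdst, length=16 * 1024 * 1024):
    # Same read/write loop shutil uses internally, with a larger default buffer.
    while True:
        buf = fsrc.read(length)
        if not buf:
            break
        fdst.write(buf)

# Replace shutil's internal helper; shutil.copy() then uses the patched version.
shutil.copyfileobj = _copyfileobj_patched

If this assignment runs before the copies start, the existing copyAllFiles() picks up the larger buffer without any further changes.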
