
I need to create a "turntable" platform. My server must be able to take a file from FTP A and send it to FTP B. I've built a lot of file transfer systems, so I have no problem with ftplib, Aspera, S3 and other transfer protocols.

The thing is that I have big files (150 GB) on FTP A, and many transfers will occur at the same time, to and from many FTP servers and other endpoints.

I don't want my platform to actually store these files in order to send them to another location, and I don't want to load everything into memory either. I need to "stream" binary data from A to B, with minimal load on my transfer platform.

I am looking at https://docs.python.org/2/library/io.html with BufferedReader and BufferedWriter, but I can't find examples and the documentation is sort of cryptic to me...

Does anyone have a starting point? Here is what I have tried so far:

import io
import ftplib

buff = io.open('/var/tmp/test', 'wb')

def loadbuff(data):
    buff.write(data)

# self.ftp is an already-connected ftplib.FTP instance
self.ftp.retrbinary('RETR ' + name, loadbuff, blocksize=8192)

So my data is coming in buff, which is a <_io.BufferedWriter name='/var/tmp/test'> object, but how can I start reading from it while ftplib keeps downloading?
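To make it concrete, here is a minimal sketch of the kind of plumbing I have in mind: the download runs in a thread and feeds the write end of an os.pipe, while the read end is a plain file object that could be handed to storbinary on the other connection. The name stream_copy is mine, and the "download" below is a fake producer standing in for retrbinary; I haven't tested this against real FTP servers.

```python
import os
import threading

def stream_copy(producer, chunk_size=8192):
    """Run producer(callback) in a thread; return a readable file object
    that sees every chunk passed to the callback, plus the thread handle.

    With ftplib, producer would be something like
        lambda cb: ftp_a.retrbinary('RETR big.bin', cb, blocksize=chunk_size)
    and the returned file object would go straight into
        ftp_b.storbinary('STOR big.bin', rfile, blocksize=chunk_size)
    so only a chunk at a time sits in memory.
    """
    rfd, wfd = os.pipe()
    wfile = os.fdopen(wfd, 'wb')
    rfile = os.fdopen(rfd, 'rb')

    def run():
        try:
            producer(wfile.write)
        finally:
            wfile.close()  # EOF for the reader once the download ends

    t = threading.Thread(target=run)
    t.start()
    return rfile, t

# Fake "download" standing in for retrbinary, just to show the flow:
rfile, t = stream_copy(lambda cb: [cb(b'x' * 1000) for _ in range(5)])
data = rfile.read()   # this is what storbinary would consume
t.join()
print(len(data))
```

The pipe blocks the writer when it fills up, so the download naturally slows to the upload's pace instead of buffering the whole 150 GB anywhere.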

Hope I'm clear enough; any idea is welcome.

Thanks

LeSuspect
