
I am trying to find the total file size of an FTP server with the following script:

import os
import ftplib
from os.path import join, getsize

serveradd = ("an.ftpserver.co.uk")          # Define the ftp server

print("Logging in to "+serveradd+"\n")

top = os.getcwd()                                       # Define the current directory

filelist = []

for (root, dirs, files) in os.walk(top):   # Get list of files in current directory
    filelist.extend(files)
    break

dirname = os.path.relpath(".","..")                     # Get the current directory name

ftp = ftplib.FTP(serveradd)

try:                                                    # Log in to FTP server
    ftp.login("username", "password")
except ftplib.error_perm:
    print("Failed to log in...")

ftp.cwd('data/TVT/')                                    # CD into the TVT folder

print("Contents of "+serveradd+"/"+ftp.pwd()+":")

ftp.retrlines("LIST")                                   # List directories on FTP server

print("\n")

print(ftp.pwd())

print(serveradd+ftp.pwd())

size = 0

for (root, dirs, files) in os.walk(ftp.pwd()):
    for x in dirs:
        try:
            ftp.cwd(x)
            size += ftp.size(".")
            ftp.cwd("..")
        except ftplib.error_perm:
            pass
print(size)

Everything is working up until I try to use os.walk to find the list of directories on the FTP server, use ftp.cwd to go into each directory and add the total size to the variable "size".

When I call print(size) the result is 0, when it should be a positive integer.

Am I missing something with the combination of os.walk and ftp.pwd?
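For reference, os.walk only traverses the local filesystem, so os.walk(ftp.pwd()) iterates over a remote path that (normally) does not exist locally and therefore yields nothing. The same loop does work against a local tree; a minimal sketch using a throwaway temp directory (file names and sizes made up for illustration):

```python
import os
import tempfile
from os.path import join, getsize

top = tempfile.mkdtemp()                 # throwaway local tree
os.makedirs(join(top, "sub"))
with open(join(top, "a.bin"), "wb") as f:
    f.write(b"x" * 100)
with open(join(top, "sub", "b.bin"), "wb") as f:
    f.write(b"x" * 50)

size = 0
for root, dirs, files in os.walk(top):   # walks the *local* tree only
    size += sum(getsize(join(root, name)) for name in files)
print(size)                              # 150 here; walking a missing path gives 0
```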

JRG

1 Answer


Use this function to fetch the size of a directory with the FTP client:

def get_size_of_directory(ftp, directory):
    size = 0
    for name in ftp.nlst(directory):
        if name.rsplit('/', 1)[-1] in ('.', '..'):
            continue                       # skip '.' and '..' to avoid looping forever
        here = ftp.pwd()
        try:
            ftp.cwd(name)                  # only succeeds if name is a directory
            ftp.cwd(here)                  # go back before recursing
            size += get_size_of_directory(ftp, name)
        except ftplib.error_perm:          # not a directory, so treat it as a file
            ftp.voidcmd('TYPE I')          # SIZE requires binary (image) mode
            size += ftp.size(name)
    return size

You can call get_size_of_directory recursively for each directory you find. Hope this helps!
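To see the traversal work without a live server, here is a sketch using a tiny in-memory stand-in for ftplib.FTP (FakeFTP, its directory tree, and the file sizes are all made up for illustration, and it assumes NLST returns full paths; the size function is repeated so the snippet runs standalone):

```python
import ftplib

def get_size_of_directory(ftp, directory):
    size = 0
    for name in ftp.nlst(directory):
        if name.rsplit('/', 1)[-1] in ('.', '..'):
            continue                       # skip '.' and '..'
        here = ftp.pwd()
        try:
            ftp.cwd(name)                  # only succeeds for a directory
            ftp.cwd(here)                  # go back before recursing
            size += get_size_of_directory(ftp, name)
        except ftplib.error_perm:          # a file: ask the server for its size
            ftp.voidcmd('TYPE I')
            size += ftp.size(name)
    return size

class FakeFTP:
    """In-memory stand-in for ftplib.FTP (illustration only)."""
    def __init__(self, dirs, files):
        self.dirs, self.files, self.path = dirs, files, ''
    def voidcmd(self, cmd):
        pass
    def pwd(self):
        return self.path
    def cwd(self, path):
        if path not in self.dirs:          # mimic a 550 reply for non-directories
            raise ftplib.error_perm('550 not a directory')
        self.path = path
    def nlst(self, directory):
        d = self.path if directory == '.' else directory
        return [d + '/' + n if d else n for n in self.dirs[d]]
    def size(self, path):
        return self.files[path]

ftp = FakeFTP(
    dirs={'': ['data'], 'data': ['TVT'],
          'data/TVT': ['a.bin', 'sub'], 'data/TVT/sub': ['b.bin']},
    files={'data/TVT/a.bin': 100, 'data/TVT/sub/b.bin': 50},
)
print(get_size_of_directory(ftp, 'data/TVT'))  # 150
```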

  • Hi Afroze, does this work with subdirectories in the root? – Matt Houchin Jul 30 '17 at 16:33
  • You can recursively call the get_size_of_directory for each directory you find in the directory. – Afroze Khan Jul 30 '17 at 16:34
  • I updated the function for the directories as well. – Afroze Khan Jul 30 '17 at 16:41
  • I read recently that recursion like this can cause a maximum recursion depth error in Python. I would think it would be a lot more likely if you're walking through an unknown amount of directories. – Cory Madden Jul 30 '17 at 16:44
  • In that case just avoid recursions on `.` and `..` directories. Another way is don't traverse already traversed directories and stay in the root path without changing the root path. We could do anything we want and solve a problem if we have one by changing the code and making it perfect. – Afroze Khan Jul 30 '17 at 16:51
  • Of course, I'm not the person wanting the code. But from what I understand is that calling the same function over and over again will trigger the maximum recursion depth error anyway, even if you're not repeatedly going over the same directories. A better solution, from what I understand, would be to use `while True` and `break`. Here, I found the post I read that talked about it: https://stackoverflow.com/a/23294659/4832296 – Cory Madden Jul 30 '17 at 16:54
  • 1
    You have copied code from https://stackoverflow.com/a/22093848/850848 without giving any credit to the source. - Moreover, you should not have answered a duplicate question anyway. You should have voted it to be closed as duplicate. – Martin Prikryl Jun 07 '18 at 07:04
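As the comments note, a deep enough tree could in principle hit Python's recursion limit. A sketch of the iterative alternative, using an explicit stack instead of recursion (the list_dir/is_dir/get_size callables and the in-memory tree are made up for illustration; against a real server they would wrap ftp.nlst, ftp.cwd, and ftp.size):

```python
def total_size(list_dir, is_dir, get_size, root):
    """Depth-first walk with an explicit stack: no recursion-depth limit."""
    size = 0
    stack = [root]
    while stack:
        path = stack.pop()
        for name in list_dir(path):
            full = path.rstrip('/') + '/' + name
            if is_dir(full):
                stack.append(full)       # visit this subdirectory later
            else:
                size += get_size(full)
    return size

# made-up in-memory tree standing in for the remote listing
tree = {'/': ['a.txt', 'sub'], '/sub': ['b.txt']}
sizes = {'/a.txt': 10, '/sub/b.txt': 5}
print(total_size(tree.__getitem__, tree.__contains__,
                 sizes.__getitem__, '/'))  # 15
```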