
I am trying to write a loop to iterate over a very large file. I am writing this script in a Linux VM with 4 GB of RAM, so I can't load the whole file at once; I need to read it in chunks of 1024 bytes (correct me if I'm wrong, please).

I can think of two ways to do this, although the second one gets killed by zsh because the system simply runs out of RAM.

The first one keeps returning the same list of passwords:

wordlist_handler = open('passwords.txt')

def read_wordlist(wordlist_handler):
    # read(1024) returns a string of up to 1024 characters, not a list of lines
    return wordlist_handler.read(1024)


def read_wordlist2(file='passwords.txt'):
    # this builds a list of every line in the file, all in memory at once
    with open(file, 'r') as f:
        return list(f)

My goal is to iterate over the file in a way similar to the first method:

if __name__ == "__main__":
    # this should give a list of n passwords per pass

    while True:
        lst = read_wordlist(wordlist_handler)
        # note: no exit condition yet - read_wordlist() returns '' at EOF

        # use lst in concurrent.futures
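
To make the goal concrete, here is a rough sketch of the batching I have in mind. The worker function check_password and the batch size are placeholders I made up for illustration; they are not part of my real script.

import concurrent.futures
from itertools import islice

def check_password(password):
    # placeholder worker - the real one would actually try the password
    return password

def batches(path, batch_size=1000):
    # yield lists of up to batch_size passwords without loading the whole file
    with open(path, 'r', errors='ignore') as f:
        while True:
            batch = list(islice(f, batch_size))
            if not batch:
                break
            yield [line.rstrip('\n') for line in batch]

with concurrent.futures.ThreadPoolExecutor(max_workers=8) as pool:
    for lst in batches('passwords.txt'):
        # each lst is one batch of passwords to hand to the pool
        results = list(pool.map(check_password, lst))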
   


  • It is a clear text file - read it line by line; that way you only have one line in memory at any time: `with open("passwords.txt") as f: for line in f: ...` and do something with that line. Do NOT return it - use it, and if it does not work you'll get the next line... – Patrick Artner Jul 17 '21 at 16:33
  • @PatrickArtner the goal is to use `concurrent.futures`; single lines are useless here, I think – katysha Jul 17 '21 at 16:44

1 Answer


I have this, but some of the passwords come out only 2 digits long:

while True:
    x = load_chunks()
    print(len(x), x)
    if len(x) <= 2:
        break
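
A rough sketch of what load_chunks could look like is below; my real load_chunks isn't shown above, so treat this as an assumed implementation. Reading about 1024 bytes and then finishing the current line keeps a password from being cut in half at a chunk boundary, which is probably where the 2-character fragments come from.

wordlist_handler = open('passwords.txt', 'r', errors='ignore')

def load_chunks(handle=wordlist_handler, size=1024):
    # assumed helper, not the one used in the answer above
    chunk = handle.read(size)
    if not chunk:
        return []                  # EOF: empty list lets the loop's break fire
    chunk += handle.readline()     # finish the partially read last line
    return chunk.splitlines()      # one password per list element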
  • 1st, not an answer; 2nd, missing `load_chunks()`; 3rd, why ask if you got a solution? If you want to add more to your question - [edit] it instead. – Patrick Artner Jul 17 '21 at 16:35