I am trying to write a loop to iterate over a very large file. I'm running this script in a Linux VM with 4 GB of RAM, so I can't load the whole file at once; I need to read it in chunks of, say, 1024 bytes (please correct me if that's wrong).
I can think of two ways to do this. The second one gets killed by zsh because the system simply runs out of RAM, and the first one keeps returning the same list of passwords.
wordlist_handler = open('passwords.txt')

def read_wordlist(wordlist_handler):
    return wordlist_handler.read(1024)
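One caveat with raw read(1024): a chunk boundary can land in the middle of a password, splitting it across two chunks. A sketch of one way around that (the generator name, chunk size, and the newline-delimited format are my assumptions, not from your code) is to carry the trailing partial line over into the next chunk:

```python
# Sketch, assuming passwords.txt is newline-delimited.
# iter_passwords is a hypothetical helper name; 1024 is an arbitrary chunk size.
def iter_passwords(path, chunk_size=1024):
    remainder = ''
    with open(path, 'r') as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:          # read() returns '' at EOF
                if remainder:      # emit any final line without a trailing newline
                    yield remainder
                return
            chunk = remainder + chunk
            lines = chunk.split('\n')
            remainder = lines.pop()  # last element may be a partial line
            yield from lines
```

This keeps memory bounded to roughly one chunk plus one line, no matter how large the file is.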
def read_wordlist2(file='passwords.txt'):
    with open(file, 'r') as f:
        return list(f)
My goal is to iterate over the file in chunks, similar to the first method:
if __name__ == "__main__":
    while True:
        # this should return the next batch of passwords
        lst = read_wordlist(wordlist_handler)
        if not lst:  # read() returns '' at EOF; without this the loop never ends
            break
        # use lst in concurrent.futures
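Since the batches are handed to concurrent.futures, it may be more convenient to batch by number of lines rather than bytes. A sketch of that idea with itertools.islice (the function name and batch size of 100 are my choices, not from your code):

```python
import itertools

# Sketch: yield lists of up to batch_size passwords, reading lazily
# so only one batch is ever held in memory at a time.
def batched_lines(path, batch_size=100):
    with open(path, 'r') as f:
        stripped = (line.rstrip('\n') for line in f)
        while True:
            batch = list(itertools.islice(stripped, batch_size))
            if not batch:  # generator exhausted, file fully read
                break
            yield batch
```

Each yielded list could then be submitted to an executor, e.g. `executor.map(check, batch)`, so the workers receive whole passwords and the file is never loaded in full.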