Imagine we have a piece of code that cuts a large data file into smaller chunks and does some processing on each chunk.
def node_cut(input_file):
    NODE_LENGTH = 500
    count_output = 0
    node_list = []
    for line in input_file.readlines():
        if len(node_list) >= NODE_LENGTH:
            count_output += 1
            return node_list, count_output
            node_list = []  # never reached: the return above exits the function
        node, t = line.split(',')
        node_list.append(node)

if __name__ == '__main__':
    input_data = open('all_nodes.txt', 'r')
    node_list, count_output = node_cut(input_data)
    some_process(node_list)
When node_cut returns the first data list, the function exits, so the for loop never processes the rest of the large file. How can I make it hand back each chunk but still keep the loop going?
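Would something like the following generator sketch be the right direction? This is only a guess on my part (using yield instead of return); some_process and all_nodes.txt are the same placeholders as in my code above:

    # Guess: yield each chunk instead of returning, so iterating over
    # node_cut() keeps reading the file until it is exhausted.
    def node_cut(input_file, node_length=500):
        node_list = []
        for line in input_file:
            node, t = line.split(',')
            node_list.append(node)
            if len(node_list) >= node_length:
                yield node_list   # hand back this chunk...
                node_list = []    # ...then continue with a fresh list
        if node_list:             # leftover lines that didn't fill a full chunk
            yield node_list

    if __name__ == '__main__':
        with open('all_nodes.txt', 'r') as input_data:
            for count_output, node_list in enumerate(node_cut(input_data), start=1):
                some_process(node_list)   # placeholder from my original code

Is that the idiomatic way to do this, or is there a better approach?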