I have a program that parses a 100 MB file and then applies some functions to the data. To isolate the bottleneck, I haven't implemented those functions yet: I commented out my implementation and replaced each function body with pass.
Why is Python using so much memory?
Parsing the file takes 15 minutes, and I can see that Python is using 3 GB of memory; CPU usage is at 15% and overall memory usage is at 70%.
Does this imply the program is I/O bound?
How can I speed up the parsing, or is there nothing to be done about it?
File sample: Age and Salary
50 1000
40 123
1233 123213
CODE:
def parse(pathToFile):
    myList = []
    with open(pathToFile) as f:
        for line in f:
            s = line.split()
            age, salary = [int(v) for v in s]
            Jemand = Mensch(age, salary)  # Mensch is a class defined elsewhere in my code
            myList.append(Jemand)
    return myList
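For reference, here is a memory-leaner variant of the same loop I have been considering (an untested sketch, not my actual code; Person below stands in for my Mensch class). The idea is that __slots__ drops the per-instance __dict__, which is usually the biggest cost when millions of small objects are created, and yielding one record at a time avoids holding the whole list in memory at once:

```python
class Person:
    # __slots__ removes the per-instance __dict__, shrinking each object
    __slots__ = ("age", "salary")

    def __init__(self, age, salary):
        self.age = age
        self.salary = salary


def parse(path):
    # Generator: yields one Person at a time instead of building the
    # whole list up front; a caller that really needs a list can still
    # do list(parse(path)).
    with open(path) as f:
        for line in f:
            age, salary = (int(v) for v in line.split())
            yield Person(age, salary)
```

Whether this actually helps depends on whether the later processing functions can consume the records one at a time rather than needing the full list.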