I want to read a CSV file with 5 million rows and 13 columns. The file is 25 GB, and the server has 24 GB of RAM.
import pandas as pd

df_list = []
chunksize = 100000
for chunk in pd.read_csv(path, chunksize=chunksize):
    df_list.append(chunk)
X = pd.concat(df_list)
After running for a moment, it stops with an error. I want to stop reading (or do something else) once memory usage reaches 20 GB.
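One way to sketch that cutoff (not a fix for the underlying problem, since concatenating every chunk still needs the whole file in memory) is to check the process's memory after each chunk with the standard-library `resource` module, which is Unix-only; the 20 GB limit and the `chunksize` come from the question, while `rss_bytes` and `read_capped` are hypothetical names for illustration:

```python
import resource
import pandas as pd

MEMORY_LIMIT_BYTES = 20 * 1024**3  # stop once the process nears 20 GB

def rss_bytes():
    # ru_maxrss is reported in kilobytes on Linux (bytes on macOS)
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss * 1024

def read_capped(path, chunksize=100_000):
    chunks = []
    for chunk in pd.read_csv(path, chunksize=chunksize):
        chunks.append(chunk)
        if rss_bytes() > MEMORY_LIMIT_BYTES:
            # stop early instead of letting the OS kill the process
            break
    return pd.concat(chunks, ignore_index=True)
```

Note that breaking out of the loop only gives you a partial DataFrame; to actually stay under the limit you would process or aggregate each chunk inside the loop instead of appending it.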